January 25, 2007
Medical Diagnosis and Software Diagnosis

Interesting article in The New Yorker about how doctors think when they diagnose illness, or to quote the article, "the process by which doctors interpret their patients' symptoms and weigh test results in order to arrive at a diagnosis and a plan of treatment". The author, Jerome Groopman, makes the point that medical school students spend a lot of time memorizing facts, and a lot of time learning applications of those facts, but not a lot of time thinking about how to ensure they make correct diagnoses.
He describes three types of errors. The first is "representativeness", being overly influenced by what is typically true: "[Doctors] fail to consider possibilities that contradict their mental templates of a disease, and thus attribute symptoms to the wrong cause." The example given was a very fit man who came in complaining of chest pain, but not the pain normally associated with coronary-artery disease; as a result the doctor assumed that everything was OK, and was surprised when the patient had a heart attack the next day (like every patient discussed in the article, he survived).
The second type is "availability error", "the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind." In the example, a doctor who had recently seen a lot of patients with pneumonia was quick to diagnose a new patient with it, even though she actually had aspirin toxicity. He over-emphasized the symptoms which were associated with pneumonia, and ignored the ones that were not, because pneumonia was the diagnosis that came to mind (the article also mentions that psychologists call this "confirmation bias", "confirming what you expect to find by selectively accepting or ignoring information"; I'll ignore the obvious political comment that could be made here).
The third type is "affective error", making decisions based on what we wish were true. In the example cited, a doctor had failed to perform a particularly embarrassing examination on a patient because he liked him personally, and was hoping that he did not have an infection.
There's an interesting quote about how doctors begin diagnosing patients as soon as they meet them: "Even before they conduct an examination, they are interpreting a patient's appearance...Doctors' theories about what is wrong continue to evolve as they listen to the patient's heart, or press on his liver. But research shows that most physicians already have in mind two or three possible diagnoses within minutes of meeting a patient, and that they tend to develop their hunches from very incomplete information."
What this made me think of is how developers approach debugging software (it also made me think, to a lesser extent, about how we approach interviewing). Although debugging is not the life-or-death situation that medical diagnosis can be, and there are ways in which debugging software can be both harder (computers can't tell you how they feel) and easier (we can look as deeply as we want into a computer), there are some aspects that are remarkably similar (such as "it doesn't happen in my office"). Working in an emergency room with a patient who is in critical condition has some (note that I only said some) of the pressure of working in a build lab trying to figure out why the new build of Windows is crashing in your driver. Programmers do start out with a hypothesis based on the early symptoms reported, and then set about proving or disproving that hypothesis. The mistakes of representativeness, availability errors, and affective errors remind me of mistakes I have made while debugging, for similar reasons.
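Confirmation bias in particular has a concrete shape in debugging: we tend to write checks that confirm our guess about a bug rather than checks that could refute it. A minimal sketch of the difference (the dedupe helper and its bug are invented for this illustration, not taken from any real project):

```python
# Everything here is hypothetical; "dedupe" is a deliberately buggy helper.

def dedupe(items):
    # Buggy on purpose: set() removes duplicates but destroys the input order.
    return list(set(items))

def dedupe_keep_order(items):
    # Reference behavior: remove duplicates while preserving first-seen order.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# Confirmation-bias style check: an input chosen (perhaps unconsciously)
# so that the hypothesis "dedupe just removes duplicates" looks true.
assert sorted(dedupe([1, 1, 2])) == [1, 2]  # passes, but proves very little

# Falsification-style check: actively hunt for an input that could
# disprove the comfortable hypothesis that ordering is preserved.
def order_preserved(items):
    return dedupe(items) == dedupe_keep_order(items)

# Try an input where set() is unlikely to happen to preserve order.
print(order_preserved([3, 1, 2, 1]))
```

The confirming assertion passes no matter what, because it only looks at the symptoms the hypothesis predicts; the falsifying check probes the behavior the hypothesis takes for granted, which is where the bug actually lives.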
And I think that the cognitive dimension of debugging has been just as ignored as the cognitive dimension of medical evaluation. The medical school tenet of "see one, do one, teach one" accurately describes the way in which debugging knowledge is learned and passed on by developers. Doctors are starting to look into how doctors think (search for the name "Pat Croskerry" to find some of this); I wonder if someone has, or will, look into how programmers think.
Posted by AdamBa at January 25, 2007 09:57 PM
A bit over a decade ago, I went to an Ear/Nose/Throat specialist with a ruptured ear drum. Twice he made an initial diagnosis that turned out to be wildly wrong. Fortunately he did tests to make sure, and caught his errors. It turned out that I had a life-threatening condition, and needed surgery. And so, I'm still around a decade later.
Posted by: Mike Swaim at January 26, 2007 05:37 AM
A minor comment is that the first guy was only 40; had he been, say, 70, the doctor would likely have looked for coronary disease. But the main comment I would make is how this contradicts Malcolm Gladwell's Blink. While doctors are always making "blink type" initial diagnoses, they are too often wrong. Come to think of it, Gladwell hasn't been doing too well recently. His attempt to exculpate the Enron executive suite was shot down all around.
Posted by: Marble Chair at January 26, 2007 08:38 AM
You might find this Scientific American article called "The Expert Mind" interesting: http://www.sciam.com/article.cfm?chanID=sa006&colID=1&articleID=00010347-101C-14C1-8F9E83414B7F4945.
It describes research into how people get good at something, which is often through effortful study rather than from an innate talent.
Effortful study means that you consciously work to get better at something by attempting tasks that are just out of your reach. For instance, rather than hitting a bucket of golf balls randomly, try to hit 100 balls at the same target. Or rather than implementing another linked list algorithm in C, implement it in assembly code, or implement a Red-Black tree, or help out debugging an unfamiliar area of your product.
Posted by: Andrew at January 26, 2007 04:55 PM
Many people suffer because of an incorrect initial diagnosis, and physicians are often slow to admit their mistakes. Have there been successful lawsuits won by patients over this? Where can one read about it? WBR LeoP
Posted by: World Health at February 3, 2007 09:29 AM