
January 19, 2010

Atul Gawande Visits Microsoft

Atul Gawande, whom I have blogged about in the past, came to Microsoft last Monday on his book tour for The Checklist Manifesto.

He spoke for a little bit and then took questions. If you've read his book, there wasn't anything particularly new in what he said, although it was interesting to hear some of the stories directly from him. He has said that about 20% of doctors resist the idea of checklists, so during Q&A I asked him about the adoption of checklists (and similar moves away from "fighter pilot" mode) among resisters in industries like aviation and construction. He said that part of it was the older generation retiring, but there also tended to be a point where the government stepped in and imposed rules, which then led to everybody needing to adopt a checklist.

I realized, however, that we have been looking at checklists for software in the wrong way. A typical checklist we have from EE might be a code review checklist with items like:

  • Do all variables conform to the team's coding conventions?
  • Does every method have an XML comment describing its return value?
  • Does the code avoid using reinterpret_cast?

The problem with these items is that each of them is something you have to check for on almost every line, which means you wind up looking at a little bit of code, then the checklist, then the code, etc. Or you wind up memorizing the checklist, but if you can hold all that stuff in your mind, you might forget something occasionally, which is what a checklist is designed to avoid. As Gawande explains, checklists are not meant for "heat of battle" kind of checks--that is where your existing expertise comes in.
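In fact, per-line items like "avoid reinterpret_cast" are really lint rules, not checklist items--a tool can scan every line so a human doesn't have to. Here is a minimal sketch of that idea (the function name and the C++ sample are made up for illustration):

```python
# Sketch: automate a per-line "checklist" item as a simple lint check,
# so the reviewer's checklist can stick to pause-point items instead.

def find_forbidden_casts(source: str, token: str = "reinterpret_cast") -> list[tuple[int, str]]:
    """Return (line_number, line) pairs where the forbidden token appears."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if token in line:
            hits.append((lineno, line.strip()))
    return hits

# Invented C++ snippet to scan:
sample = """int main() {
    long p = reinterpret_cast<long>(&main);
    return 0;
}"""

for lineno, line in find_forbidden_casts(sample):
    print(f"line {lineno}: {line}")
```

A real team would use an actual static-analysis tool rather than string matching, but the point stands: anything you would have to re-check on every line belongs in tooling, not on the checklist.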

Instead, checklists are for use during "pause points"--natural points where people have a moment to consider a checklist, such as the beginning and end of activities--and are for checking the very simple things that "everybody knows" but that people sometimes forget because they have so much to keep in their heads. The proper way to construct a code review checklist would be to conduct a root-cause analysis of issues that slipped through earlier code reviews, but, following Gawande's advice, it would include items like:

  • Has the reviewer rebuilt the code from the shelved files to ensure that every change is included?
  • Did the author email his unit test results to the reviewer?
  • After the review, did the reviewer email his results to the coderev alias?

I'm making that up, but hopefully you get the idea. These are really simple things that people mostly do right, but sometimes mess up, and there is a natural time to do them that is NOT when your head is buried in the code.
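To make the pause-point structure concrete, here is a sketch of such a checklist as data: items grouped by the pause at which they are read. The pause names and item wording are invented, echoing the made-up items above:

```python
# Sketch: a pause-point checklist in Gawande's style. Each pause point
# (e.g. before and after the review) has a short list of simple items
# that people mostly do right but sometimes forget.

PAUSE_POINTS = {
    "before review": [
        "Reviewer rebuilt the code from the shelved files",
        "Author emailed unit test results to the reviewer",
    ],
    "after review": [
        "Reviewer emailed review results to the coderev alias",
    ],
}

def run_checklist(pause: str) -> list[str]:
    """Return the items to confirm aloud at the given pause point."""
    return PAUSE_POINTS.get(pause, [])

# At the natural pause before the review starts:
for item in run_checklist("before review"):
    print(f"[ ] {item}")
```

The design choice mirrors the argument in the post: the checklist is consulted only at a few well-defined moments, and each item is a quick yes/no, not something that competes with the reviewer's attention while reading code.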

Posted by AdamBa at January 19, 2010 10:30 PM