Some of the comments in the debate over Los Alamos National Laboratory and the perceived safety problems there (March 2005, pages 10 and 26) have intrigued me. As the radiation safety officer at a community hospital, I deal daily with regulatory bodies and with the business culture of “continuous improvement” (CI). The basic idea of CI is noble and desirable. When there is an incident, even a minor one, only a fool would not want to know why and how it occurred, and whether similar incidents could be prevented in the future. That is the “improvement” part of the program.

Unfortunately, administrators, regulators, and often staff focus far too much on the “continuous” part, and that leads to interpretations that simply are not consistent with reality. A simple example will suffice. Suppose you have 10 incidents in a year. Through diligence and rethinking policies, the next year there are 8, then 6, and on down. What happens if you are fortunate enough to get to 0? How do you improve from there? When the inevitable next incident occurs, you have now “trended negatively,” and someone will want to know why. And what if you never get to 0, or have a series of years with 1 and 0 incidents? In the world of Los Alamos’s director G. Peter Nanos, you have stagnated, and that is a problem.

In my experience, many, perhaps most, incidents are not caused by gross negligence, but by simple human error. Having well-considered and realistic policies and procedures is vital, but no set of them will ever eliminate human error. If you are lucky enough to get to 0 incidents, don’t expect to stay there.

As for David Herbert’s “trap of expertise,” in my opinion that is a form of human error. It has always happened and will continue to happen for as long as human beings are involved.

So, do we just give in to fate? Not exactly. Goals are important, but they must be realistic. I think Brad Lee Holian understands that point. After all, the standard for radiation safety is the ALARA principle—as low as reasonably achievable. Note the fourth word.

And then there is education, which in my experience does more than anything else to improve a substandard situation. In my facility, I require any staff members who work in an area where radioactive materials or radiation-producing devices are used to attend an annual education class; in some critical areas, the classes are held more frequently than once a year. (I admit I often bribe employees with food, but you would be amazed at how free pizza can increase someone’s attention span!) At a minimum, we cover the basics of safety and the specifics of their areas. This annual renewal of information is the one thing I have found that can reduce the trap of expertise. But it can never eliminate it.