
Nursing Error

Marc Green


Note: This is a slightly expanded version of the article, "Nursing Error and Human Nature," Journal of Nursing Law, Vol. 9, pp 37-44, 2004.

A highly experienced hospital nurse selected a vial of Lasix from a stock drawer containing medications and placed it in a cart. She checked the vial's label on three separate occasions before eventually administering the drug. The vial actually contained KCL, potassium chloride, and the drug killed the patient. The vial was correctly labeled, and the nurse could not explain the lapse.

Such medication errors are astonishingly common. Following each incident, authorities establish an inquiry to explain the accident's cause. The finger of blame often points at nurses, increasingly to the extent of criminal prosecution. The nurse who administered the fatal KCL, for example, was charged with negligent homicide.

Medical error inquiries are often misguided because they fail to consider some important facts. First, an inquiry that is judgmental and narrowly focused on the nurse is likely to overlook system and procedure faults. Investigators should be attempting to explain rather than to blame. Second, many errors originate in the same cognitive processes and behavioral adaptations that produce efficient and skilled behavior. You cannot have one without the other, so "errors" will always occur. Accidents generally occur because of normal rather than aberrant behavior. Third, human behavior is the least malleable aspect of any system, such as a hospital, where people interact with a complex environment. It is futile to attempt error reduction by changing people's cognition and behavior, most of which occurs outside of conscious and voluntary control. The only reliable method for reducing errors is to anticipate the circumstances that will confound normal behavior and then to design safeguards into the situation.

Blaming Vs. Explaining

Assigning blame to nurses is popular because it minimizes the error's impact. People generally prefer simple and obvious explanations. If human error caused the accident, then it can be treated as an isolated incident. There is no need for a system review that might highlight the need for expensive changes, such as increased staffing or new equipment. Moreover, the people who created the system and procedures hold no responsibility.

Further, investigations are colored by the "fundamental surprise" (Lanir, 1986) of an accident. People believe that their system is safe and are genuinely surprised when an accident refutes this assumption. The search for cause slants toward explanations that are judgmental and "proximal," meaning that they narrowly focus on the people closest in space and time to the accident and ignore the system as a whole. Nurses are at the 'sharp' end of the system, the ones in closest proximity to the actual event. They perform the action that immediately precedes the consequences, such as the drug's effect, and it is the accuracy of that action that comes under scrutiny.

A case of medical error in Colorado demonstrates the point. Police charged three nurses with negligent homicide following an infant's death from a fatal overdose. A subsequent analysis (Smetzer, 1998), however, uncovered a chain of numerous errors from the time of prescription to the time of injection. Police did not charge the physician who wrote a cryptic prescription or the pharmacist who misread the dosage.

Accident investigators have a fundamental choice: is the goal to blame or to explain? Assigning blame to individuals accomplishes very little. It fails to explain why such medical accidents are so common and so difficult to prevent and, more importantly, does not reduce the likelihood of future accidents. Even thorough inquiries often do no more than pinpoint places in the system where the error occurred. They still do not explain why so many people make so many errors. For that, authorities should heed the advice of Kay (1971), who said, "We shall understand accidents when we understand human nature."

One prominent authority on medical error (Leape, 1994) agrees that error reduction requires an understanding of human cognition. However, he also said, "Most errors result from aberrations in mental functioning" (p 1853). His use of the term "aberration" only further encourages the assignment of blame. In fact, my main point in this article is that Leape has it exactly backwards. Errors commonly result from normal, adaptive and intelligent behavior. Skill and error are often differentiated by outcome, not by the quality of the behavior producing the outcome.

In understanding the cause of medication accidents, the starting point is a simple observation of human nature: almost no one sets out to make an error. The nurse performed the action because, given his or her mindset, it seemed reasonable at the time. After all, the nurse did not know or expect that the action would cause an accident and injury. The ultimate explanation for a medical error then lies in the factors that produced the mindset and made the action appear reasonable.

These causal factors are the limitations of human mental capacity, the effects of learning and experience, and the physical design of the environment. Humans have finite mental capacity, so evolution has equipped us with strategies to help us act more efficiently. These strategies are learned through adaptation and experience interacting with a particular situation. Unfortunately, people inevitably meet circumstances where the same strategies that enable skilled and expert behavior backfire and cause error. One irony of medical error is that the most experienced and able people are likely to make the most egregious and unfathomable errors. They have the most experience, the greatest skill and the strongest expectations.

One very common type of accident is caused by the "look but fail to see" error, which grows out of normal and usually beneficial aspects of human nature and mental functioning. The accident involving the nurse, for example, is typical: someone looks directly at a label, warning or sign and fails to see what, in retrospect, should have been plainly visible. Such errors are common in all walks of life. There are many accidents where a driver failed to "see" a clearly visible pedestrian, vehicle or bicycle, a train crewman failed to see a stop signal, a worker failed to see the blade of a saw, etc. I recently reviewed a case where a worker set off an explosion because his torch melted a plastic valve holding compressed gas even though the warning to avoid heating the valve was plainly visible.

These errors are due to "inattentional blindness," (Green, 2002) a phenomenon so pervasive that it occurs more than half of the time under some circumstances. Inattentional blindness is not caused by carelessness or stupidity. Rather, it is the natural result of two fundamental aspects of the human condition: limited mental attention and high adaptability.

Errors Are Sometimes Caused By "Normal," Not Abnormal Behavior

Seeing is far more complicated than most believe. The intuitive notion of seeing is that we point our eyes toward something, its image forms in the eye and it is perceived. This, however, is fundamentally incorrect because seeing requires much more mental processing. At every moment, the environment bombards the senses with vast quantities of information. This deluge far exceeds our mental processing ability, so we need some way to filter away irrelevant information while allowing the relevant to be fully processed and perceived. This filter is called attention, and without it, we would not be able to focus on the task at hand.

The existence of attentional filtering has two important consequences. First, most of the world goes unnoticed. For example, notice the feeling of the sole of the foot against the inside of the shoe. Notice the movement of the chest as breaths come in and out. Listen to the background sounds. All of this sensory input was filtered away until I pointed it out. The attentional filter had long ago decided that these sensations were irrelevant and not worth the mental resources needed to consciously perceive them.

Second, the more attention paid to one task, the less that is available for others. People who focus narrowly are more likely to be inattentionally blind. In one dramatic instance, an airliner crashed because the pilot and co-pilot became so engrossed in a blinking panel light that they failed to notice that the aircraft was headed straight into the ground. More commonly, talking on a cell phone, for example, may take attention away from important driving tasks (Green, 2001).

Attention's main job is to select a subset of the world for complete processing. To be effective, attention must correctly identify important information so that it is not filtered away. Safety professionals have long been interested in the issue of conspicuity, getting people to notice unexpected objects. While the traditional focus has been on innate sensory factors such as color, contrast and motion, recent research suggests that conspicuity results more from learning and adaptation. Experienced people develop expectations and mental models that permit pre-programming of behavior and minimization of thought for routine, frequently performed tasks. If people were required to stop and think about every sight, sound and decision, then they would be virtually immobilized. Instead, adaptation allows people to compensate for limited attention by only noticing what is likely to be important.

Adaptation, however, is a two-edged sword because it improves performance when circumstances are stable but can create errors when the situation changes. There are several ways that expectation might have contributed to the errant KCL administration. First, the nurse had often administered KCL but had not used Lasix for almost six months. When she reached into the drawer, she simply followed her typical routine of removing the familiar KCL vial. This is an example of a "capture" error, where one behavior hijacks another in midstream. For example, to call my dentist, I must dial 10 digits. The first 6 digits are the same as those of my residence. I frequently call home (I travel a lot) but seldom call my dentist. There have been occasions, however, when I meant to dial my dentist but ended up calling my home number. Both acts begin with the same behavior, dialing the same 6 numbers, but the stronger habit of dialing home overrules my conscious intention of dialing the dentist. This is an example of a more general phenomenon called "automatic behavior." People who learn a task, such as playing a piece on the piano, start by focusing attention on each response. With practice, the individual responses chain together to become a single response. Once started, the response continues without attentional supervision. The action proceeds on "muscle memory," the proprioceptive guidance from the hands themselves rather than from visual guidance. However, any unusual change in the situation will go unnoticed. For the nurse, the series of behaviors, walking to the drawer, reaching in, and extracting a vial, had chained together to become a single response. Once initiated, it ran its familiar course. In this case, the familiar course was retrieving a vial of KCL.

Automatic behavior is an especially frequent cause of error. One nurse, for example, accidentally turned off the alarm on computerized equipment monitoring a critically ill patient. Normal operation required her to set the alarm to "on" and then to confirm the choice with several more responses on the computer keyboard. She viewed a series of computer screens containing information about the alarm system. After each, she was to press the enter key again to confirm and to see the next screen. With experience, the responses chained and became automatic. She would set the alarm and begin merely hitting the enter key rapidly - tap, tap, tap - without really monitoring the screen information. She had done this many times before and the screens had never revealed any important information, so she began (unconsciously) conserving attention and increasing efficiency by ignoring the "irrelevant" information. On this one occasion, she missed the screen saying that the alarm was still off. The response chain, once started, had run off without supervision.

Expectation also prevented the nurse who administered the KCL from reading the vial's label. As a novice, she likely selected the vial and then carefully scrutinized the label. Reading print on a small vial is an arduous task requiring close attention. Most people will adapt and "cue generalize" by relying on a simpler sensory cue, selecting by familiar location in the drawer or by packaging color, shape, etc. The stock drawer in the KCL incident had no dividers, so the vial positions could become scrambled, possibly confounding the habit of reaching in a familiar location. The errant choice was also likely abetted by visual confusion. One study (Patient Safety, 1998) found that in several previous accidents KCL had been frequently confused with other drugs, including Lasix, due to similar packaging. Since her expectations and the shift in selection cues developed without awareness, she could not explain the lapse.

These results are hardly surprising given recent studies of eye movements during routine behavior (see Land, 2006 for a review). People engaged in goal-directed tasks learn to direct attention almost exclusively toward objects that are relevant to task completion. These studies find that control of attention is directed primarily by top-down task relevance and expectation much more than by any sensory conspicuity factor. As a consequence, attempts to avoid errors by attracting attention with sensory cues such as color, size and shape, and especially text, are likely to have very limited effectiveness.

It may seem unintuitive that someone could look at a label and fail to grasp the meaning, but sensing is not perceiving. Normally people go from sensing sound and images to perceiving meaning so quickly that they are unaware that these are two different mental processes. The classic example is the cocktail party where you are having a conversation. You understand the words of your partner and are vaguely aware of the buzz of unintelligible conversations in the background. That is, you sense the sounds, but do not perceive their meaning. In the same way, the nurse can see the label but not understand its meaning, usually due to divided attention and/or expectation.

The same cue generalization that caused the initial selection error helped defeat her three "safety" checks, where she repeatedly failed to notice that the label said KCL. The failure was further strengthened by the psychological phenomenon of "confirmation bias," the strong tendency to seek information that confirms already held beliefs. The nurse had learned that her initial selection was based on a reliable cue, so there seemed to be no additional benefit from rereading the label. After all, it would only confirm the "correct" selection that was already made.

Basing safety on the assumption that people will always read labels or instructions accurately is begging for disaster, as demonstrated by a similar error in the Colorado case. The pediatric pharmacist was absent, and the replacement pharmacist was not used to dealing with infants. She misread the dosage, perceiving an extra 0, so that it was 10 times greater than it should have been for an infant. In fact, she misread the dosage similarly in two separate sources. The pharmacist likely perceived the higher number because it was what she expected to see from her experience with adults. Moreover, confirmation bias further made the attempt to check the dosage in a second source unreliable. People cannot be relied upon to be infallible in checking their own work. While the old adage says that "seeing is believing", it is also true that "believing is seeing".

These situations could just as easily be viewed as reasonable applications of experience as "errors." They are better described as violated expectations, where rare circumstances conspired to confound highly adaptive and intelligent behavior. The difference between acting on reasonable expectation and error is the outcome, not the behavior itself. The very definition of skill is the ability to use these mental shortcuts to increase efficiency that would otherwise be lost to the attentional bottleneck. If every decision required careful conscious thought and scrutiny of all information sources, then an organization would grind to a halt.

The nurse performing the administration did not know about the negative outcome prior to acting, so the reasonableness of his/her behavior cannot be viewed in light of its consequences. Unfortunately, most accident investigations are blinded by several cognitive biases. The most prominent is hindsight bias, the tendency to judge actions on what is known now instead of what was known then. Unlike the people performing the errant act, the investigators in hindsight can focus attention on the one critical aspect of the situation and ignore the irrelevant. They are not limited by time pressure, fatigued by overwork, required to divert attention to performing or planning other tasks or blinded by expectations. And, of course, they know the outcome. They too often attempt to disembody behavior by removing it from its context. All behavior is embedded as much in the past as in the present. Much behavior is inexplicable unless viewed in light of the past events and experiences that shaped it.

Another bias is to substitute labels for explanations. The nurse administering the drug was presumably negligent because she was "inattentive." Consider the logic used to arrive at this conclusion:

Question: Why didn't the nurse correctly read the label?
Answer: Because she was inattentive.
Question: How do we know that she was inattentive?
Answer: Because she didn't read the label.

This is circular reasoning, which cannot be disproved. It occurs when people use a label to merely restate what is already known in new terms. "Inattention" is no more than a shorthand description for what occurred. Of course, the nurse did not attend to the label. We already knew that, so there is no additional information provided by attaching the label "inattentive" to her. The real question is why the label failed to engage attention and why her actions seemed reasonable to her at the time. As explained above, the answer lies in the normal operation of attention and her experience.

Further, the critical cause often lies more in the physical environment. In the Lasix/KCL mix-up, the problem was the drug storage; the stock drawer had no clear-cut dividers to separate the different drugs from one another. In fact, it could be questioned whether the KCL should have been in the same stock drawer at all. In other cases, it is similar packaging or other aspects of the system design.

While causation can be attributed to the person or the circumstances, humans have a strong bias toward blaming people. This "fundamental attribution error" (Ross, 1977) has been found across a wide range of situations and is almost impervious to refutation. It occurs even though it is more accurate to assume that most people would perform the same actions in the same circumstances.

The fundamental attribution error helps people arrive at a simple explanation. If a medical technician, for example, presses the wrong key on a computer keyboard and delivers a fatal dose of radiation, as in the Therac-25 deaths, it is simpler to explain the event by saying that he was careless (another popular descriptive label) rather than to work through the complexities of faulty computer interface design, an organizational system that failed to correct known errors or inadequate training. The same innate mental functioning that strives for efficiency and simplicity both causes accidents and misleads those who investigate them.

Error Reduction Requires Modifying Design, Not Behavior

The journalist H. L. Mencken once said that "there is always a well-known solution to every human problem - neat, plausible, and wrong." This is especially true in medical error prevention. There is no point in telling people to "pay attention," because most skilled behavior is automatic and not under conscious control. People, by definition, are not aware that they aren't aware.

Similarly, calls for more instruction and education are also futile. Instruction only affects conscious, voluntary actions. It might set beginners on the right track and prevent "knowledge-based errors," which occur when someone makes an incorrect conscious decision and then carries it out as intended (consciously intend to administer KCL and administer KCL), but it won't affect people who form the correct intention but perform the wrong act (consciously intend to administer Lasix but administer KCL). Experienced and skilled people are usually operating in an automatic or semi-automatic mode and are not consciously making decisions.

Strict enforcement of rules and procedures sounds good, but skilled practitioners base their actions on experience and efficiency and generally do not follow rules exactly. There is great irony when medical staff are chastised for failing to follow some procedure. A hospital, or any other organization, is quickly brought to its knees when employees actually follow procedures to the letter and "work to rule."

Since attempting to change cognitive functioning is futile, the best accident reduction strategy is to anticipate human cognition and to design systems that prevent the likely error. The Lasix/KCL confusion was highly predictable given the circumstances. The environment could have been altered to minimize the likely effects of expectation and automatic behavior. The stock drawer containing the medications could have had dividers to prevent drug positions from changing, and the packaging could have been more distinctive. While these are improvements, "inattentional blindness" could still occur. The better strategy would be to simply prevent any possibility of error by keeping the KCL off the floor, which is becoming common practice. However, some nurses are resisting the change because KCL may be needed quickly. This highlights one problem with creating safe systems: the same steps that might reduce errors would also create a cumbersome and inefficient operation. Safety is never the only goal of any organization, so it must be weighed against cost.

Similarly, the accident involving the nurse who turned off the alarm system could have been predicted and corrected by anyone with even a rudimentary knowledge of human cognition and computer interface design. The confirmation responses could have been made unpredictable, varying each time depending on the screen contents. The user would then be forced to actually read the screen in order to know what to do next and would be prevented from developing automaticity. The accident also demonstrates another common design flaw, the failure to clearly signal the system state. In the infamous Therac-25 radiation incidents, the computer failed to properly reveal which beam was operational. The inadvertent switching off of the alarm might also have been prevented by additional feedback, perhaps a tone or voice message, indicating whether the alarm was on or off.
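
To make the design principle concrete, the following is a minimal, hypothetical sketch (written here in Python purely for illustration; the prompts, function names and wording are invented and not drawn from any actual monitoring system). The confirmation cannot be completed by a memorized "tap, tap, tap" habit because the required response varies with the displayed alarm state, and the resulting state is explicitly announced afterward.

    # Hypothetical sketch of a confirmation step that defeats automatic responses.
    # Instead of accepting a bare Enter key, the dialog requires the user to type
    # back the alarm state shown on screen, so the response cannot be produced
    # without reading the screen. All names and prompts are illustrative only.
    import random

    def confirm_alarm_state(alarm_on: bool) -> bool:
        """Return True only if the user explicitly acknowledges the displayed state."""
        state = "ON" if alarm_on else "OFF"
        # Vary the required wording so a fixed keystroke habit cannot satisfy it.
        token = random.choice(["CONFIRM", "ACCEPT", "VERIFY"])
        print(f"Patient monitor alarm is currently: {state}")
        reply = input(f'Type "{token} {state}" to continue, or anything else to stop: ')
        return reply.strip().upper() == f"{token} {state}"

    def set_alarm(requested_on: bool) -> bool:
        """Apply the setting only after an explicit, non-automatic confirmation."""
        if not confirm_alarm_state(requested_on):
            print("Setting not applied; alarm state unchanged.")
            return False
        # Explicit feedback on the resulting system state, addressing the second
        # design flaw described above: the state was never clearly signaled.
        print(f"Alarm is now {'ON' if requested_on else 'OFF'}.")
        return True

    if __name__ == "__main__":
        set_alarm(requested_on=True)

The essential design choice is that the confirmation cannot succeed without reading the screen, which blocks the very automaticity that caused the error, while the closing message removes any ambiguity about whether the alarm ended up on or off.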

Blaming the nurse is not likely to produce better-designed systems and equipment. Quite the contrary, it masks the real causes of accidents, which exist in the world rather than in the head, and prevents remedial action. Humans will always make errors because of normal cognitive processes, so it is impossible to eliminate accidents purely by managing behavior. Instead, errors can best be prevented by proper understanding of human nature and by designing systems that take human nature into account. Errors may be inevitable, but accidents are not.

References

Green, M. (2001). Do mobile phones pose an unacceptable risk? A complete look at the adequacy of the evidence. Risk Management, November, 40-48.

Green, M. (2001). Inattentional blindness. Occupational Health & Safety Canada, Jan/Feb, 23-29.

Kay, H. (1971). Accidents: Some facts and theories. In P. Warr (Ed.), Psychology at Work (pp. 121-145). Baltimore: Penguin.

Land, M. (2006). Eye movements and the control of actions in everyday life. Progress in Retinal and Eye Research, 25, 296-324.

Lanir, Z. (1986). Fundamental Surprise. Eugene, Oregon: Decision Research.

Leape, L. (1994). Error in medicine. Journal of the American Medical Association, 272, 1851-1858.

Patient Safety (1998). Sentinel Event Alert, Issue 1. Joint Commission on Accreditation of Healthcare Organizations.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In L. Berkowitz (Ed.), Advances in Experimental Social Psychology, Vol. 10 (pp. 173-220). New York: Academic Press.

Smetzer, J. (1998). Lesson from Colorado: Beyond blaming individuals. Nursing Management, 29, 49-51.