
Medical Equipment: Good Design or Bad Design?

Marc Green


Imagine that you are designing a new computerized system for monitoring the vital signs of critical care patients. Because it is a critical system, it must include alarms for high-risk situations. You plan on having the floor nurse who activates the device for a new patient set the alarms, but you also recognize the need for the nurse to review and explicitly confirm all settings before activation: the nurse will set up the system, review the device settings, and confirm that the system is ready for use.

A hypothetical design for the operator interface of your patient monitoring system might look like this. The nurse enters the patient information into a form on the system's computerized display. Next, the computer presents the nurse with a series of screens showing the system's alarm and data monitoring settings. The nurse reviews a screen, confirms the input by hitting "enter", and then views the next screen. After three screens of information have been confirmed, the system is set and ready for use.
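For concreteness, the flow just described can be reduced to a few lines of code. The sketch below is a minimal Python illustration of the three-screen, confirm-by-enter sequence; the screen contents and function names are assumptions made for illustration, not the actual device's software.

    # Hypothetical sketch of the confirmation flow described above.
    # Screen contents and names are illustrative assumptions only.

    ALARM_SCREENS = [
        "Screen 1: patient data and monitored channels",
        "Screen 2: alarm thresholds",
        "Screen 3: alarm status and activation summary",
    ]

    def confirm_screen(screen_text):
        """Display one settings screen and wait for the nurse to press enter."""
        print(screen_text)
        input("Press enter to confirm these settings...")

    def activate_monitor():
        # Every confirmation is the same response (pressing enter),
        # the very feature that invites the response chaining discussed below.
        for screen in ALARM_SCREENS:
            confirm_screen(screen)
        print("System set and ready for use.")

    if __name__ == "__main__":
        activate_monitor()

Note that nothing in this flow distinguishes a screen whose contents have changed from one that has not; the nurse's keystrokes are identical either way.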

Is this a good design?

The answer is no. A similar device's design caused a patient's death because a nurse failed to detect an alarm malfunction. When the hospital first introduced the device, the nurse viewed each screen and inspected its contents. With experience, however, her responses "chained" and became automatic: she would confirm each screen's settings by quickly hitting the enter key without really reviewing the screen's information. On the fateful day, the alarm was not set due to a malfunction, but the nurse failed to notice. The patient subsequently arrested; there was no alarm, and the patient died.

This error was highly predictable by anyone knowledgeable in human factors. As people learn a task, their behavior goes from "controlled" to "automatic." In controlled behavior, people closely monitor the world and how their movements change it. With practice, two things happen. First, the person ceases to pay close attention to visual input, so unexpected changes will likely go unnoticed. Second, individual responses chain together to form a single extended response. The action proceeds on "muscle memory," the proprioceptive guidance from the hands themselves rather than from visual information: the first enter response provides the cue for the next. A series of identical responses, such as pressing the enter key three times in a row, strongly encourages response chaining.

For every new patient, the nurse would set the alarm and invariably receive a message confirming that it was set. She unconsciously learned that the screens contained no new information and that her attention was better allocated to other matters. She also likely exhibited "confirmation bias," a powerful mental tendency to seek evidence that verifies already-held beliefs. When the alarm system failed, she missed the screen saying that the alarm was still off. The response chain, once started, ran off without conscious supervision.

This error was not due to a mental lapse by the nurse. On the contrary, she had learned what all skilled professionals learn: to ignore irrelevant information and to attend only to what matters. The error was "normal," predictable, and due to faulty operator interface design.
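By way of contrast, consider how the same confirmation step might be designed so that identical keystrokes cannot complete it. The sketch below is an assumed countermeasure offered purely for illustration, not a fix described here or any real device's behavior. Requiring the nurse to type the alarm status actually displayed makes an unreviewed confirmation much harder to chain.

    # Hypothetical chaining-resistant confirmation (illustrative assumption).
    # The nurse must type the alarm status actually displayed, so a
    # memorized keystroke sequence cannot carry her past an unset alarm.

    def confirm_alarm_status(alarm_is_set):
        status = "ON" if alarm_is_set else "OFF"
        print("Alarm status: " + status)
        typed = input("Type the alarm status shown above (ON/OFF): ").strip().upper()
        if typed != status:
            print("Entry does not match the displayed status. Please review.")
            return False
        # Activation proceeds only when the alarm is confirmed to be on.
        return alarm_is_set

    if __name__ == "__main__":
        ready = confirm_alarm_status(alarm_is_set=False)
        print("Ready for use." if ready else "Activation blocked.")

Such a design trades a little efficiency for safety, which is precisely the tension discussed below.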

Conclusion

The lesson is clear: human factors is important not only for making medical devices more efficient and more usable, but also for making them safer. Unfortunately, these goals sometimes conflict, as when a streamlined confirmation sequence invites automatic responding. Proper human factors design requires consideration of both usability and safety.

For a more complete discussion of these issues, see Green, M. (2004). "Nursing Error and Human Nature," Journal of Nursing Law, 9, 37-44.