Apply Human Factors Design to Ambulatory Safety

April 2014 - Vol.11 No. 4 - Page #12


Q&A with Rollin J. “Terry” Fairbanks, MD, MS
Director of the National Center for Human Factors in Healthcare

Pharmacy Purchasing & Products: What is human factors engineering (HFE) and how does it apply to the health care environment? 
Rollin J. “Terry” Fairbanks: Human factors applies knowledge about human performance strengths and limitations in the design of interactive systems of people, equipment, and environments to ensure their effectiveness, safety, and ease of use.1 HFE is an applied science that draws from multiple disciplines to consider human capabilities and limitations when designing devices, products, and processes. Any time humans interact with systems, human factors—such as perceptual, cognitive, emotional, social, cultural, and biomechanical (physical ergonomics) concerns—must be taken into consideration. HFE concerns emerged during World War II military and aviation operations, and were leveraged to improve nuclear power safety and aviation safety in the 1970s. In recent years, HFE has been utilized in health care to improve patient safety and increase efficiency.

Traditionally in health care, our approach has been that when an error occurs, we simply train staff to do better next time. However, the reality is that any time humans are involved in a process, there is the potential for error—and this human factor is inevitably repeated by others in similar roles, so individual training does not have a major impact. HFE is predicated on the belief that while human error cannot be eliminated, the harm associated with human errors can be reduced or mitigated. Too often in health care, a name, blame, train, and shame approach is undertaken after an error has occurred. However, this approach is not effective, as its ultimate goal—eliminating human error—is impossible. In fact, focusing too many resources on the attempt to eliminate individual human error is misdirected because it can distract from the modifiable system contributions; instead, to ensure safety, the goal must be to design systems that are capable of identifying errors before they reach the patient. For example, when two medication vials are the same size and have similar labeling, this creates the potential for error that may injure a patient. Although nurses may carefully double-check medications before they are administered, the opportunity for error cannot be eliminated. However, ensuring that medication packaging is easily distinguishable, and that look-alike/sound-alike medications are stored apart from one another, helps reduce the risk of error.

PP&P: What are the benefits of applying HFE within health care?
Fairbanks: Utilizing HFE in health care can improve patient safety and outcomes, increase efficiencies, improve communication among health care providers, and increase cost savings through prevention and mitigation of adverse events. The goals of HFE include making it easier for health care providers to comply with appropriate practices, reducing failure modes, and automating routine tasks.  Awareness of human factors, such as mental workload, distractions, the physical environment, device and product design, and process, can provide many benefits. For example, HFE2:

  • Helps health care providers understand why errors occur and which errors are most likely to threaten patient safety 
  • Describes how certain tools can help to lessen the likelihood of patient harm, such as the automatic coupling of orders that must occur together to ensure safety. For example, a new warfarin order should automatically trigger an order for daily INR checks
  • Improves the hospital culture of safety 
  • Enhances teamwork and communication among health care staff
  • Improves the design of health care systems and processes
  • Improves the design of medical devices
  • Helps identify what went wrong and what could go wrong in the future
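The order-coupling idea above—a warfarin order automatically triggering a daily INR check—can be sketched as a simple rule table. This is a hypothetical illustration of the concept, not the API of any real computerized order entry system, and the heparin rule is an invented second example:

```python
# Hypothetical sketch of automatic order coupling: ordering a trigger
# medication also places the safety orders that must accompany it.
# Rule names and structure are illustrative only.
COUPLED_ORDERS = {
    "warfarin": ["daily INR check"],
    "heparin infusion": ["aPTT check every 6 hours"],  # invented example
}

def place_order(medication, order_list):
    """Add the medication plus any safety orders coupled to it."""
    order_list.append(medication)
    for companion in COUPLED_ORDERS.get(medication.lower(), []):
        if companion not in order_list:
            order_list.append(companion)
    return order_list

orders = place_order("Warfarin", [])
# orders now contains the warfarin order and a daily INR check
```

The design point is that the coupling lives in the system, not in any individual's memory: the companion order cannot be forgotten, because the clinician never has to remember it.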

The two most significant benefits of applying HFE to health care are increased patient safety and improved operational efficiency. When human factors are considered in the design of systems and processes, this affects the error rate, as well as the type of errors that will likely occur. Developing an understanding of which errors are inevitable is useful when creating processes that are less likely to allow errors. When an error does occur, it is crucial to have methods in place to mitigate the impact of these errors. Many progressive hospital systems are developing Go Teams that are activated after an adverse event occurs to provide transparency with the patient and their family, support the caregivers involved, and begin an immediate review to maximize learning from the event.  

HFE is a broad discipline, and numerous sub-disciplines exist within its framework. One of these sub-disciplines addresses teamwork, communication, team leadership, and team training. Crew Resource Management (CRM)—a fundamental application of HFE that comes from the aviation industry—is the application of human factors knowledge to the conduct of flight operations with the objective of efficiently using all available resources (ie, equipment, systems, and people) to achieve safety. One of the main goals of CRM is to elicit feedback and promote communication among all members of the flight team, including the pilot, co-pilot, and all other team members. CRM seeks to decrease the impact of the traditional hierarchy wherein the captain is in charge and other members of the flight team feel they should not question the captain's judgment. When certain staff members perceive themselves as lower in the hierarchy, they are less likely to speak up even when they identify possible safety concerns, and this reduces the power of the team. The goal of CRM is to encourage all staff members to feel comfortable questioning possible safety concerns, and to give them tools to help enable this sometimes difficult step.

While the aviation industry has been effective in developing this type of culture, we have not had the same successes in health care. HFE principles suggest that patient safety will improve if all members of the health care team—eg, physicians, physician assistants, nurses, pharmacists, and technicians—feel comfortable questioning patient care decisions that appear incorrect or have been made in error. Utilizing a hospital-wide CRM approach in the health care environment improves communication, and thereby improves safety and reduces inefficiencies. 

PP&P: How does the adoption of HFE impact the bottom line?
Fairbanks: Beyond the obvious answer—that fewer errors and adverse events mean fewer instances of possible litigation against the hospital and lower costs related to the aftercare of patients who have been injured—a hospital informed by HFE is more efficient. One way to take advantage of improved efficiency through HFE is to employ an HFE consultant with expertise in performing a root cause analysis to improve safety and increase efficiency. Including safety engineers, such as human factors specialists, may be useful to help us gain a fresh perspective. While it is a common sentiment in health care that we should be able to solve all of our problems in-house, it is prudent to recognize when seeking outside assistance can be beneficial. An HFE expert can provide a comprehensive analysis of all areas where HFE principles would improve patient care and increase efficiencies. The Human Factors and Ergonomics Society Web site contains a directory of members who can provide a comprehensive human factors analysis.

PP&P: How can errors be avoided or mitigated in an environment like the emergency department (ED), where change is a constant?
Fairbanks: It is true that the ED is a complex, constantly changing environment. However, even in busy environments, the types of change—and thus the types of errors that may occur—are nonetheless often predictable. As such, the more variables that exist in an environment on a day-to-day basis, the more important it is to employ HFE principles. For example, errors commonly occur when our routine activities change unexpectedly. We have all experienced this when exiting a building where the doors must be pushed open. Because we expect doors with handles to be pulled open, we will unconsciously pull the door, even one that is clearly marked “Push.” This is because the door’s design conveys a stronger message than the sign instructing us how to operate it. Similar concepts apply in health care. For example, when attempting to mitigate a potential error during tasks that providers do every day—such as retrieving a commonly used medication from the automated dispensing cabinet—it is unlikely that warning signs or labels will be effective in mitigating hazards. HFE concepts must be taken into account.

In any complex environment like the ED, staff still must complete numerous routine tasks every day. If we are not conscious of ensuring that design aligns with how devices are actually operated, errors may occur. For example, oxygen valve tips are color-coded green, and air valve tips are yellow. However, the valves are interchangeable, so a green tip can inadvertently be placed on the air-flow valve, which can result in a patient being deprived of oxygen. This underscores the fact that poorly designed color coding can actually increase the risk of error. In the ED, it is unwise to rely on a health care worker to become better at following instructions, policies, and procedures—error is better averted by ensuring the design of processes is less error-prone. Therefore, when a new task must be undertaken in the ED, carefully designed systems will reduce the likelihood of error.

PP&P: How can HFE inform the development of new technologies and reduce the rate of technology-related medication errors?
Fairbanks: Hospital automation design must be informed by human factors to ensure technologies function safely. One of the best examples of how HFE can improve automation safety is the changes that have been made to infusion pumps since their first-generation design.3 Early versions of infusion pumps were designed to improve the accuracy and continuity of IV infusions by allowing nurses to program an hourly rate and volume. However, studies showed that these early devices were involved in approximately 35% to 60% of the estimated 770,000 ADEs that occur annually in the US.3-5 The majority of these errors resulted from nurses manually inputting incorrect pump settings. The most common errors included unit errors, multiple-of-ten errors, miscalculations, and push-button mistakes.6 Fatal errors occurred as a result of decimal point entry mistakes when programming infusion pumps (eg, programming morphine at 90 mL/hr rather than 9.0 mL/hr, causing a 10-fold overdose).3 Another common error in early generation infusion pumps was the double key bounce, wherein a single press of a button was intended, but the button press inadvertently registered twice. Thus, although the introduction of smart infusion pumps has the potential to bring improvements to medication safety, the original devices demonstrated how the risk of adverse events could be impacted by the level of consideration given to human factors during the design process.3-5 (See SIDEBAR 1.)
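Two of the pump-programming failure modes described above—the 10-fold decimal-point error and the double key bounce—are exactly the kind of errors design can intercept. The sketch below shows the general idea behind a hard dose limit and keypad debouncing; the drug limit values and class design are assumptions for illustration, not any manufacturer's implementation:

```python
import time

# Hypothetical hard rate limits per drug (mL/hr); values invented for
# illustration. A smart pump's drug library serves a similar role.
DRUG_LIMITS_ML_PER_HR = {"morphine": (0.5, 20.0)}

def check_rate(drug, rate_ml_per_hr):
    """Block a programmed rate outside the drug's hard limits."""
    low, high = DRUG_LIMITS_ML_PER_HR[drug]
    if not (low <= rate_ml_per_hr <= high):
        return f"BLOCKED: {rate_ml_per_hr} mL/hr outside {low}-{high} mL/hr"
    return "OK"

class DebouncedKeypad:
    """Ignore a repeat press of the same key within `window` seconds,
    treating it as mechanical key bounce rather than a new press."""
    def __init__(self, window=0.05):
        self.window = window
        self._last_key = None
        self._last_time = -float("inf")

    def press(self, key, now=None):
        now = time.monotonic() if now is None else now
        if key == self._last_key and (now - self._last_time) < self.window:
            return False  # bounce: second registration is discarded
        self._last_key, self._last_time = key, now
        return True

# A 90 mL/hr morphine entry (intended 9.0 mL/hr) trips the hard limit:
print(check_rate("morphine", 90.0))
```

Neither safeguard asks the nurse to be more careful; both change what the device will accept, which is the HFE point of the passage above.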

PP&P: How can a hospital develop a culture of safety that promotes near-miss and error reporting by staff?
Fairbanks: Education is required to change hospital culture to support near-miss and error reporting. Staff must receive training that enables them to understand that the design of devices and IT systems impacts error rates. Once staff comprehends this, they are better prepared to recognize hazards and unsafe working conditions that could affect patient safety. When staff is able to recognize the potential for error, they are more likely to report concerns and thereby avert errors. Additional tools include non-punitive reporting systems, root cause analyses and incident reviews that examine system factors, and studies of near misses.

Too often in health care, we are praised for making do with less; we accept the given environment and believe that doing our jobs well means making things safe despite the difficult conditions we work in. However, HFE gives us new tools to improve safety; the first step is to change systems to eliminate areas where harm may be introduced. Relying on humans to “do it better next time” simply does not work. Changing this mindset throughout an organization requires hospital-wide support, including that of senior leadership and the C-suite, to be effective.

See Sidebar 2 for additional HFE resources.



  1. Henriksen K, Dayton E, Keyes MA, et al. Understanding Adverse Events: A Human Factors Framework. In: Hughes RG, ed. Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Rockville, MD: AHRQ; 2008.
  2. Clinical Human Factors Group. Implementing Human Factors in Healthcare. Patient Safety First. Accessed March 17, 2014.
  3. Centre for Global eHealth Innovation. Healthcare Human Factors Group: University Health Network. Smart Medication Delivery Systems: Infusion Pumps (April 2009). Accessed March 17, 2014.
  4. Reeves J. Smart pump technology reduces errors. Newsletter: Anesthesia Patient Safety Foundation. 2003;18(1):1-16. Accessed March 17, 2014.
  5. Tourville J. How technology is helping Children’s Medical Center of Dallas reach zero-error tolerance. US Pharm. 2003;28:80-86.
  6. Murdoch LJ, Cameron VL. Smart infusion technology: a minimum safety standard for intensive care? Br J Nurs. 2008;17(10):630-636.

Rollin J. “Terry” Fairbanks, MD, MS, is the director of the National Center for Human Factors in Healthcare in Washington DC, director of MedStar Health’s Simulation Training & Education Lab (SiTEL), associate professor of emergency medicine at Georgetown University, and adjunct associate professor of industrial and systems engineering at the University at Buffalo.

Examples of Human Factors Issues in Smart Pump User Interface Design*

  • The design of the infusion pump screen confuses the user, or the infusion pump does not respond as it should (ie, with a warning or alarm) when inappropriate data is entered.
  • The infusion pump screen does not make clear which units of measurement the user is expected to enter. For example, the user may enter weight in pounds when the infusion pump requires it in kilograms.
  • Pump labels or components become damaged under routine use. For example, cleaning the pump, as the user-maintainer believes is acceptable practice, may damage the pump, making it unreliable for clinical use. Users with long fingernails may damage the print on the pump keys, making them unreadable.
  • User instructions or cues for mechanical set-up are not specific or clear enough. For example, an instruction to attach a tubing set in all required tube holder-clips before closing the pump’s access door may be unclear, resulting in clamped tubing and under-infusion.
  • Inadequately designed alarm functions and settings cause users to miss problems or respond late. For example, an alarm indicating low battery charge may not be displayed in time for a user to prevent pump shut-off during a critical infusion while a patient is in transport. False (nuisance) alarms may decrease users’ sensitivity to all alarms.
  • The infusion pump screen design is clunky or confusing to users, causing a delay in therapy. For example, the Start Infusion key may be located next to the Power key, and a user may turn off the infusion pump instead of initiating infusion. In some cases, programmed settings are lost when a user turns the pump off, and the infusion settings have to be re-entered after the pump restarts.
  • Warnings are displayed so often that users come to ignore them (similar to nuisance alarms), are not detailed enough to prevent misuse, or represent values in ways that are unfamiliar to the user.
  • Warning messages are unclear. In the example below, it is unclear if the user is confirming the warning message or the infusion settings.

Volume in the syringe is inadequate to deliver the programmed dose. PRESS CONFIRM

  • User manuals are confusing, inadequate, outdated, or unavailable. This is particularly of concern for home-based users.
  • When communicating the critical aspects of the pump’s operational, default, or piggyback status, the system does not employ user-friendly language or does not give enough information to guide users through appropriate actions.

*FDA Web site. Medical Devices. Examples of Reported Infusion Pump Problems. Accessed March 17, 2014.
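The unit-ambiguity issue in the list above (pounds entered where kilograms are required) illustrates a general design fix: require the unit alongside the number and convert internally, rather than assuming which unit the user meant. A minimal sketch, with the function name and interface invented for illustration:

```python
# Minimal sketch of unit-explicit weight entry: the value cannot be
# accepted without its unit, and conversion to kilograms happens in
# the system rather than in the user's head.
LB_PER_KG = 2.20462

def weight_in_kg(value, unit):
    """Return weight in kg; the unit must be stated explicitly."""
    unit = unit.strip().lower()
    if unit == "kg":
        return value
    if unit == "lb":
        return value / LB_PER_KG
    raise ValueError(f"unit must be 'kg' or 'lb', got {unit!r}")

# An entry of 80 lb converts to roughly 36.3 kg; a bare "80" with no
# unit would simply be rejected by this interface.
```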

Human Factors Engineering Resources

Video Presentations 
Skiles J, Fairbanks T. Was It Really a Miracle on the Hudson? Video presentation at National Patient Safety Foundation’s 14th Annual Patient Safety Congress. Washington, DC: May 23-25, 2012. Accessed March 17, 2014. 

Annie’s Story: How A System’s Approach Can Change Safety Culture.
Accessed March 27, 2014. 

Web Sites
Bad Human Factors Designs
A scrapbook of illustrated examples of things that are hard to use because they do not follow human factors principles.

FDA Human Factors Program

Grout J. Mistake-proofing the design of health care processes. (Prepared under an IPA with Berry College.) AHRQ Publication No. 07-0020. Rockville, MD: Agency for Healthcare Research and Quality; May 2007.  
Accessed March 14, 2014.

Human Factors and Ergonomics Society

University of Chicago Cognitive Technologies Laboratory

Veterans Affairs National Center for Patient Safety

Carayon P. Handbook of Human Factors and Ergonomics in Health Care and Patient Safety. Mahwah, NJ: Lawrence Erlbaum Associates (Taylor & Francis); 2007.

Casey SM, Casey S. Set Phasers on Stun: And Other True Tales of Design, Technology, and Human Error. 2nd ed. Santa Barbara, CA: Aegean Publishing Co; 1998.

Dekker S. Patient Safety: A Human Factors Approach. Boca Raton, FL: CRC Press (Taylor & Francis Group); 2011.

Gosbee JW, Gosbee LL, eds. Using Human Factors Engineering to Improve Patient Safety: Problem Solving on the Front Line. Oakbrook Terrace, IL: Joint Commission Resources; 2010.

Norman DA. The Design of Everyday Things. New York, NY: Bantam Doubleday Dell Publishing Group; 1988.

Perrow C. Normal Accidents: Living With High-Risk Technologies. Updated ed. Princeton, NJ: Princeton University Press; 1999.

Reason J. Human Error. New York, NY: Cambridge University Press; 1990.

Wickens CD, Hollands JG, Parasuraman R, Banbury S. Engineering Psychology & Human Performance. 4th ed. London, UK: Pearson; 2012.

Short Courses in Medical Human Factors

  • University of Wisconsin Center for Quality & Productivity Improvement Systems Engineering Initiative for Patient Safety
  • Mayo Clinic School of Continuous Professional Development

