Imagine a pilot is upgrading to a new aircraft or a maintainer is training to learn a more complex system. Both of these individuals are very well versed in their current positions but are entering new territory with this training. What human factors issues might arise in a circumstance like this? Address at least two of the human performance factors.
When we think of performance, it is tempting to consider a race car and how we assess its performance by asking how fast it can negotiate a turn while not losing its grip on pavement, or by learning how long it takes to accelerate from zero to 60 miles per hour. What about us? Human performance can be defined as the manner, speed, and accuracy with which individuals accomplish tasks. It is a measure of human activity that expresses how well a human has carried out an assigned, well-defined task, or a portion of a task (task element). Human performance is a function of speed and accuracy. Here we focus mostly on the accuracy component of human performance. If a task is not performed “accurately” in accordance with its requirements, an error has occurred. Accidents are generally caused by situations in which a person’s capabilities are inadequate or are overwhelmed in an adverse situation. Humans are subject to such a wide range of varying situations and circumstances that not all can be easily foreseen. Careful attention should therefore be given to all the factors that may have influenced the person involved. In other words, consideration must be given not only to the human error (failure to perform as required) but also to why the error occurred. Variables that affect human performance can be grouped into seven categories: physical factors, physiological factors, psychological factors, psychosocial factors, hardware factors, task factors, and environmental factors. These factors are now briefly reviewed.
1. Physical factors include body dimensions and size (anthropometric measurements), age, strength, aerobic capacity, motor skills, and body senses such as visual, auditory, olfactory, and vestibular.
2. Physiological factors include general health, cerebral blood flow and oxygenation, and medical conditions such as low blood sugar, irregular heart rates, incapacitation, illusions, and history of injury, disability, or disease. Also included in this category are human conditions brought on by lifestyle, such as the use of drugs, alcohol, or medication; nutrition; exercise; sports; leisure activities; hobbies; physical stress; and fatigue.
3. Psychological factors include mental and emotional states, mental capacity to process information, and personality types (introverts and extroverts). Some human personality traits include the following:
● Motivation is a desire of an individual to complete the task at hand. Motivation affects one’s ability to focus all the necessary faculties to carry out the task.
● Memory allows us to benefit from experience. It is the mental faculty that allows us to prepare and act upon plans. Memory can be improved through the processes of association, visualization, rehearsal, priming, mnemonics, heuristics, and chaining. Memory management organizes remembering skills in a structured procedure while considering time and criticality. It is a step-by-step process to increase the accuracy and completeness of remembering.
● Complacency can lead to a reduced awareness of danger. The high degree of automation and reliability present in today’s aircraft and the routines involved in their operation are all factors that may cause complacency.
● Attention (or its deficit) determines what part of the world exists for you at the moment. Conscious control of attention is needed to balance the environment’s pull on attention. An intrapersonal accident prevention approach would describe the hazardous states of attention as distraction, preoccupation, absorption, and attention mismanagement—the inability to cope with tasks requiring flexible attention and focused tracking and steering. The inability to concentrate can lead to lack of (situational) awareness, which has been identified as a contributing factor in many accidents and incidents.
● Attitude strongly influences the functioning of attention and memory. Attitudes are built from thought patterns. An intrapersonal approach to the attitudes of team members attempts to identify the desirable ranges between such hazardous thought patterns as macho–wimp, impulsive–indecisive, invulnerable–paranoid, resigned–compulsive, and antiauthority–brainwashed.
● Perceptions can be faulty. What we perceive is not always what we see or hear. Initial perceptions and perceptions based solely on intended actions are especially susceptible to error. An intrapersonal approach prescribes ways to make self-checking more efficient and reliable.
● Self-discipline is an important element of organized activities. Lack of self-discipline encourages negligence and poor performance.
● Risk taking is considered by some to be a fundamental trait of human behavior. It is present in all of us to varying extents since an element of risk is present in most normal daily activities. Risk will be present as long as aircraft fly and penalties for failure are high. Accordingly, the taking of risks needs to be carefully weighed against the perceived benefits.
● Judgment and decision making are unique capabilities of humans. They enable us to evaluate data from a number of sources in the light of education or past experience and to come to a conclusion. Good judgment is vital for safe aircraft operations. Before a person can respond to a stimulus, he or she must make a judgment. Usually good judgment and sound decision making are the results of training, experience, and correct perceptions. Judgment, however, may be seriously affected by psychological pressures (or stress) or by other human traits, such as personality, emotion, ego, and temperament.
● Aeronautical decision making (ADM): The FAA describes ADM as a systematic approach to the mental process used by aircraft pilots to consistently determine the best course of action in response to a given set of circumstances (FAA Advisory Circular 60-22). The FAA Pilot's Handbook of Aeronautical Knowledge recommends the DECIDE model to provide the pilot with a logical way of making decisions. The DECIDE acronym means to Detect, Estimate, Choose a course of action, Identify solutions, Do the necessary actions, and Evaluate the effects of the actions (Figure 3-5).
FIGURE 3-5 The DECIDE model has been recognized worldwide as an effective continuous loop decision-making process: the decision maker detects the fact that change has occurred, estimates the need to counter or react to the change, chooses a desirable outcome (in terms of success) for the flight, identifies actions which could successfully control the change, takes the necessary action, and evaluates the effect(s) of that action in countering the change. (Source: www.faa.gov)
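The DECIDE loop in Figure 3-5 can be sketched in code. This is a hedged illustration only: the `decide_loop` function and the toy altitude scenario below are invented for this sketch and are not FAA material.

```python
# Sketch of the DECIDE continuous-loop decision process (Figure 3-5).
# Function names and the toy scenario are illustrative only.

def decide_loop(detect, estimate, choose, identify, do, evaluate, situation):
    """One pass through Detect-Estimate-Choose-Identify-Do-Evaluate."""
    change = detect(situation)              # Detect that change has occurred
    if not estimate(change):                # Estimate the need to react to it
        return situation                    # no action needed on this pass
    outcome = choose(situation)             # Choose a desirable outcome
    actions = identify(situation, outcome)  # Identify actions that control the change
    for action in actions:                  # Do the necessary actions
        situation = do(situation, action)
    return evaluate(situation, outcome)     # Evaluate the effects (feeds the next pass)

# Toy scenario: the aircraft has drifted 500 ft below its assigned altitude.
result = decide_loop(
    detect=lambda s: s["target_alt"] - s["altitude"],   # deviation in feet
    estimate=lambda deviation: abs(deviation) > 100,    # worth reacting to?
    choose=lambda s: s["target_alt"],                   # desired outcome
    identify=lambda s, goal: ["climb"],                 # candidate actions
    do=lambda s, action: {**s, "altitude": s["target_alt"]},
    evaluate=lambda s, goal: s,                         # hand back the new state
    situation={"altitude": 9500, "target_alt": 10000},
)
```

In a real continuous loop the Evaluate step would feed the next Detect pass; the single pass above is only meant to show the ordering of the six steps.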
4. Psychosocial factors include mental and emotional states due to death in the family or personal finances, mood swings, and stresses due to relations with family, friends, coworkers, and the work environment. Some of the factors that cause stress are inadequate rest, too much cognitive activity, noise, vibration and glare in the cockpit, anxiety over weather and traffic conditions, anger, frustration, and other emotions. Stress causes fatigue and degrades performance and decision making, and the overall effect of multiple stresses is cumulative. Interactions with coworkers are influenced by two important variables, namely, peer pressure and ego.
● Peer pressure can build to dangerous levels in competitive environments with high standards, such as aviation, in which a person’s self-image is based on a high standard of performance relative to his or her peers. Such pressure can be beneficial in someone with the necessary competence and self-discipline, but it may be dangerous in a person with inferior skill, knowledge, or judgment. For example, a young, inexperienced pilot may feel the need to prove himself or herself and may, therefore, attempt tasks beyond his or her capability. Humans have many conflicting “needs,” and the need to prove oneself is not limited to the young or inexperienced. Some persons, because of training or background, have a fear that others may consider them lacking in courage or ability. For such persons, the safe course of action may be perceived as involving an unacceptable “loss of face.”
● Ego relates to a person’s sense of individuality or self-esteem. In moderate doses, it has a positive effect on motivation and performance. A strong ego is usually associated with a domineering personality. For pilots in command, this trait may produce good leadership qualities in emergency situations but it may also result in poor crew or resource management. The domineering personality may discourage advice from others or may disregard established procedures, previous training, or good airmanship. Piloting an aircraft is one situation in which an overriding ego or sense of pride is hazardous. Although usually not specifically identified as such in accident reports, these traits may often be hidden behind such statements as “descended below minima,” “failed to divert to an alternate,” “attempted operation beyond experience/ability level,” “continued flight into known adverse weather,” and so forth.
5. Hardware factors include the design of equipment, displays, controls, software, and the interface with humans in the system.
6. Task factors include the nature of the task being performed (vigilance and inspection tasks versus assembly operations), workload (work intensity, multitasking, and/or time constraints), and level of training.
7. Environmental factors include noise, temperature, humidity, partial pressure of oxygen, vibration, and motion/acceleration.
It is important to note that the factors discussed above can act alone or in combination with two or more other factors to further degrade human performance in the occupational setting; that is, they can produce synergistic effects. Examples include an air traffic controller monitoring air traffic during extremely low traffic volume while on allergy medication, or a quality control inspector monitoring low-defect-rate products while on cold medication.
Fitness for Duty
Human performance is so important that the FAA has crafted a regulation dealing solely with the topic of fitness for duty. For many people, waking up not feeling well one morning still leaves a good chance of getting through the work day without putting anyone in danger. For aviation professionals, however, even the smallest disruption of balance in one's body could severely compromise the safety of hundreds of people. Recent accidents have highlighted the need for awareness campaigns and areas of research to improve fitness for duty. The FAA defines fitness for duty in FAR 117.5 as being "physiologically and mentally prepared and capable of performing assigned duties at the highest degree of safety." Unfortunately, there are gray areas in this definition, as it is hard to standardize and quantify what physiological and mental fitness means for each professional.
Although the presence of stress in one’s life has clear effects on performance, and alcohol can obviously severely impact performance, the fatigue element is less obvious and requires elaboration. Although it can be hard to pinpoint, fatigue is classified by the following:
1. Weariness from mental or bodily exertion
2. Decreased capacity or complete inability of an organism, an organ, or a part to function normally because of excessive stimulation or prolonged exertion
Fatigue can decrease short-term memory capacity, impair neurobehavioral performance, lead to more errors of commission and omission, and increase attentional failures. A study of medical residents looked at the impact of fatigue and found that the risk of making a fatigue-related mistake that harmed a patient soared by an incredible 700% when residents reported working five marathon shifts in a single month (Hallinan, 2009). Consider how the same findings could translate into breakdowns in safety for aviation professionals during long shifts. Some studies have shown sleep deprivation in pilots causing performance reductions equivalent to having a blood-alcohol level of 0.08% (Paul & Miller, 2007). In other words, being fatigued can degrade our performance as much as if we were drunk.
There are three types of fatigue: transient, cumulative, and circadian. Transient fatigue is brought on by sleep deprivation or extended hours awake. The second type, cumulative fatigue, results from repeated mild sleep restriction or from being awake for extended hours across a series of days. Last, circadian fatigue is the reduced performance during nighttime hours, particularly between 2 am and 6 am, for those who are used to being awake during the day and asleep at night. Contributing factors to all types of fatigue include personal sleep needs, sleeping opportunities, physical conditioning, diet, age, alcohol, stress, smoking, sleep disorders, mental distress, sleep apnea, and even heart disease. It is imperative to recognize the symptoms of fatigue because it is associated with accidents, reduced safety margins, and reduced operational efficiencies. In 1993 the NTSB started including fatigue as a probable cause of some accidents, starting with the report on the uncontrolled collision with terrain of American International Airways Flight 808, a Douglas DC-8 carrying cargo to Guantanamo Bay, Cuba.
Fatigue is a significant problem in aviation because of long shifts, circadian
disruptions, and the sometimes unpredictable work hours. Attempts to regulate rest are limited by conditions that are still not adequately addressed, such as the following:
1. Reporting for duty while being fatigued because stress from family or personal sources does not allow adequate sleep.
2. Commuting from home to the location where a trip will commence may require significant time awake and induce stress prior to actually commencing professional duties.
3. A crewmember may be provided an adequate rest period prior to a flight but may be unable to obtain quality rest due to ambient noise, such as when there is a party or a loud television in an adjoining hotel room, construction noise, or if the crewmember misuses allotted rest time for personal activities that are not conducive to rest, such as sightseeing or working a second job.
4. Traveling between time zones and not being able to be on a consistent sleep schedule.
Taking the above reasons into consideration, it is fortunate that there have not been more aviation accidents and incidents as a result of fatigue, particularly given that fatigue poses a threat not only to pilots but also to air traffic controllers, aviation maintenance technicians, and everyone else who can affect the safety value chain in commercial aviation.
The professional life of many people in aviation can best be described as shift work, a schedule that falls outside the traditional 9 am to 5 pm work day and that can include evening, night, morning, or rotating periods. Negative side effects of shift work include acute/chronic fatigue, sleep disturbance, disruption of circadian rhythm, impaired performance, cardiovascular problems, and family/social disruption. When operating while fatigued, controllers may commit errors such as providing less than the required separation between two or more aircraft or assigning closed runways to aircraft. Potential errors in maintenance include incorrect assembly of parts, improperly installed equipment, and overlooked items that need attention.
To combat fatigue, airlines have developed Fatigue Risk Management Plans (FRMPs). In 2010, the President of the United States signed Public Law 111-216, the Airline Safety and Federal Aviation Administration Extension Act of 2010, which focuses on improving aviation safety. Section 212(b) of the Act requires each air carrier conducting operations under Title 14 of the Code of Federal Regulations part 121 to develop, implement, and maintain an FRMP. Such plans consist of an air carrier's management strategy, including policies and procedures that reduce the risks of flight crewmember fatigue and improve flight crewmember alertness.
Further information on FRMP is contained in FAA Advisory Circular 120-103A, entitled “Fatigue Risk Management Systems for Aviation Safety.” These guidelines provide a source of reference for managers directly responsible for mitigating fatigue, detailed step-by-step guidelines on how to build, implement, and maintain an FRMP, and describe the elements of an FRMP that comply with industry good practice. A typical FRMP structure consists of nine elements, all of which the carriers must address in their own plan:
1. Senior Level Management Commitment to Reducing Fatigue and Improving Flight Crew Member Alertness
2. Scope and Fatigue Management Policies and Procedures
3. Current Flight Time and Duty Period Limitations
4. Rest Scheme Consistent with Limitations
5. Fatigue Reporting Policy
6. Education and Awareness Training Program
7. Fatigue Incident Reporting Process
8. System for Monitoring Flight Crew Fatigue
9. FRMP Evaluation Program
By using this system, fatigue becomes a part of the safety management system in the same way other aspects of health, environment, productivity, and safety are managed. Additionally, the FAA has a few recommendations to mitigate fatigue, which are as follows:
1. Shift rotation time should be no less than 10 hours.
2. Utilize modeling and scheduling tools to assist in mitigating fatigue-promoting schedules.
3. Educate schedulers and workforce about issues regarding shiftwork and fatigue.
4. Promote application of personal and operational counter-fatigue strategies.
The term communication usually includes all facets of information transfer. It is an essential part of teamwork, and language clarity is central to the communication process. Communication of various types, and verbal communication in particular, remains one of the weakest links in the modern aviation system. More than 70% of the reports to the Aviation Safety Reporting System involve some type of oral communication problem related to the operation of an aircraft. Technologies such as airport surface lights and data link communication have been available for years to circumvent some of the problems inherent in ATC associated with verbal information transfer. Sometimes, however, solutions bring unintended negative consequences. For example, one potential problem with ATC data link communication is the loss of the "party line" effect (hearing the instructions to other pilots), which removes an important source of information for building pilot SA about the ATC environment. That said, the party line is also a source of errors for pilots who act on instructions provided to other aircraft or who misunderstand instructions that differ from what they anticipated from listening to the party line. Switching ATC communication from hearing to vision, as is the case with reading data link communiqués, can also increase pilot workload under some conditions. Further human factors studies are needed to define the optimum uses of visual and voice communications. Miscommunication between aircrews and air traffic controllers has long been recognized as a leading type of human error. It has also been an area rich in potential for interventions.
Examples are the restricted or contrived lexicon (e.g., the phrase say again hails from military communications, where it was mandated to avoid confusing the words repeat and retreat); the phonetic alphabet ("alpha," "bravo," etc.); and stylized pronunciations (e.g., "niner" to prevent confusion of the spoken words nine and five). Adequate communication requires that the recipient receive, understand, and be able to act on the information gained. For example, radio communication is one of the few areas of aviation in which complete redundancy is not incorporated. Consequently, particular care is required to ensure that the recipient receives and fully understands a radio communication. There is more to communication than the use of clear, simple, and concise language. For instance, intelligent compliance with directions and instructions requires knowledge of why these are necessary in the first place. Trust and confidence are essential ingredients of good communication: experience has shown that the discovery of hazards through incident or hazard reporting is only effective if the person communicating the information is confident that no retribution will follow her or his reporting of a mistake. The horrific ground collision between two Boeing 747 aircraft in Tenerife in 1977 resulted in the greatest loss of life in an aviation accident and featured a key communication error in the accident sequence. Today, controllers restrict the word cleared to two circumstances, cleared to take off and cleared to land, although other uses of the word are not prohibited. In the past, a pilot might have been cleared to start engines, cleared to push back, or cleared to cross a runway. The recommendation would have the controller say, "Cross runway 27" and "Pushback approved," reserving the word cleared for its most flight-critical uses. The need for linguistic intervention never ends, as trouble can appear in unlikely places.
For example, pilots reading back altimeter settings often abbreviate by omitting the first digit from the number of inches of barometric pressure; 29.97 (inches of mercury) is read back as "niner niner seven." Since barometric settings are given in millibars in many parts of the world, varying above and below the sea level standard of 1013, the readback "niner niner seven" might reasonably but inaccurately be interpreted as 997 millibars. The obvious corrective-action strategy would be to require full readback of all four digits. A longer-range intervention and contribution to safety would be to adopt the English system of measurement, which is more common in aviation, eliminating meters, kilometers, and millibars once and for all. Whether English or metric units should be used in aviation is, of course, arguable and raises sensitive cultural issues. At this time the English system clearly prevails, since English is the ICAO-mandated international language of aviation, as stressed in an ICAO decree regarding language proficiency that took effect on January 1, 2008.
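The readback ambiguity is easy to show numerically. The sketch below is illustrative only; the helper name and formatting are invented, and the conversion factor used is 1 inHg ≈ 33.8639 hPa (millibars).

```python
# Illustrates how an abbreviated altimeter readback becomes ambiguous
# across unit systems. Helper names are invented for this sketch.

INHG_TO_HPA = 33.8639  # 1 inch of mercury in hectopascals (millibars)

def abbreviated_readback(setting_inhg):
    """Drop the leading digit, as in the spoken readback 'niner niner seven'."""
    digits = f"{setting_inhg:.2f}".replace(".", "")  # '29.97' -> '2997'
    return digits[1:]                                # -> '997'

setting = 29.97                           # inches of mercury
spoken = abbreviated_readback(setting)    # '997'
actual_hpa = setting * INHG_TO_HPA        # about 1014.9 hPa
misheard_hpa = float(spoken)              # 997.0 hPa if taken as millibars
# The same spoken digits name two pressures roughly 18 hPa apart; at about
# 27 ft of altitude per hPa near sea level, that is an altimeter error on
# the order of 500 ft.
```

Requiring the full four-digit readback removes the ambiguity, since "two niner niner seven" cannot be mistaken for a plausible millibar setting.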
Humans and Automation
It is exciting when we get a new cell phone, isn't it? Sure, it requires somewhat of a learning curve to master the new features, but once we determine how to use the new capabilities we feel so much more powerful with our new technology. That learning curve, however, can mean that you inadvertently delete your song list. That is a very minor mistake, and a recoverable one. Wrestling with the automated features on a rental car makes the situation more serious, often requiring renters to sit in the parking lot of the rental agency, engine running, while they work out how to operate the fancy gadgetry. Or worse, the driver may try to figure out the technology mid-drive, creating a distraction that could lead to an accident. Now imagine strapping into your brand-new Airbus 350 and undertaking the same learning curve with the flight deck automation. A simple mistake is no longer a laughing matter. After all, you would be in a high-consequence environment where a small error may not delete a song list but could result in the aircraft running off a taxiway or flying into terrain, killing hundreds of passengers. It is not that funny anymore, is it? On a much more serious note, the confusion caused by automation has surfaced time and time again in accidents. In 1992 an Airbus 320 operating as Air Inter Flight 148 crashed into terrain near Strasbourg, France, when the pilots misunderstood the descent mode selected in the auto flight system. Two years later, an Airbus 300 operating as China Airlines 140 stalled on approach to Nagoya, Japan, when the automation confused the pilots and an unintentional go-around was initiated during approach to landing. More recently, in 2013, a Boeing 777 operating as Asiana Flight 214 crashed on approach to San Francisco, partly because the crew did not understand how the aircraft's automation would behave in certain flight modes.
According to Michael Feary, a research psychologist for NASA, part of the problem was that the information provided in the flight computer displays was written in "engineer-speak" rather than "pilot-speak." How automation combines with human error is a fascinating subject. You could write an entire book on the topic; in fact, many very good books have already been written. As a minimum, we should become familiar with three terms: automation surprise, mode confusion, and GIGO (garbage in, garbage out). Automation surprise occurs when a person has a mental model of the expected performance of technology and then encounters something different. In some cases, we anticipate a response to the last selected mode but do not notice that the automation has reverted to a sub-mode and thus behaves differently than expected. Designers strive to make automation intuitive; however, operators may encounter functioning that they have not been taught and which is not announced by the automation control heads or associated displays. A study that surveyed 1,268 pilots revealed that 61% of them were still surprised by flight deck automation from time to time (BASI, 1999). For an example of automation surprise from everyday life, imagine that your laptop has a factory preset to check for updates every week and to restart if there are any. While you are writing a critical document that is due within the hour, a dialog box pops up announcing that the computer is going to install the updates and restart. You were away for a moment getting coffee and return to find the laptop rebooting, much to your surprise. If you are lucky you did not lose much of the document. Within automation surprise, one specific aspect is mode confusion, which describes how selecting a certain mode for automation operation may actually result in a different mode being engaged, creating a very confusing situation.
Mode confusion happens when we do not completely grasp the inner workings of automation or when we fail to monitor the automation as it operates, much as we would monitor a colleague to stay in the loop as to what is happening. One such situation is called an indirect mode change, in which the automation changes mode without an explicit command by the operator, a byproduct of delegating to the automation functions that were previously performed manually. This happens more frequently than one would expect. A research study of 1,268 pilots revealed that 73% of them reported having inadvertently selected a wrong automation mode (BASI, 1999). Another study found that most mode confusion events occurred during unusual uses of the automation, such as during aborted takeoffs or disengagement of automatic modes during approach to landing (Sarter & Woods, 1995). In 1995 the Federal Aviation Administration Aircraft Certification Directorate conducted a study of how modern glass cockpit automation correlates with aviation accidents stemming from automation confusion. The study raised concerns about pilot understanding, use, and modal awareness of automation, suggesting that when a pilot is not sufficiently engaged, awareness decreases. Another potential hazard that can result from mode confusion is altitude deviation. Using the ASRS database, NASA looked at 500 reports and found altitude deviation to be the most reported automation problem. To correct this problem, the researchers recommended better and redundant feedback loops to detect anomalies, as well as entirely new system designs. When mode confusion occurs, it can have significant consequences. How often are errors committed, though, and of what type? A University of Texas research team conducted Line Operations Safety Audits on 184 crews across three airlines between 1997 and 1998. They found that 68% of crews committed at least one error, the most frequent error being associated with automation.
Sixty-five percent of these automation errors were a failure to verify settings. Incorrect switch or mode selection accounted for 21%. To understand the types of mode confusion that occur, a NASA study used 30 flights aboard Boeing 757/767 aircraft to categorize them. The researchers divided mode confusion into two areas. The first was misidentification of an automation behavior when the actual behavior was different from what was expected. The second was when a pilot acted on the assumption that the machine was in one mode when it was actually in another. An example of mode confusion previously mentioned, Asiana 214, occurred in part because the pilot flying the Boeing 777 was in manual control of the aircraft yoke but mistakenly believed that the auto-throttle mode he had selected would control the airspeed. When the pilot noticed the dangerously slow speed on approach and realized that he was confused about the auto-throttle mode, he intervened by trying to take manual control of the aircraft instead of using the automation, but it was too late to prevent the accident. Another source of human error when using automation is GIGO, an acronym for garbage in, garbage out. The phrase refers to the fact that computers, operating through logical processes, will unquestioningly process erroneous, even senseless, data that are input into the system and still produce an output.
This output, though, is usually undesired and often nonsensical. GIGO highlights just how important the human remains in automated operations. Until we develop automation intelligent enough to sense that a mistaken input is probably being made and warn the operator of the suspected mistake, the GIGO phenomenon will continue to occur. We can see GIGO in the 2014 story of U.S. Airways Flight 1702, an Airbus 320 departing from Philadelphia to Fort Lauderdale. According to NTSB documents, prior to departing the gate for takeoff, one of the pilots entered the wrong runway into the flight computer. The captain noticed, prompting the other pilot to change the runway information but not the performance parameters needed for the new runway. At takeoff, as the aircraft accelerated through 80 knots of airspeed, the audio alert in the cockpit announced "retard, retard, retard." This was an automated audio signal prompting the pilots to reduce the thrust setting. Unfortunately, neither of the pilots knew what the warning meant during takeoff because it was associated with landings. Initially the captain decided to continue with the takeoff and address the discrepancy once airborne, but once in the air he realized that something was seriously amiss and decided to touch down again and stop the aircraft. This action resulted in the plane's tail striking the runway and the nose gear collapsing as it hit the ground; the aircraft skidded to a halt 2,000 feet later, stopping on the runway's left edge. Later, Airbus said that the "retard" audio annunciation had sounded because the automated system reverted to landing mode when the parameters needed for the runway were not changed. As we can see, a mistake in the input of data caused the automation to revert to an unexpected mode and produce a surprising response. It should be noted that, while this section has focused on the interaction of pilots with automation, everyone else in the safety value chain is also affected.
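The Flight 1702 sequence suggests how a consistency check at the point of data entry could interrupt a GIGO chain. The sketch below is hypothetical: the runway names, the performance table, and the tolerance are invented, and real flight-management systems validate takeoff data differently.

```python
# Hypothetical takeoff-data entry check: reject inputs that disagree with
# stored performance data instead of silently processing them (anti-GIGO).

RUNWAY_PERFORMANCE = {
    # runway: (length_ft, expected_v1_kts) -- invented values for illustration
    "27L": (10500, 138),
    "35": (6500, 152),
}

def takeoff_data(runway, entered_v1):
    """Return validated takeoff data, or raise if the inputs are inconsistent."""
    if runway not in RUNWAY_PERFORMANCE:
        raise ValueError(f"unknown runway {runway!r}")
    _, expected_v1 = RUNWAY_PERFORMANCE[runway]
    if abs(entered_v1 - expected_v1) > 5:  # arbitrary tolerance for the sketch
        raise ValueError(
            f"V1 {entered_v1} kt inconsistent with runway {runway} "
            f"(expected about {expected_v1} kt): recompute before takeoff"
        )
    return {"runway": runway, "v1": entered_v1}
```

With a check like this, changing the runway while leaving the old V1 in place would raise an error at the gate rather than surfacing as a confusing annunciation on the takeoff roll.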
Flight attendants must learn how to operate the increasingly complex systems used for entertainment, cooking, and communication. Aviation maintenance technicians increasingly rely on software systems for reference guidance and for tracking task accomplishment. Dispatchers often rely on several software packages whose features change with software upgrades, presenting numerous opportunities for confusion and impeding dispatcher SA of the numerous flights under their purview. Lastly, the element of distraction created by smartphones can be incessant and affects every single person in the safety value chain. This is a new source of distraction that has emerged over the past 10 years and which, some claim, is an epidemic. One of the authors of this chapter remembers flying as a pilot of an airliner and being very distracted, at the critical moment of rotation on takeoff, when the other pilot's cell phone rang loudly on the flight deck. How many times have you checked your smartphone while reading this chapter? How many times have you answered a text or checked social media while reading it? How have those actions affected your processing of the concepts in this chapter . . . the same concepts that could one day save your life? It gives you something to think about, doesn't it?