Accident Analysis
Aviation Accident
Beyond the Colgan Air accident, another aviation accident rooted in human factors was the crash of Air France Flight 447 in 2009. The aircraft, an Airbus A330-200, crashed into the Atlantic Ocean en route from Rio de Janeiro to Paris with 228 passengers and crew on board (Kharoufah et al., 2018). A preliminary report by the French aviation authority found that the accident resulted from a combination of human factors and faulty equipment, including crew error (O’Brien, 2019). The pilots became confused by erroneous cockpit readings and failed to respond appropriately; as a result, they lost control of the aircraft, and it crashed into the ocean.
Accident Events
Around three hours into the flight, as the aircraft crossed the Atlantic roughly halfway between Brazil and Africa, it transmitted a series of automated messages indicating problems with several onboard systems. The final messages indicated that the autopilot had disengaged and that the aircraft was in a stall condition. The aircraft crashed into the mid-Atlantic, killing everyone on board. The official report from the French aviation authorities determined that the crash was caused by a series of pilot errors (O’Brien, 2019). The pilots were flying through a storm and failed to adjust the aircraft’s speed and altitude appropriately for the conditions, causing the wing to lose lift and stall; the aircraft then descended into the ocean. Poor communication between the pilots also played a role in the crash, as they did not adequately discuss their strategy for flying through the storm.
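The link between the mishandled airspeed and the stall can be made explicit with the standard lift equation; this is a textbook relation added here only for clarity, not a figure taken from the accident report. Lift depends on the square of airspeed, so as speed decays the crew must raise the lift coefficient (by raising the angle of attack) to hold altitude, and once the critical angle of attack is exceeded the wing stalls regardless of engine power:

\[ L = \tfrac{1}{2}\,\rho\,V^{2}\,S\,C_{L} \]

where \(L\) is lift, \(\rho\) is air density, \(V\) is true airspeed, \(S\) is wing area, and \(C_{L}\) is the lift coefficient.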
Plane crashes are often the result of a series of small errors that accumulate until they bring the aircraft down. According to Davidson and Brennan (2019), poor communication between pilots is one such error that can easily lead to a crash. When communication in the cockpit breaks down, it can lead to confusion and chaos (Material One, n.d.). For pilots to fly a plane safely, it is essential that they communicate effectively with one another. Subramani et al. (2018) likewise confirmed that plane crashes can often be traced to poor communication between pilots. One example occurred in 1977, when two Boeing 747s collided on a runway in Tenerife.
The collision was ultimately attributed to a misunderstanding between the flight crew and air traffic control, compounded by non-standard phraseology and language difficulties; 583 people died in the accident (Subramani et al., 2018). Pilots need to communicate effectively with one another to maintain a shared understanding of what is happening on board the aircraft. If something is misunderstood or not communicated correctly, it can lead to an accident. That is why it is so important for pilots to speak a common language fluently and to use clear, unambiguous phraseology.
Several factors directly caused the crash of Air France Flight 447, while others contributed indirectly. Among the indirect causes were the pilots’ failure to follow their checklists, their inadequate understanding of the aircraft’s automation, and their lack of experience in using that automation (Peysakhovich et al., 2018). In general, a failure to follow checklists can lead to a crash: checklists are designed as a comprehensive, systematic guide for pilots to follow during every phase of flight, and skipping them means critical steps can be missed (Material One, n.d.). In the wake of the tragic Germanwings crash in 2015, investigators found that the co-pilot had deliberately crashed the plane after locking the captain out of the cockpit (Pasha and Stokes, 2018). Pasha and Stokes (2018) also noted that the co-pilot had not followed the proper procedures for starting the descent and had flown the aircraft into a mountain range; adherence to prescribed procedures and checklists is one of the safeguards intended to prevent such departures from normal operations.
Other contributing factors to the Air France Flight 447 crash included the pilots’ fatigue and the difficulty of handling unexpected situations under high workload. As Yiu et al. (2022) observed, fatigue impairs pilots’ cognitive performance, decision-making, and situational awareness, making unexpected, high-workload situations harder to manage. Fatigue has been identified as a major factor in aviation accidents, and among pilots it can be attributed to long work hours, circadian rhythm disruption, monotonous work, and sleep loss (Material Three, n.d.). To mitigate its effects, the Federal Aviation Administration (FAA) has put in place regulations limiting the number of hours a pilot can fly in a given day or week, and it similarly requires that pilots have a minimum amount of rest before flying again. Yılmaz et al. (2022) added that fatigue among pilots has many potential causes, including long periods of travel, lack of sleep, and exposure to high levels of noise and radiation.
To combat fatigue, many airlines have implemented fatigue management programs that include mandatory breaks and rest periods, education on the risks of fatigue, and screening for its signs. Özel (2021) similarly emphasized several ways of reducing fatigue: ensuring that pilots get a good night’s sleep before a flight, providing plenty of breaks during flights, making pilots aware of the dangers of fatigue, and discouraging long working hours. Finally, aircraft design can also reduce pilot fatigue through more comfortable seats and better lighting.
SHEL Model Analysis
The SHEL model is a framework used to analyze the human factors behind aviation accidents and to help prevent them. The acronym stands for Software, Hardware, Environment, and Liveware. Each of these aspects matters for accident prevention, and the model helps identify potential problems so they can be addressed before an accident occurs. In the SHEL model, software refers to the non-physical elements of the system, such as procedures, checklists, and the programs and code that run the aircraft’s systems (Material One, n.d.); errors here can contribute to a crash. Hardware refers to the physical parts of the aircraft, from the engines to the landing gear, and the failure of any of these components can lead to an accident. The environment includes factors such as weather conditions and air traffic congestion.
The SHEL model is thus a useful tool for analyzing the relationships, interactions, and potential mismatches between humans and the other components of the system. Software, hardware, environment, and liveware can all be seen as elements that must be in balance for an individual or system to function optimally. In aviation, the liveware component, and the second “L” in the extended SHELL model, refers to the people who operate and maintain the aircraft and the interactions between them (Croft et al., 2017). This category is often overlooked, but it is a critical part of aviation safety: for an aircraft to be safe, it is not enough that its systems function correctly; the people operating and maintaining those systems must also be capable of doing their jobs safely.
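To make the model concrete, the sketch below is a minimal, illustrative way of organizing a SHEL(L) analysis as a data structure. It is not an operational tool, and the example entries are hypothetical placeholders rather than findings from the AF447 investigation:

```python
from dataclasses import dataclass, field

# Toy representation of the SHEL(L) categories described above, used to
# record potential mismatches at each liveware interface. The example
# entries are hypothetical, not findings from the AF447 report.

@dataclass
class ShellAnalysis:
    software: list[str] = field(default_factory=list)     # procedures, checklists, programs
    hardware: list[str] = field(default_factory=list)     # equipment, controls, displays
    environment: list[str] = field(default_factory=list)  # weather, traffic, workload context
    liveware: list[str] = field(default_factory=list)     # crew, ATC, maintenance personnel
    mismatches: dict[str, list[str]] = field(default_factory=dict)

    def record_mismatch(self, interface: str, issue: str) -> None:
        """Log a mismatch at a liveware interface, e.g. 'L-S', 'L-H', or 'L-L'."""
        self.mismatches.setdefault(interface, []).append(issue)


analysis = ShellAnalysis(
    software=["stall-recovery procedure", "unreliable-airspeed checklist"],
    hardware=["pitot probes", "stall-warning system"],
    environment=["night", "convective weather over the Atlantic"],
    liveware=["captain", "two first officers"],
)
analysis.record_mismatch("L-H", "airspeed display inconsistent with aircraft state")
analysis.record_mismatch("L-L", "unclear task sharing between the two pilots")
print(analysis.mismatches)
```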
Several factors contributed to the Air France Flight 447 crash. One of the primary causes identified was a problem with the aircraft’s software, which contributed to a rapid loss of altitude (Oliver et al., 2017). Other software-related problems included an incorrect airspeed display, erroneous stall warnings, and an automated rudder system that went into override mode. There were also issues with the aircraft’s hardware, including its electrical system and cockpit sensors (Peysakhovich et al., 2018). These failures compounded the human error: the pilots, who had been flying in automatic mode, did not realize how quickly they were losing altitude.
The Air France Flight 447 crash was ultimately caused by a combination of factors, including pilot error, weather conditions, and malfunctioning equipment. The pilots could not regain control of the aircraft after it entered a stall in the turbulence of an intense thunderstorm (Filburn, 2020). The investigation revealed that the pilots had become confused by conflicting automation messages and were unable to diagnose and correct the situation in time. The crash is therefore a prime example of how human error can lead to disastrous consequences: the pilots became disoriented in the thunderstorm and flew the aircraft into the ocean (Filburn, 2020). Had they been better trained and more aware of the aircraft’s state, they might have been able to recover and land safely. The accident is a reminder that technology can only do so much; in the end, it is up to humans to use that technology correctly and to make sound decisions in difficult situations.
Policy Suggestions and Recommendations
The findings above show that the human errors leading to the Air France Flight 447 crash included poor communication, failure to follow checklists, and limited knowledge of the aircraft’s automation, compounded by high workload and fatigue among the pilots. To reduce the human factors behind such accidents, I would suggest several policies. Firstly, I would suggest introducing crew resource management (CRM) training for all pilots. Gross et al. (2019) found that CRM helps pilots communicate effectively during high-stress, emergency situations. The goal of CRM training would be to help pilots work together as a team to manage the aircraft and its passengers safely. By emphasizing clear communication and cooperation, CRM training can help pilots make better decisions during times of stress (Material Four, n.d.). In addition to CRM training, many airlines also require their pilots to undergo psychological screening.
Such screening helps identify any personality traits or mental health issues that might interfere with a pilot’s ability to fly safely, and by identifying and addressing potential hazards early on, airlines can help ensure the safety of their passengers and crew. Helmreich et al. (2017) added that CRM training teaches pilots how to communicate effectively with one another in order to make better decisions as a team. Pilots must be able to share information quickly and accurately and work together to resolve any conflicts that arise. CRM training helps pilots develop these skills so that they can maintain situational awareness and complete their flights safely.
Secondly, I would suggest a policy requiring pilots to use standard phraseology when communicating with air traffic control (ATC) and with each other. Standard phraseology matters for several reasons (Wu et al., 2019). First, it ensures that all pilots speak the same operational language, making communication between pilots and air traffic control more fluid and efficient (Material Five, n.d.). Second, it helps avoid confusion and misunderstanding between pilots and controllers. Finally, standard phraseology is mandated by regulation precisely because it underpins aviation safety.
Thirdly, I would suggest establishing restrictions on pilot duty time and rest requirements. Pilots should only be allowed to fly a certain number of hours per day and must receive a minimum amount of rest before their next flight in order to remain alert (Material Three, n.d.). Zaslona et al. (2018) support the idea that well-rested pilots communicate better with one another during flights; the authors showed that pilots who were given more rest could maintain their focus and vigilance for longer periods. Marcus and Rosekind (2017) likewise found that pilot fatigue leads to reduced communication during flights and to more errors and accidents. To help mitigate these risks, the FAA has put in place regulations governing how long pilots can be on duty and how much rest they need between flights.
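The duty-time policy can be pictured as a simple rule check applied to a pilot’s recent schedule. The sketch below uses hypothetical limits (a maximum of 9 flight hours per duty period and a minimum of 10 hours’ rest); real flight- and duty-time regulations vary with report time, number of segments, and crew augmentation, so these numbers are placeholders, not regulatory values:

```python
from dataclasses import dataclass

# Hypothetical limits for illustration only; real flight- and duty-time rules
# vary with report time, segments flown, and crew augmentation.
MAX_FLIGHT_HOURS_PER_DUTY = 9.0
MIN_REST_HOURS_BEFORE_DUTY = 10.0

@dataclass
class DutyRecord:
    flight_hours: float        # hours flown in the duty period
    rest_hours_before: float   # rest received before the duty period

def is_schedule_legal(record: DutyRecord) -> bool:
    """Return True if the duty period satisfies the illustrative limits."""
    return (record.flight_hours <= MAX_FLIGHT_HOURS_PER_DUTY
            and record.rest_hours_before >= MIN_REST_HOURS_BEFORE_DUTY)

print(is_schedule_legal(DutyRecord(flight_hours=8.5, rest_hours_before=11)))  # True
print(is_schedule_legal(DutyRecord(flight_hours=8.5, rest_hours_before=7)))   # False: insufficient rest
```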
Fourthly, I would propose a policy requiring pilot training on handling specific emergency situations, including loss of control of the aircraft. Such training is important because it helps pilots stay safe while flying (Helmreich and Merritt, 2017): pilots who have been properly trained and who have practiced emergency scenarios are better equipped to handle dangerous situations when they arise, protecting both themselves and their passengers. Naor et al. (2020) also note that pilots must be able to react quickly and effectively to ensure the safety of everyone on board. Training helps pilots become familiar with the procedures for a particular emergency and allows them to practice those procedures until they can perform them instinctively (Material Six, n.d.). This kind of training can make a decisive difference to the outcome of an emergency.
The fifth suggestion is that air traffic controllers and pilots be assisted by technology that monitors the location and trajectory of aircraft in order to avoid collisions. Several technologies can help, including Automatic Dependent Surveillance-Broadcast (ADS-B), the Traffic Collision Avoidance System (TCAS), and the Ground Proximity Warning System (GPWS). ADS-B allows pilots to see other aircraft in the vicinity by providing real-time position information transmitted from other aircraft and ground stations (Wang et al., 2017), helping them avoid conflicts; it can also deliver weather information that supports better decision-making in flight. TCAS, in turn, warns pilots directly of potential mid-air collisions.
The system uses information from on-board transponders to determine the positions of nearby aircraft and then gives pilots audio and visual warnings to help them avoid potential collisions. TCAS is designed to work in both Visual Flight Rules (VFR) and Instrument Flight Rules (IFR) environments and has been credited with preventing many mid-air collisions (Corraro et al., 2022). If a collision is imminent, TCAS issues an audio warning and instructions on how to avoid it. Salgado and de Sousa (2021) confirm that TCAS helps pilots avoid collisions by monitoring the surrounding airspace and issuing both audible and visual warnings when it detects a potential conflict. TCAS provides two types of alert: traffic advisories (TAs), which warn the crew that another aircraft is nearby without directing a maneuver, and resolution advisories (RAs), which instruct the crew to climb or descend to resolve the conflict.
In practice, advisories are triggered primarily by the predicted time to the closest point of approach rather than by a fixed distance, with TAs issued earlier to raise the crew’s awareness and RAs issued later, when an avoidance maneuver is actually required. In addition, GPWS provides pilots with audible and visual warnings of an impending collision with the ground. The basic idea is that the system warns the crew whenever the aircraft approaches terrain or an obstacle too closely (Material One, n.d.), allowing them to take corrective action in time. GPWS uses several parameters to decide when a warning should be issued, including the aircraft’s height above the ground, its descent rate, and its deviation below the approach path. Because the basic system senses only the terrain directly beneath the aircraft, many modern installations (enhanced GPWS) also add a terrain database so that steeply rising ground ahead, such as a mountainside, can be detected before it is too late.
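The alerting ideas described above can be illustrated with a deliberately simplified sketch. Real TCAS logic uses altitude-dependent time thresholds and coordinated maneuvers, and real GPWS/EGPWS has several distinct modes; the thresholds and messages below are invented placeholders, not certified alerting criteria:

```python
# Simplified illustration of the alerting ideas described above. The numeric
# thresholds are placeholders chosen for the example, not certified values.

def tcas_advisory(time_to_closest_approach_s: float) -> str:
    """Issue a traffic advisory earlier and a resolution advisory later."""
    if time_to_closest_approach_s <= 25:
        return "RA: climb or descend as directed"
    if time_to_closest_approach_s <= 45:
        return "TA: traffic, traffic"
    return "clear"

def gpws_warning(radio_altitude_ft: float, descent_rate_fpm: float) -> str:
    """Warn when the aircraft is low and descending quickly."""
    if radio_altitude_ft < 1000 and descent_rate_fpm > 2500:
        return "PULL UP"
    if radio_altitude_ft < 2500 and descent_rate_fpm > 4000:
        return "SINK RATE"
    return "no warning"

print(tcas_advisory(40))        # traffic advisory: crew alerted, no maneuver yet
print(tcas_advisory(20))        # resolution advisory: vertical maneuver required
print(gpws_warning(800, 3000))  # low and descending fast: PULL UP
```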
The sixth suggestion would be to require all pilots to undergo extensive training on how to use autopilot features in order to reduce human error. By understanding how the autopilot works, pilots can use it to their advantage and avoid errors (Brown and Laurier, 2017). For example, if a plane is flying on autopilot and the pilot notices that it is drifting off course, they can take manual control and correct the deviation. Autopilot features can also help pilots manage challenging situations: if the aircraft encounters turbulence, the autopilot can keep it flying straight and level, freeing the pilot to deal with the situation rather than hand-flying the aircraft.
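The idea that the autopilot continuously corrects small deviations can be shown with a toy heading-hold loop. The proportional gain and the simplistic response model below are invented for illustration only; real autopilot control laws are far more sophisticated:

```python
# Toy proportional heading-hold loop: the "autopilot" nudges the aircraft
# toward the selected heading each step. Gain and response model are invented.

def heading_error(target_deg: float, current_deg: float) -> float:
    """Shortest signed angular difference, in degrees."""
    return (target_deg - current_deg + 180) % 360 - 180

def heading_hold(target_deg: float, current_deg: float,
                 steps: int = 10, gain: float = 0.3) -> float:
    for _ in range(steps):
        correction = gain * heading_error(target_deg, current_deg)
        current_deg = (current_deg + correction) % 360
    return current_deg

# The aircraft drifts to 095 deg while the selected heading is 090 deg;
# the loop steers it back toward the target.
print(round(heading_hold(target_deg=90.0, current_deg=95.0), 2))
```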
The seventh suggestion would focus on designing aircraft automation to take into account the multiple errors that could potentially occur, in order to avoid a crash. The design process may be complex, but the goal is to create a system that can respond effectively to any number of potential errors without putting the aircraft at risk. Furthermore, the system must be designed so that the flight crew can operate it easily, even in high-pressure situations (Material Seven, n.d.). Creating an effective and reliable aircraft automation system is a huge challenge, but it is essential for ensuring passenger safety.
I would also recommend that aircraft be maintained and inspected regularly and that pilots be well rested and abstain from alcohol before flying. The cockpit voice recorder (CVR) and flight data recorder (FDR) should likewise be maintained and inspected regularly, as these devices are crucial for determining the cause of any accident or incident during a flight (Li et al., 2020). The CVR records the conversations and sounds in the cockpit for accident investigation; it is usually found in larger aircraft, as smaller aircraft typically lack the wiring needed to support such a system. CVR recordings can help prevent future accidents by giving investigators insight into what went wrong (Li et al., 2020): if a pilot encounters an unexpected problem in flight, investigators can listen to the CVR to determine what was said and done in the cockpit at the time, and that information can then be used to help other pilots avoid similar mistakes.
Furthermore, the FDR records aircraft performance parameters throughout the flight. It is colloquially known as a black box, although it is actually painted bright orange to aid recovery, and it is located in the aircraft’s tail section, where it is most likely to survive a crash. The FDR records a variety of information, including airspeed, altitude, heading, engine power, and control surface positions (Li et al., 2020). This information helps investigators determine the cause of an aviation accident; for example, if the engines failed or a control surface was jammed in a particular position, the recorded data would reveal it.
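The parameters listed above can be pictured as one sample in a time series of flight-data records. The sketch below is a simplified illustration: real recorders log hundreds of parameters in a binary frame format, and the field names here are chosen for readability rather than taken from any recorder specification:

```python
from dataclasses import dataclass

# Simplified illustration of one flight-data sample; the fields mirror only
# the example parameters mentioned in the text.

@dataclass
class FlightDataSample:
    time_s: float          # seconds since start of recording
    airspeed_kt: float     # indicated airspeed, knots
    altitude_ft: float     # pressure altitude, feet
    heading_deg: float     # magnetic heading, degrees
    engine_n1_pct: float   # engine power setting (N1), percent
    elevator_deg: float    # control-surface position, degrees

sample = FlightDataSample(time_s=0.0, airspeed_kt=275.0, altitude_ft=35000.0,
                          heading_deg=35.0, engine_n1_pct=92.0, elevator_deg=-1.2)
print(sample)
```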
It is essential for pilots to be well-rested in order to make sound judgments while flying. Even a small amount of alcohol can impair a pilot’s judgment and reflexes, so it is best not to drink at all before flying (Material Two, n.d.). In addition, being well-rested allows pilots to focus better and stay alert during long flights. Fatigue can lead to errors in judgment and may even cause a plane crash. Therefore, it is vital that pilots get a good night’s sleep before taking off. One of the most common ways in which fatigue can lead to accidents during flights is by pilots becoming careless or distracted (Material Three, n.d.). Fatigue can also lead to pilot error, poor decision-making, and slow reaction times. In addition, fatigue can cause physical problems such as blurred vision, dizziness, and lightheadedness. These physical problems can make it difficult for pilots to control the aircraft and land safely (Material Three, n.d.). Fatigue can also reduce a pilot’s ability to think clearly and remember important safety procedures.
I would equally recommend that pilots maintain situational awareness and adhere to standard operating procedures (SOPs) during all phases of flight to reduce the risk of accidents. There are a number of SOPs that pilots must follow in order to avoid accidents (Material Seven, n.d.). Some of the most important include checking the aircraft before takeoff to ensure it is in proper working order, completing the pre-flight checklist, and ensuring the aircraft is properly loaded and balanced; a simple illustration of the checklist idea is sketched below. In flight, pilots should maintain a safe separation distance from other aircraft, monitor the cockpit instruments, respond appropriately to any warning or malfunction indications, and react quickly and safely to emergencies.
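As a minimal sketch of the checklist discipline discussed above, the example below confirms every item before departure. The items are generic placeholders, not taken from any real aircraft’s checklist:

```python
# Toy pre-flight checklist runner: every item must be explicitly confirmed
# before departure. The items are generic examples for illustration only.

PREFLIGHT_CHECKLIST = [
    "exterior inspection complete",
    "fuel quantity checked against flight plan",
    "weight and balance within limits",
    "flight controls free and correct",
    "altimeter set",
]

def run_checklist(confirmed_items: set[str]) -> list[str]:
    """Return the checklist items that have not yet been confirmed."""
    return [item for item in PREFLIGHT_CHECKLIST if item not in confirmed_items]

outstanding = run_checklist({"exterior inspection complete", "altimeter set"})
if outstanding:
    print("Not ready for departure; outstanding items:")
    for item in outstanding:
        print(" -", item)
```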
Reference List
Material Five (n.d.) HF Business: Communication.
Material Four (n.d.) HF Business: Stress.
Material One (n.d.) Human Factors Concepts.
Material Seven (n.d.) HF Business: Automation.
Material Six (n.d.) HF Business: Human Error, Theories, Reliability, and Error Management.
Material Three (n.d.) HF Business: Fatigue and Sleep.
Material Two (n.d.) Go Home Airplane: You are Drunk.
Brown, B. and Laurier, E. (2017) ‘The trouble with autopilots: Assisted and autonomous driving on the social road’, In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 416-429).
Corraro, G., Corraro, F., Ciniglio, U., Filippone, E., Peinecke, N. and Theunissen, E. (2022) ‘Implementation and real-time validation of a European remain well clear function for unmanned vehicles’, Aerospace, 9(10), p.531.
Croft, H., Nesbitt, K., Rasiah, R., Levett-Jones, T. and Gilligan, C. (2017) ‘Safe dispensing in community pharmacies: Applying the software, hardware, environment and liveware (SHELL) model’, Clinical Pharmacist, 9(7).
Davidson, M. and Brennan, P.A. (2019) ‘Leading article: What has an Airbus A380 Captain got to do with OMFS? Lessons from aviation to improve patient safety’, British Journal of Oral and Maxillofacial Surgery, 57(5), pp.407-411.
Filburn, T. (2020) ‘Flight system sensor failure’, in Commercial Aviation in the Jet Era and the Systems that Make It Possible, pp. 169-179.
Gross, B., Rusin, L., Kiesewetter, J., Zottmann, J.M., Fischer, M.R., Prückner, S. and Zech, A. (2019) ‘Crew resource management training in healthcare: A systematic review of intervention design, training conditions and evaluation’, BMJ Open, 9(2), p.e025247.
Helmreich, R.L. and Merritt, A.C. (2017) Culture at work in aviation and medicine: National, organizational and professional influences. Routledge.
Helmreich, R.L., Merritt, A.C. and Wilhelm, J.A. (2017) ‘The evolution of crew resource management training in commercial aviation’, in Human Error in Aviation, pp. 275-288.
Kharoufah, H., Murray, J., Baxter, G. and Wild, G. (2018) ‘A review of human factors causations in commercial air transport accidents and incidents: From to 2000–2016’, Progress in Aerospace Sciences, 99, pp.1-13.
Li, W.C., Braithwaite, G., Wang, T., Yung, M. and Kearney, P. (2020) ‘The benefits of integrated eye tracking with airborne image recorders in the flight deck: A rejected landing case study’, International Journal of Industrial Ergonomics, 78, p.102982.
Marcus, J.H. and Rosekind, M.R. (2017) ‘Fatigue in transportation: NTSB investigations and safety recommendations’, Injury Prevention, 23(4), pp.232-238.
Naor, M., Adler, N., Pinto, G.D. and Dumanis, A. (2020) ‘Psychological safety in aviation new product development teams: case study of 737 MAX airplane’, Sustainability, 12(21), p.8994.
O’Brien, J. (2019) ‘Mystery over the Atlantic: The tragic fate of Air France Flight 447’, The CASE Journal, 15(1), pp. 22–45.
Oliver, N., Calvard, T. and Potočnik, K. (2017) ‘Cognition, technology, and organizational limits: Lessons from the Air France 447 disaster’, Organization Science, 28(4), pp.729-743.
Özel, E. (2021) Factors contributing to the risk of airline pilot and cabin crew fatigue (Master’s thesis, İbn Haldun Üniversitesi, Lisansüstü Eğitim Enstitüsü).
Pasha, T. and Stokes, P.R. (2018) ‘Reflecting on the Germanwings disaster: A systematic review of depression and suicide in commercial airline pilots’, Frontiers in Psychiatry, 9, p.86.
Peysakhovich, V., Lefrançois, O., Dehais, F. and Causse, M. (2018) ‘The neuroergonomics of aircraft cockpits: the four stages of eye-tracking integration to enhance flight safety’, Safety, 4(1), p.8.
Salgado, M.L. and de Sousa, M.S. (2021) ‘Cybersecurity in aviation: The STPA-Sec method applied to the TCAS security’, in 2021 10th Latin-American Symposium on Dependable Computing (LADC), pp. 1-10.
Subramani, S., Garg, S., Singh, A.P. and Sinha, A.C. (2018) ‘Perioperative communication: challenges and opportunities for anesthesiologists’, Journal of Anaesthesiology, Clinical Pharmacology, 34(1), p.5.
Wang, Y., Xiao, G. and Dai, Z. (2017) ‘Integrated display and simulation for automatic dependent surveillance–broadcast and traffic collision avoidance system data fusion’. Sensors, 17(11), p.2611.
Wu, Q., Molesworth, B.R. and Estival, D. (2019) ‘An investigation into the factors that affect miscommunication between pilots and air traffic controllers in commercial aviation’, The International Journal of Aerospace Psychology, 29(1-2), pp.53-63.
Yılmaz, M.K., Erbudak, G. and Gündüz, S. (2022) ‘An exploration of the causes and effects of flight attendant fatigue in Turkish aviation’, International Journal of Research in Business and Social Science, 11(5), pp. 01–17.
Yiu, C.Y., Ng, K.K., Li, X., Zhang, X., Li, Q., Lam, H.S. and Chong, M.H. (2022) ‘Towards safe and collaborative aerodrome operations: Assessing shared situational awareness for adverse weather detection with EEG-enabled Bayesian neural networks’, Advanced Engineering Informatics, 53, p.101698.
Zaslona, J.L., O’Keeffe, K.M., Signal, T.L. and Gander, P.H. (2018) ‘Shared responsibility for managing fatigue: Hearing the pilots’, PLoS One, 13(5), p.e0195530.