With the crash of an Army Aviation Cheetah helicopter near Bomdila in Arunachal Pradesh on Thursday, killing two pilots, aviation safety and aircraft accidents are back in focus.
by Air Marshal GS Bedi (Retd)
The Federal Aviation Administration, the National Transportation Safety Board, and the National Commission on Military Aviation Safety of the United States record human error as the leading cause of aircraft accidents, implicated in over 88 per cent of cases. This assertion is supported by aviation accident statistics collected over the past five decades, with only minor variance in the percentage. While errors can occur anywhere, including in design, manufacturing, and maintenance, pilot error is the most common factor in accidents and thus requires greater attention. Poor in-flight decision-making was a contributing factor in 62 per cent of all accidents worldwide between 2012 and 2021, according to the International Air Transport Association (IATA) Safety Report of 2022.
In the event of an error, it is necessary to determine not only what went wrong but also why it happened. In an aircraft accident investigation, the Human Factors Analysis and Classification System (HFACS) framework follows this methodology and attempts to identify the various factors at play, such as direct and contributory reasons, and individual and organisational failures. Every investigation identifies the errors, records the lessons learned, and makes a plethora of recommendations to prevent a recurrence, which are widely disseminated throughout the aviation community. Nevertheless, human error continues to occur with painful regularity.
Behavioural Features
Pilots, of course, do not intentionally make poor decisions; they are unaware that they are doing so. It is simpler to navigate through black-and-white circumstances, but it is the grey, uncertain situations that pose difficulty. Understanding behaviour is a complete science in itself. Here are some key features:
‘Anchoring bias’ is a cognitive inclination in which a person over-relies on the first piece of information received. New information is not viewed objectively because it is interpreted in relation to an ‘anchor’: prior experience, compulsion, urgency, or simply a sense of commitment. This can cloud one’s judgement and make it difficult to update plans or forecasts as frequently as required. When one is committed to a course of action, one tends to filter all new information through an already established framework, distorting perception. Cognitive dissonance compounds the problem: new information is rejected because it causes discomfort.
Consider the September 2001 crash of a Beechcraft King Air C90 in Uttar Pradesh, which killed Congress leader Madhavrao Scindia, and the September 2009 crash of a Bell 430 helicopter in Andhra Pradesh, which killed then-Chief Minister Y.S. Rajasekhara Reddy. The investigation reports said that bad weather played a role in both cases.
Another example is the January 2020 helicopter crash in California, in which American basketball player Kobe Bryant died. According to the National Transportation Safety Board (NTSB), the Sikorsky S-76B he was travelling in crashed into a hillside in foggy conditions.
Why didn’t the pilots in all these cases cancel the mission due to deteriorating weather? Anchoring bias — caused by the commitment and cognitive dissonance of having to deny influential passengers a flight — most likely resulted in inaccurate weather interpretation and poor decision-making.
Flip Side To Optimism
‘Optimism bias’ is the tendency to overestimate the likelihood of a positive outcome and underestimate the likelihood of a negative outcome. Without a doubt, optimism is a necessity — it encourages perseverance and inspires confidence in one’s own abilities. Nonetheless, it is crucial to understand how optimism can blind one to negative outcomes and lead to poor decision-making.
On 7 August 2020, a Boeing 737-800 operated by Air India Express crash-landed at Kozhikode airport in Kerala, killing both pilots and 19 passengers. The probe by the Aircraft Accident Investigation Bureau (AAIB) revealed that the pilot continued with an unstabilised approach to touch down halfway up the runway in light rain and tailwind conditions. The pilot in command was experienced and had previously operated from that airport. Although the quality of the approach demanded a clear go-around decision, optimism bias led him to hope for a successful landing.
The Dunning-Kruger effect occurs when a person overestimates their own competence for a variety of reasons, leading to complacency and disregard for established safety procedures, which can result in an accident. According to the IATA Safety Report 2022, non-compliance with standard operating procedures (SOPs) was a contributing factor in 26 per cent of accidents in 2022.
The preliminary investigation report on the ATR 72-212A operated by Yeti Airlines Pvt Ltd, which crashed near Pokhara Airport in Nepal in January 2023 and killed all 72 people on board, revealed that the pilot monitoring (the pilot not flying), the senior of the two captains in the cockpit, feathered both propellers instead of lowering the flaps. With the propellers producing no thrust, the aircraft crashed short of the runway. When the error occurred, the aircraft was 700 feet above the ground, and the pilots had sufficient time to recognise it, but they did not. The master warning in the cockpit, which indicated a problem, was also turned off. An experienced pilot, confident in his ability to operate levers and switches, clearly ignored the SOP.
Aviation Growth And Safety In India
According to Union Minister of Civil Aviation Jyotiraditya Scindia, India will have around 1,200 planes and fly close to 400 million passengers (both domestic and international) by 2027. He has forecast massive growth and argues that the country is on track to meet these lofty targets.
If the pace of development is to be maintained, safe operations will be critical. Improving India’s aviation safety record will require concerted effort to identify where the gaps lie. The Directorate General of Civil Aviation (DGCA) and the AAIB have conducted excellent investigations into air accidents and recorded them meticulously, but unlike the IATA or the United States’ NTSB, they have not conducted a countrywide study to generate statistical data showing where the problem lies. An investigation into each accident reveals the problems in that occurrence but may not reveal the pattern. A comprehensive and thorough study on ‘safety in the Indian skies’ by the Ministry of Civil Aviation would be a welcome step.
The pilot is the final link in the chain between an aircraft and an accident. Good training and sound decision-making can significantly improve aviation safety. Simply listing mistakes and lessons learned, adding procedures, or blaming pilots does not appear to produce the desired results. There is most likely a need to understand the underlying psychological conditions that lead to errors. It is believed that, in addition to the established curriculum, training in behavioural sciences will equip pilots with objective thinking, leading to sound decision-making in the cockpit. “Ignorance more frequently begets confidence than does knowledge,” wrote Charles Darwin.
Air Marshal GS Bedi (Retd), a fighter pilot with 3,700 hours of accident-free flying on fighter planes in the Indian Air Force, is a former Director General (Inspection and Safety), IAF. Views are personal.
