“Why does someone with a good flying record act in a haphazard manner? This needs a holistic analysis if we are to prevent future occurrences. The pilot's error, if one does exist, should be our point of entry into an investigation, not our point of departure.”
These are the words I wrote to someone on Twitter with whom I was discussing the reasons for the tragedy of PIA Flight 8303, which crashed in Karachi on the 22nd of May. My interlocutor argued that it was pilot error; I argued that while most air crashes do indeed involve human error, that error should be the starting point for exploring systemic problems.
The deaths of 97 of the 99 people on board the PIA flight to Karachi, just before the Eid holidays, have once again raised serious questions about the state of commercial aviation in the country. While the Prime Minister has ordered an investigation, a preliminary review of the events leading up to the crash reveals a series of errors on the part of both the flight crew and air traffic control. However, the question remains: what were the underlying factors behind the decisions made that fateful Friday afternoon?
In 1984, Charles Perrow, a Yale University sociologist, wrote a book called Normal Accidents: Living with High-Risk Technologies. He argued that despite our technological advances, we will continue to face the risk of catastrophes. This, he said, is primarily because we have surrounded ourselves with high-risk systems whose parts are so tightly interconnected that accidents and disasters become inevitable - even normal.
But what does this mean? When a plane takes flight, it does so as part of a system: the security professionals you encounter at the airport entrance, the clerks at the check-in counter, the air traffic control personnel, even the baggage handlers are just as important to your safety as the pilot, the maintenance engineers and the onboard computer systems. When each part of the system is working well, and it does so most of the time, air travel rarely makes the headlines. In fact, it is a credit to how far we’ve come since the Wright Brothers first flew that an airport can receive thousands of flights a week without a single mention in the news. But as normal as the state of commercial aviation is, anywhere in the world, that same normalcy lends itself to catastrophic failures when the cogs in the system do not work as they are supposed to. The resulting domino effect produces a series of decisions and outputs that amounts to an unknowing and unwilling game of Russian roulette.
Yet the number of accidents per mile flown has fallen significantly worldwide - so much so that flying is considered far safer than driving. Nevertheless, the crash of PK 8303 has reignited concerns about the safety record of commercial aviation in Pakistan. This is the fifth major air catastrophe to hit the country in the last 15 years. That alone should serve as a wake-up call for the authorities.
The investigation report for the 2006 PIA Fokker crash came to light six years later, in 2012. The report on the 2012 Bhoja Air crash was made public three years later, in 2015. The report on the 2010 Air Blue crash was released relatively quickly, within six months. Meanwhile, the report on the 2016 PIA crash, which also resulted in the death of Junaid Jamshed among other passengers, has yet to be released.
So, it is no wonder that there is a lot of skepticism about what a potential report may reveal – if it ever does see the light of day.
The reports from previous crashes point to a combination of technical and human failures. They point to major capacity gaps. However what we do not know is the extent to which any of the findings have been used to improve the systems and, indeed, reduce the risk.
A holistic analysis of the human element remains integral to addressing this crisis. The human element reflects the air crew's interactions with each other, but also with the air traffic controllers. It also pertains to the ways in which people interact with machines. The so-called Standard Operating Procedures (SOPs) and aircraft manuals might not be so “standard” after all, as each crisis differs from the last. This has deep consequences for, among other things, the capacity building and training of pilots. The dissonance in this regard is what scholars have called “situated cognition,” whereby training platforms are bereft of the context in which skill sets must be deployed, resulting in understandable yet serious mistakes. In other words, real-life scenarios are hard to recreate in simulations - and as such, we need to rethink the way we prepare our air crews for such eventualities.
This was borne out during the now-famous landing in the Hudson River by Chesley “Sully” Sullenberger. In that instance, after losing both engines to a bird strike, he went against what might have been considered standard protocol and headed for the river instead of an airport. Every one of the 155 people on board the aircraft that cold January morning in 2009 walked away alive. Another pilot, following the protocols, might have responded differently and headed for an airport, with disastrous results - and given the nature of an emergency that cost the plane both engines, no one would have begrudged that decision.
But situated cognition can only be realized if the take-home lessons after each catastrophe are studied, analyzed and inculcated within the broader aviation industry.
The Pakistani government has established an inquiry commission to investigate the crash of PK 8303. There are questions about the make-up of the commission, which includes serving officers of the Pakistan Air Force. Since the Chairman of Pakistan International Airlines is himself a serving Air Marshal, there is the possibility of a conflict of interest vis-à-vis the investigation. Such perceptions might not stop the investigation, but they could certainly hamper the uptake of its findings. The government therefore needs to give serious consideration to a broad-based investigation team that looks not only at the technical aspects but also at the human dimensions inherent in the operation of an aircraft.
Furthermore, the government would do well to ensure that the investigation report on the 2016 crash is released to the public at the earliest. I fear it will reveal that we have collectively failed to absorb the findings of previous such investigations, thus rendering them impotent. An investigation should be a means to a holistic outlook on the state of commercial aviation in the country, rather than being treated as an end unto itself.
The purpose of any investigation is not to assign blame - although, in our case, that seems to be what the exercise is all about. Yet even if we do assign blame to a pilot or to human error, the situation can only be addressed once the factors that led to the error have been identified. It is those underlying factors that should be examined, and it is those underlying factors that should eventually be addressed through capacity building initiatives.
My discussion on Twitter went on for a while. The 280-character limit does not easily lend itself to an engaging discussion, but it does allow for thought-provoking interactions. The commercial aviation industry is a study in risk management. But it is also a study in contrasts. In effect, we as a society have decided that the benefits of air travel far outweigh the costs.
But when a system is as tightly coupled and as complex as commercial aviation, risks will remain. There is nevertheless a need to work collectively towards a mechanism that reduces the risks to the system - to the point where disasters are turned into near-misses, and near-misses are eliminated altogether. A timely, comprehensive and holistic investigation would be a step in the right direction.