On January 25, 1990, Avianca Flight 52 from Bogota to New York – a Boeing 707 – ran out of fuel and crashed in Cove Neck, 20 miles from JFK Airport. Sixty-five of its 149 passengers were killed. Air traffic control was dumbfounded. Ten minutes earlier, controllers had been casually chatting to the aircraft’s first officer. He had mentioned that his plane was low on fuel, but he had given no indication of the dire trouble he knew they were in.
Why are we telling you this tragic, bizarre story? Because, as you’ll see, it can teach us a huge amount. In fact, aviation as a whole can provide breath-taking insights into leadership, teamwork and communication.
We’ve been talking to qualified pilot Matt Meyer, CEO of law firm Taylor Vinters. Our conversation was a revelation. Matt wowed us with his passion for flying but what grabbed us most were his stories of incredible heroism and devastating disaster. And each tale revealed critical lessons.
We’d always assumed most air accidents were down to technical malfunctions. Not necessarily so, Matt explained. Technical issues are often involved, but what usually seals the fate of a stricken aircraft is human behaviour: crew interaction, communication and decision making. It’s these examples of optimal and suboptimal cockpit behaviour that can teach us so much. Prepare for take-off…
Lesson 1: Decisive communication saves the day
The doomed Avianca flight imparts clear lessons – lessons that have changed the face of aviation safety. Avianca now has an excellent safety record. Matt takes up the story: “The aircraft had been routed around due to bad weather, but it was running short of fuel. They had tried to land but couldn’t because of strong winds, and so told air traffic control that they would go around and try again. They carried out a standard missed approach, but soon ran out of fuel and crashed.”
How on earth could this have happened?
Matt explains: “The cockpit team failed to clearly convey the level of emergency they were in. Instead of decisively declaring an emergency, they chose to continue as if nothing was wrong. Why? A cultural issue was probably the root cause. The crew came from a relatively non-confrontational cultural background, in contrast with JFK, which is known for its assertive air traffic control. The crew seemed unwilling to challenge controllers and so fell back on doing what they were told, complying without question. The outcome was terrible…and arguably avoidable.
“What’s fascinating is comparing the conversation between JFK’s air traffic control and the aircraft with the cockpit voice recording of the crew interaction,” says Matt. “To air traffic control, it just sounds like business as usual on-board. But the transcript of the cockpit conversation couldn’t be more different: the pilots were in a state of panic and confusion as the last of their fuel burned away. They were arguing with each other, but as soon as air traffic control came on, they sounded happy and relaxed.”
Matt contrasts the Avianca disaster with another incident. “Ten years later, an American Airlines plane approached JFK Airport in a similar predicament,” he says. “It had been routed around and was also low on fuel. As it approached JFK to land, strong winds meant the pilot was worried he might not get his aircraft down on his allotted runway first time. If that were to happen, he would be forced to fly around again – just like the Colombian plane. So, as he came in to land, he quickly and decisively declared an emergency, demanding in no uncertain terms that air traffic control clear all runways to prioritise his approach. If you listen to the recording, the pilot is absolutely clear. This clarity managed the emerging threat and ensured the safety of his passengers.”
These examples highlight that decisiveness and clear communication lead to positive outcomes, while hesitation and muddy communication can result in the exact opposite. “A characteristic of both good cockpit management and strong business leadership is the ability to make a decision swiftly when needed, communicate it and take full responsibility for it,” says Matt. “The real risk comes from not making a decision at all.”
Lesson 2: Authoritarian leadership leads to errors
On December 22, 1999, Korean Air Cargo Flight 8509 took off from London Stansted for Milan. Shortly after take-off, it crashed into Hatfield Forest. Tragically, all four crew were killed.
“Around 30 years ago, Korean Air had a bad spell,” says Matt. “They lost several aircraft to accidents within a decade. When they analysed what was happening, they discovered a problem with Korean Air’s cockpit culture. It was too authoritarian. Many of their pilots came from a military background, and Korea itself had a very hierarchical social structure. It is suggested that this meant that flight engineers and first officers were often unwilling to challenge the Captain when things were going wrong.”
“This Stansted crash is a case in point. In this accident, the flight engineer did actually challenge the Captain, telling him that certain instruments weren’t working. The defective component had been replaced the night before, but clearly the problem hadn’t been rectified. Yet the flight engineer was ignored. He repeated his concerns to no effect. The first officer, who had a set of correctly working instruments, also failed to speak up and tell the Captain what he should do, even though his role in the cockpit was ‘pilot monitoring’. It was a fatal hesitation. The flight engineer was correct, and they crashed shortly after take-off.”
After Korean Air recognised and accepted that an authoritarian culture may be stifling the voices of talented team members, the airline went on to become one of the safest in the world. “They focused all their efforts on building the right cockpit culture,” says Matt. “What has emerged is a flatter command structure where the captain’s role is all about being a responsible individual who marshals the skills of others. This new culture has led to an exemplary safety record.”
A non-hierarchical set-up is also behind Qantas’s excellent record, says Matt. “Qantas is such a safe airline because the Australian culture embraces challenge,” he says. “You’re not going to find an Australian first officer who won’t speak up if there’s a problem. There’s a culture of ‘polite challenge’ – the crew are encouraged to provide input to the Captain, who takes decisions after listening to their input.”
Matt believes that business leaders can learn plenty from these examples. “Korean Air and Qantas teach us that being a leader isn’t about grabbing the controls when something’s not right or dismissing people if they don’t agree with you. It’s about recognising you’re in a constant learning situation and should always draw on the skills of others. That’s healthy and we should all seek to emulate it.”
Lesson 3: Simulation training leads to better outcomes
On January 15, 2009, US Airways Flight 1549 took off from New York City’s LaGuardia Airport for Charlotte, North Carolina. As it climbed, it struck a flock of geese and lost all engine power. Unable to reach any airport, pilots Chesley Sullenberger and Jeffrey Skiles glided the plane and successfully ditched in the Hudson River. All 155 people aboard were rescued.
The “Miracle on the Hudson” is probably the most famous incident in modern aviation history. But what can we learn from it, and how can we apply it to business?
Matt says: “Sullenberger was an incredibly experienced commercial pilot. And like all pilots, he trained for engine failures and other emergencies in a simulator.” There’s no doubt that sim training contributed to the Miracle on the Hudson.
“In the legal profession – and others – simulation training does not exist. Why not? As a pilot you are constantly put in sims to help you learn and to check that you can cope with certain situations. You do this every six months to practise dealing with emergencies such as an engine failure, an aborted take-off or fire. We never simulate in professional services, so we only ever learn in a live environment. That’s a missed opportunity and a risk.”
He continues: “In aviation, decision making is made easier because you learn how to recognise a certain situation and react appropriately. Pilots practise and practise in the sim until they can do it from muscle memory. Business could do the same. If we did more planning and sim training for business scenarios, fewer professionals would get blindsided. As lawyers, we deal with people’s most significant financial and personal situations. We could look harder at creating a consistent approach to those scenarios.”
We’ll finish this lesson with an unmissable quote from Chesley Sullenberger: “One way of looking at [the Miracle on the Hudson] might be that for 42 years, I’ve been making small, regular deposits in this bank of experience, education and training. And on January 15, the balance was sufficient so that I could make a very large withdrawal.”
Lesson 4: Trust your instruments
Matt Meyer’s grandfather is 102 and holds two Distinguished Flying Crosses for service during World War II. The war hero’s message to his pilot grandson has always been: learn to trust your instruments.
Matt says: “During the Second World War, the RAF would, by necessity, put extremely inexperienced pilots up in the air and many got lost, crashed, or collided. Why? Because they hadn’t had the time to properly learn how to rely on their navigation instruments. My grandfather tells me that he only got through the war because he learned to rely on his instruments. That meant he knew how to get home and so avoided panicking and making fatal mistakes.”
Taylor Vinters’ CEO believes that this offers a clue for business: learn to trust data rather than relying on your gut instinct. “Performance data can give you a very clear picture of what’s going on,” he says. “We should trust it more. The tendency is to be more optimistic about what might happen than the data suggests, or to ignore it because you don’t like the answer. We should make honest assessments of the data and act accordingly.”
Lesson 5: But, don’t blindly rely on automation
On June 1, 2009, Air France Flight 447 took off from Rio de Janeiro for Paris. Four hours later it crashed into the Atlantic Ocean, tragically killing all 228 passengers and crew. The pilots had become disorientated after the aircraft’s airspeed indicators gave false readings.
Matt says: “As is normal on long flights, the captain was taking a break and a relatively junior pilot was flying during what should have been a straightforward leg. Many long-haul airlines employ lower-hours pilots to handle the cruise, when the aircraft is typically on autopilot. Here, the pitot tubes that measure air pressure – and from it the aircraft’s airspeed – froze over, resulting in incorrect readings. This caused the autopilot to disengage, leading to an unfamiliar, high-stress, confusing situation. The pilots thought: ‘The instruments say we’re losing altitude, so we need to point the nose up and increase the power.’ But doing this made the aircraft stall and lose altitude. They kept repeating this error until it was too late.”
Matt suggests this tragic accident provides a lesson about automation and the skills we need to interact with it. He says: “Aircraft automation is about reducing crew workload, increasing consistency, and building in good threat and error management. It has made a very positive contribution to safety. But from day one as a commercial pilot you’re taught to treat automation like a third cockpit colleague – you must monitor it, supervise it and train it. What you don’t do is rely blindly on it. The same is true in business. AI is not something professionals can use without thinking.”
Lesson 6: Constant feedback builds success
Each day, roughly 100,000 commercial flights take off and land safely around the world (source: Air Transport Action Group). Despite the tragic case studies cited above, that’s an incredible safety record. How has the industry achieved this? Matt believes it’s down to an emphasis on learning and improvement.
He says: “If you look at the way a commercial cockpit operates, the Captain of any aircraft – whether it’s EasyJet or British Airways – will brief all crew before every flight. He or she will talk to them about the flight, discuss what they’re going to do, visualise what the issues might be, ask for feedback on whether there are any passengers with disabilities, etc. More importantly, there will be a debrief after landing, too. That culture of constant briefing and debriefing is crucial for the development of safe flying systems.
“Imagine if we were to introduce that culture to professional services – how much faster and better would we all learn? In business we often don’t pause for thought, don’t reflect, and don’t analyse how we can improve our systems. If I was a client engaging with a professional services firm and they told me they had a culture of constant learning through regular briefing and debriefing, I’d be impressed.”
Lesson 7: A non-blame culture accelerates improvement
Matt’s final lesson is another cultural one. He says: “Aviation has developed a brilliant culture of self-reporting. Pilots and crew self-report and report on each other all the time – it’s considered normal and natural, and it’s confidential. All that information goes into improving safety systems, as does telemetry gathered by the aircraft itself. None of it is about liability or coming down heavily on pilots or crew. It’s about amassing more and more rich data about what works and what doesn’t work.
“In professional services and business in general, we talk about creating a culture of continuous improvement, of owning up to our own mistakes, but that clearly isn’t the reality for many. Why? Because there’s still a perception out there that we will be judged negatively as individuals. There’s not enough understanding that the aim is to improve the organisation.”
By looking closely at the aviation industry, we can learn a huge amount. Matt’s breath-taking stories help us to look outside our own bubble at a world where skilled leadership and improvements to workplace culture save lives, not pounds. As Matt says: “The aviation industry gets safer each year; not just because of technology but thanks to improvements in behaviour, culture and cockpit management.” The business world would do well to emulate this ethos…