“The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.” – Daniel J. Boorstin
Avianca Flight 011 took off from Paris on its way to Spain’s Madrid-Barajas International Airport. It was a routine flight. Upon reaching Madrid airspace, the pilot asked for and received landing clearance. He then unwittingly made a wrong turn on his approach vector. As the aircraft dropped to 2,248 feet, the ground-proximity warning system began sounding. “Terrain! Pull up!” filled the flight deck. The pilot was sure they were on the right path and at an appropriate altitude for the descent to runway 33, so he ignored the warning. After all, the airport approach controller would surely inform them if they were off course. Moments later, the outer starboard engine struck the top of a hill at 163 mph, causing the right wing to dig deep into the ground and pitching the plane into a “cartwheel.” The airframe spun violently and disintegrated. All 19 crew members and 162 of the 173 passengers perished. In addition to the pilot’s failure to heed the warning system, investigators found that the airport had failed to inform the crew that radar service had been terminated and that controllers were no longer monitoring their aircraft.
This is a tragic story of the illusion of knowledge. So strong was the flight crew’s conviction in what they believed to be true that they dismissed the ground-proximity warning that could have saved their lives. We hear stories like this and blame the pilot for irresponsibly ignoring a clear warning signal. But how often are we guilty of the same thing? We all tend to elevate our own mental models to the point of silencing anything that contradicts them. Challenging our perceptions or established views is difficult and painful. We like to form simple models of the world, our work, and our systems to make our lives easier. With a model, you don’t need to think, you just act. Like muscle memory, we fall into the lowest cognitive energy state. That can be extremely helpful in allowing us to process vast amounts of information every single day, but unfortunately, those models can be wrong. And sometimes, those errors make all the difference in the world.
Last year, when we were able to travel, my family and I spent several days in London. We traveled around the city on the Underground. As you bounce among the Waterloo, Jubilee, and Bakerloo lines through the various cavernous stations, you often hear the public address system remind you to be observant and diligent about reporting anything unusual: “See it. Say it. Sorted.” The campaign raises awareness of the vital role we all play in keeping ourselves and others safe. That isn’t just appropriate for riders of the London Tube; it applies to all of us, including in our jobs.
Seek the truth and strive to see problems clearly. As engineers and scientists, we should always be in pursuit of evidence and truth. When data presents itself, like an early warning system, don’t dismiss it. Report it. Investigate it. Adapt to it. Be a warning signal to others. Make it safe for others to approach you with the truth. Do you feel safe calling out problems or issues to your leaders and others? Are you hesitant to report or respond to indicators that something may be wrong? Don’t be! Truth is gold. Leaders need that insight. Seeing clearly and embracing data, even when it is inconvenient or breaks our illusions, will help us all become better. I challenge you this week: look for the truth, honor the truth, and speak the truth.
Note: Photo from Creative Commons (link).