Pilots are prone to making poor decisions while flying in bad weather because of irrational thinking habits, according to new research.
Three kinds of "cognitive bias" lead even the most experienced pilots to misjudge bad flying conditions in ways that could cause their plane to crash.
Like all humans, pilots tend to lean towards the first information presented - even if it is not the most authoritative, researchers told The Independent.
That impulse, and the urge to ignore negative advice, play an important role in whether pilots choose to risk tricky landings or persevere through clouds, the team of experts in New Zealand said.
The "anchoring bias", "confirmation bias" and "outcome bias" have all been identified as irrational psychological impulses that stop pilots turning back during a flight when in reality they should.
Andrew Gilbey, senior lecturer in aviation at Massey University, said these tendencies can be found in all kinds of professions - but the consequences in flying can be much more serious.
"Pretty much all the pilots we tested fell prey to these biases," Dr Gilbey said.
"And when they do they're likely to continue a flight into deteriorating weather conditions, when in reality they should be taking a diversion or turn back."
Controversy has long surrounded the cause of the Air New Zealand plane crash into Mount Erebus in Antarctica in 1979, in which 257 passengers and crew were killed, with both pilots and technology being blamed.
The study published in Applied Cognitive Psychology asked 754 mostly male pilots to assess the safety of flying situations.
It found that when pilots are initially told the weather seems good, they tend to rate the atmospheric conditions as better for flying, and when they hear the weather is bad, they then rate them less favourably - despite the conditions being the same in both cases.
This "anchoring effect" describes a human tendency to allow the very first piece of information heard to have an undue influence on how a situation is thought about afterwards, the authors said.
Meanwhile the "confirmation bias" sees pilots be as likely to give weight to positive information, such as "it seems safe to land", as negative information, such as "the visibility is very low", when making a decision.
This shows that people will rely on reassuring evidence as much as on discouraging evidence, when in fact in high-risk scenarios such as flying they should give more weight to the problematic information.
Finally, pilots are likely to assess their flying decision as the correct one if they are told the flight went "well" afterwards, and their decision as dangerous if told that it ended up "crashing".
This "outcome bias" shows that people judge their own decision on what happens afterwards, rather than on the information available at the time.
According to Dr Gilbey, no method has yet been found to prevent these poor thinking habits.
"Only a small minority are an exception to these rules," he said.
"We've tried several interventions, including telling people about these biases and what is going on, but it all has pretty much no effect at all.
"It's just a very human thing to do."
He added that these sorts of situations occurred rarely and were more likely in small than large passenger aircraft.