Our politicians could learn a great deal from the aviation industry

Mistakes can be avoided, if failures are confronted and learned from

Matthew Syed
Wednesday 23 September 2015 18:49 BST
An air stewardess has died of malaria (Matthew Lloyd | Getty Images)

A report published this week found that the biggest complaint made against the NHS by patients is that mistakes are not met with a full and frank apology. It would seem that clinicians find it difficult to be open and honest about their errors – something that may seem insignificant but has huge implications, not just for healthcare but beyond. After all, mistakes are inevitable in a complex world. Whether we are doctors, politicians, scientists or students, we are going to get things wrong. But if we are not open and honest about these errors, if we do not confront them, how can we learn from them?

In her seminal book, After Harm, researcher Nancy Berlinger investigated how doctors talk about mistakes. It proved to be eye-opening. “Observing more senior physicians, students learn that their mentors and supervisors believe in, practice and reward the concealment of errors,” she writes. “They learn how to talk about unanticipated outcomes until a ‘mistake’ morphs into a ‘complication’. Above all, they learn not to tell the patient anything.”

This allergic attitude to error is partly about fear of litigation, but is also about the threat to ego. Think of the language associated with senior physicians. Surgeons, for example, work in a “theatre”. This is the stage where they “perform”. How dare they fluff their lines?

As James Reason, a safety expert, put it: “After a long education, you are expected to get it right. The consequence is that medical errors are marginalised and stigmatised.” This is why clinicians find it so difficult to offer full apologies, and why mistakes are concealed.

The data here is revealing. Epidemiological estimates of error in the US suggest that 44 to 66 avoidable injuries occur per 10,000 hospital visits. But how many hospitals report rates within this range? Only one per cent.

In the UK, it is estimated that only 15 per cent of incidents are reported. The closed culture is perhaps most fully revealed by the fact that it is not just clinicians covering up mistakes and substandard care, but regulators and, in the case of the Mid Staffordshire scandal, the Department of Health too.

But think of the consequences. If mistakes are not confronted and learnt from, the same mistakes will be made over and over again. This is why preventable medical error is one of the biggest killers in the West. A report by the UK National Audit Office in 2005 estimated that up to 34,000 people are killed each year due to human error. It put the overall number of patient incidents (fatal and non-fatal) at 974,000. In the US, 400,000 people die annually because of preventable error in hospitals alone. That is the equivalent of two jumbo jets crashing every day.

Compare this attitude to failure with the one adopted in the aviation industry. Every plane is equipped with two black boxes, which record vital information. If an accident occurs, these boxes are recovered, the failure analysed, and reforms implemented. This means that the same mistakes don’t happen twice. Pilots also voluntarily report thousands of “near miss” events, which are statistically analysed and the information used to avert accidents before they happen.

Which of these two approaches is adopted by politicians? Do they learn from the inevitable weaknesses in their policies and ideas, within a culture of learning and adaptation? Or do they spin their mistakes, cover them up, and deploy their intellectual energy towards justifying their prior assumptions? You already know the answer.

Virginia Mason, a hospital in Seattle, adopted the aviation attitude to error. Staff were encouraged to file reports if anything went wrong, such as accidentally prescribing the wrong medicine. That gave the hospital an opportunity to make changes, such as altering the labelling on drugs so that they could be easily identified even when doctors and nurses were under extreme pressure. Other changes were made in the light of mistakes, such as the redesign of surgical equipment, the use of checklists, and so on.

Virginia Mason has had an astonishing 74 per cent reduction in liability insurance premiums. It is now regarded as one of the safest hospitals in the world. That is the power of a culture that does what good scientists do: learn from mistakes. If only our politicians could be so bold.

Matthew Syed is the author of ‘Black Box Thinking: The surprising truth about success’
