When surgery goes wrong: weighing up the risks

As a pilot, Martin Bromiley is trained to expect human error. When his wife died following routine surgery, he was stunned to find that NHS doctors are not. Jane Feinmann reports

When Elaine Bromiley’s sinus problems turned into an infection of the eye socket two Christmases ago, her surgeon recommended a routine operation. Straightening out the side of her nose and clearing the passages would prevent the trouble from spreading and potentially causing permanent damage to the optic nerve.

The small risks involved seemed worthwhile to the fit, healthy, 37-year-old mother of two. Early one morning in March 2005, her husband Martin and children Victoria, aged six, and Adam, five, accompanied Elaine to the hospital to prepare for surgery.

“We both felt the children would appreciate being involved,” recalls Martin Bromiley. They kissed goodbye before she was wheeled off to the operating theatre – and then drove home to wait for word that she was ready to see them all again. It never came.

Elaine did not regain consciousness after the operation and suffered brain damage that deteriorated over the remaining 13 days of her life. A scan was “like a TV covered in static, no shape, no texture, no colour”. She had expressed the wish not to live as a vegetable and life support was eventually withdrawn even though she was still breathing strongly.

At first, his wife’s death appeared to be a ghastly misfortune: the result of an unpredictable emergency where everything possible was done to save her. However, Bromiley’s own experience and insight into the tragedy led to two devastating conclusions: his wife’s death could have been avoided; worse, the circumstances in which she died are not unusual.

The first inkling that anything was wrong with his wife’s surgery came at 11am on the morning of the operation. Bromiley took a call from the ear, nose and throat surgeon informing him that she had not woken up properly. Her airway had collapsed shortly after she had been put to sleep, he was told – and an attempt “over a period of time” to get a tube into her lungs had failed because of an unexpected blockage. Now her oxygen levels were “very low”.

“Couldn’t you have cut into her throat?”, Bromiley recalls asking the ENT surgeon, delving into his minimal medical knowledge. The answer, that to do so would have been too dangerous, was comforting: if his wife had been in real danger, he reasoned, such risks would surely have been taken. It only gradually became clear that his suggestion, a tracheotomy, might well have saved her life. That it didn’t happen, he discovered, is a reflection, not of the failings of any individual, but rather the result of a deeply flawed system.

At first, Bromiley had no concerns about his wife’s care. Alongside his intense worry for his wife and children, he felt, if anything, comradeship with the operating team. In conversation with an intensive care (ICU) consultant, he said: “I’ve accepted it as just one of those things, that Elaine’s condition couldn’t have been predicted and that when the emergency occurred, the team did what they believed to be the right things. Unfortunately, it just didn’t work out.”

Bromiley made these assumptions because of his job. He is a pilot, working in an industry which, over the past decade, has achieved an excellent safety record. He assumed – wrongly – that surgery, another high-risk activity, had introduced the same reforms as aviation. The risk of dying on a scheduled aircraft flight today is one in 10 million. By contrast, Bromiley now knows, the risk of dying in hospital as a result of medical error is one in 300. That means hospital treatment is 33,000 times more dangerous than flying on a scheduled airline – a statistic that the chief medical officer, Sir Liam Donaldson, went out of his way to confirm recently.

This extraordinary difference is almost certainly explained by a single fact: British aviation today acknowledges that human error is normal, especially in fast-moving, high-risk situations – and that because safety cannot be taken for granted, it must be carefully designed.

Since 1995, a simple, supplementary, “non-technical skills” training package, known as human factor (HF) training, has been mandatory for every British pilot and crew member. Aimed at making safety the top priority, HF training involves building teamwork, communication skills and assertiveness. Simple measures such as a routine of briefing and debriefing raise safety levels dramatically – as well as creating security and confidence in a team that knows each member is working for the same purpose.

Last week, Bromiley, an expert in this training programme, joined senior health managers and leaders of medical royal colleges, including Bernard Ribeiro, president of the Royal College of Surgeons (RCS), on the platform of a day-long conference entitled: Everybody’s Business: Lessons from High-Risk Industries for Patient Safety, a milestone in a growing campaign to make human factor training compulsory for surgeons.

While Bromiley told the story of his wife’s death, Mr Ribeiro told the conference: “Professional failures are more often due to behavioural difficulties, personal conflict, lack of insight, systems failure or defective infrastructure than technical failings or lack of knowledge.”

He went on: “Accidents are rarely caused by a single individual. They are more often the result of a sequence of avoidable errors or organisational defects.”

The first signal that avoidable errors and organisational defects killed his wife surfaced during that same conversation with the ICU specialist.

Bromiley had said that the only positive he could hope for was that some lesson might be learnt as a result of the investigation that would take place. The reply, that no such investigation would take place unless he sued or made a complaint, was a genuine shock: and the seeds of his campaign took root.

“I was stunned to discover that there would be no automatic investigation,” he recalls. “As a professional, I couldn’t understand how the operating team, the hospital and the profession as a whole wouldn’t try to learn from this tragic incident.”

Yet an independent investigation did take place, commissioned by the director of the clinic where the operation took place. She told him: “We need to learn what happened.”

Carried out by Professor Michael Harmer, a former president of the Association of Anaesthetists, its findings were devastating: not least because the circumstances of Elaine Bromiley’s death were a textbook example of what happens when human error is not anticipated. It found there had been no shortage of knowledge, equipment or manpower – an ENT consultant, two consultant anaesthetists and four nurses were present – to manage the emergency that presented shortly after the patient was put to sleep.

When her airway collapsed, all three consultants attempted to intubate: a simple procedure that involves putting a tube into the airway. “But there was an obstruction; we still don’t know what it was. The consultants appear to have become fixated on intubation as the only option.

“Elaine died because of collective loss of awareness among the consultants – both an awareness of time and much more importantly an awareness of the seriousness of the situation,” says Bromiley. “At the inquest, the lead anaesthetist said that he had lost control and there was a dispute over exactly who was in charge and making the life-and-death decisions.”

What made the situation worse was that two of the four nurses admitted that they “knew exactly what needed to happen”: one brought tracheotomy equipment into the theatre but was not acknowledged; another booked an intensive care bed but was led to understand that she was overreacting, and so cancelled it. “Both of these nurses knew how to save Elaine’s life. But they didn’t know how to broach the subject with their bosses,” says Bromiley.

At the inquest in 2005, he dismissed a suggestion that a verdict of neglect on the lead anaesthetist would be appropriate. “This was not a case of one man being incompetent. The team made mistakes as a whole. It was a failure of the system.”

Since then, he has found support for his demands that HF training be made compulsory. He has supported an RCS initiative that has successfully tested human factor training for surgical teams in Oxford and London. He has also met with the chief medical officer and Sir Ian Kennedy, chairman of the Bristol Inquiry and the Healthcare Commission. “Senior health officials are well aware of the issues but are struggling to persuade senior clinicians and politicians of the problem,” he says.

Meanwhile he and his children face the second Christmas without their mother. “I am not angry with the surgical team. If anything, I feel sympathy for fellow professionals who have been involved in an incident that has resulted in a death without the training that is routine in aviation,” he says.

HF training and patient safety will become institutionalised, he says. “It will happen within the next 20 years. That’s too long to wait – with so many people at risk of suffering in the way our family has suffered. It must happen more quickly. I need to be able to show my children the lessons that have been learnt from their mother’s death.”



How to have safer surgery



When you are deciding whether to have an operation:

* What are the risks involved in this operation?

* What are the risks of not having this operation done?

* If it all goes well, what will the recovery be like and the end result?



Immediately before the operation, when the whole surgical team is present:

* What points in this operation present risk?

* If problems occur at that point, what options do you all have and what is your likely course of action?

* What is the worst-case outcome?

* As a team how will you support each other?



Finally, say to the whole team:

* If at any stage you see something you are unhappy with, tell the whole team and make sure you get a considered response from them. Remember that safety comes before productivity.
