
Flight risk: what safety lessons can healthcare learn from aviation?

OT talks with University College London visiting professor and consultant neuropsychologist Narinder Kapur


It may start as an envelope through the letter box or an unexpected phone call. The notification of an investigation into professional conduct is a moment that any clinician dreads.

Alongside proceedings by regulatory bodies, the idea that a practitioner could end up in a courtroom for an error that occurred on the job is sobering.

In the UK, the number of healthcare practitioners facing criminal investigations over errors that occurred while working has increased significantly in recent decades.

A total of 11 doctors were prosecuted for gross negligence manslaughter in the eight years to 2013 – only one fewer than the total number of manslaughter cases brought against doctors in the nine decades between 1885 and 1974.

Prosecutions for manslaughter have not been confined to doctors, with other healthcare practitioners facing criminal sanction.

In July 2016, optometrist Honey Rose was convicted of gross negligence manslaughter. This conviction was overturned a year later at the Court of Appeal. The General Optical Council’s fitness to practise investigation is ongoing.

The case of senior paediatric trainee Dr Hadiza Bawa-Garba has caused disquiet among the medical community and achieved national prominence, including a BBC Panorama investigation.

Dr Bawa-Garba was convicted of gross negligence manslaughter in November 2015.

Following a lengthy legal battle, she was allowed to continue practising as a doctor after the Court of Appeal upheld a medical tribunal’s decision that erasure from the register would be ‘disproportionate’.

Both cases have raised questions about apportioning responsibility when things go wrong in healthcare and the best way to prevent mistakes reoccurring.

Professor Narinder Kapur

An eye on aviation

University College London visiting professor and consultant neuropsychologist Narinder Kapur is an expert adviser to the Confidential Reporting System for Surgery (CORESS), a scheme run by leading UK surgeons.

Last year, Professor Kapur was honoured with a lifetime achievement award from The British Psychological Society.

The CORESS scheme emulates a similar system that was established for the aviation industry, allowing surgeons to share lessons learnt from adverse events in surgical practice.

Professor Kapur explained that, following a large number of aviation disasters in the 1960s and 1970s, the aviation industry put a series of measures in place that improved safety.

Although the number of worldwide flight hours has doubled over the past two decades, the number of fatalities has fallen from around 450 to 250 per year.

“There was this feeling that things were bad in aviation and then they improved dramatically,” Professor Kapur highlighted.

However, within healthcare it is estimated that there are 200,000 preventable patient deaths in the US alone.

“That is equivalent to three fatal airline crashes every day,” Professor Kapur emphasised.
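That comparison assumes the 200,000 figure is an annual one: spread across 365 days, it works out at roughly 550 deaths a day, close to the combined passenger load of three large airliners.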

The importance of a blame-free culture is that the focus is on the systems – what went wrong and how can we prevent this happening again in the future – rather than automatically asking ‘Who is to blame?’

Professor Narinder Kapur

A turning point

Asked what prompted the turnaround within aviation, Professor Kapur pointed to two key factors that helped to boost safety.

Firstly, increasing levels of automation and more safeguards within aircraft reduced the potential for human error to cause a crash.

Secondly, airlines put more resources into managing the impact of ‘human factors’.

Emphasis was placed on training staff in human factors, stringent adherence to checklists, good communication between crew members, flattening hierarchies, and encouraging people to speak up if they saw something that was not right.

Professor Kapur said: “If you look at most airlines, they have had human factors specialists and psychologists working for them for years. If you go into most hospitals and say you want to speak to their patient safety psychologist, they will reply ‘Who is that?’.”

A recent UK rapid policy review has recognised the importance of human factors in its recommendations on how to improve the handling of gross negligence manslaughter cases in healthcare.


Close calls

Professor Kapur also highlighted the different organisational cultures that operate within healthcare and aviation.

The two industries tend to react very differently when something goes wrong, Professor Kapur shared.

“I think the concept of a near-miss is something that is accepted more in aviation than in healthcare. Whenever there is a near-miss in aviation, the aviation personnel are encouraged to hold their hands up and report this in an objective, factual way,” he observed.

However, this is not the case in healthcare, where Professor Kapur believes only a relatively small proportion of safety incidents and near misses are reported.

“The number of near misses that is reported in healthcare is very small, although they must happen all the time,” he observed.

“In a year’s worth of work, if a person is in clinical practice, they will invariably make a mistake or have a near-miss here and there. A clinician might forget to carry out a certain procedure or make some communication error,” Professor Kapur said.

Professor Kapur emphasised that this focus on reporting potential safety incidents is present from the start of airline personnel training.

“It is drummed into them from the very beginning. They are encouraged to put their hands up and are told they won’t suffer as a result,” Professor Kapur said.

Professor Kapur shared that he does not think the same degree of rigour is applied within clinical training, or in inductions when staff join a new healthcare employer.

“I think the ethos and the atmosphere in aviation is much more geared to speaking up. Near misses are more frequent than actual disasters but they are just as important. Often exactly the same circumstances are happening,” he said.

Important decisions should not be based in fear

Professor Narinder Kapur

The fear factor

While Professor Kapur believes that individuals should be accountable for what they do, he argues it is valuable to take a holistic approach when investigating a safety incident.

“The importance of a blame-free culture is that the focus is on the systems – what went wrong and how can we prevent this happening again in the future – rather than automatically asking ‘Who is to blame?’,” Professor Kapur highlighted.

Recent high-profile criminal cases, including the case of Dr Bawa-Garba, have contributed to a culture of defensive practice within the UK, he believes.

Professor Kapur observed that while practising defensively is fine if it simply means a clinician takes more precautions, it becomes problematic when a practitioner automatically refuses certain cases.

“Some surgeons, especially cardiac surgeons, are actually avoiding complex procedures. I have spoken to cardiac surgeons who say ‘If I get given a complex, high-risk case, I refuse to take it’,” he shared.

“That is terrible because a patient may have survived that operation and had a full life afterward,” Professor Kapur emphasised.

“Important decisions should not be based in fear,” he added.

Dr Ali Poostchi

Learning the right lessons

Dr Ali Poostchi works as a senior registrar in the ophthalmology department at Queen's Medical Centre in Nottingham.

In December last year, he was a co-author of a paper published in Eye highlighting a spike in neuroimaging requests at his hospital trust. The spike followed the conviction of optometrist Honey Rose for gross negligence manslaughter, but predated the conviction being overturned on appeal.

Dr Poostchi shared his concern that the profession could learn the wrong lessons from the case.

“We need to have robust systems in place to make sure that we are getting the basics right – seeing the right scan, on the right day, of the right patient – which didn’t appear to happen in this case,” he shared, noting that at trial Ms Rose had said she had not seen a retinal image showing obvious pathology.

Dr Poostchi emphasised: “Sending borderline cases could make it more likely that this situation will be repeated.”

He highlighted that a “flood” of referrals on a particular subject could result in delays in processing more urgent referrals.

He added that there is a risk that patients could be subjected to unnecessary investigations and procedures.

“If you identify something that is innocuous on the scan and you monitor or treat this you can cause harm to the patient,” he shared. 

Dr Poostchi added that the number of investigative aids available to optometrists has increased dramatically over the past decade, and many optometrists are using advanced imaging techniques, such as optical coherence tomography.

He highlighted the importance of making sure that these techniques are used safely and appropriately.

“You need to make sure that scans are reviewed and acted upon,” he said.

Like Professor Kapur, Dr Poostchi believes that a focus on systems rather than individuals is the best approach for ensuring that similar mistakes do not occur in the future.

“People will always make mistakes. They need systems in place to help mitigate those errors and provide safety nets, particularly in current optometry practice where you can have multiple individuals seeing a patient and performing investigations,” he observed.

Dr Poostchi is also worried about a shift to defensive practice within the UK following high-profile criminal cases.

“Focusing on the individual makes people nervous and it makes them less likely to admit their mistakes,” he shared.

“People often think that by practising defensively they will minimise their risk of litigation but I don’t think that is correct. I think it is damaging to the system and it is damaging to the patient,” Dr Poostchi concluded. 
