I’m something of an aviation geek. So when I first heard that medicine had much to learn from aviation in order to improve patient safety, I was a ready recipient. And when I was subsequently trained by the MPS to deliver workshops on adverse outcomes for doctors here and in the UK, I became a strong believer in the potential for medicine to learn from aviation.
Much of what I taught made sense: how flattening the hierarchical structure in the cockpit through the introduction of crew resource management (CRM) made the old-fashioned ‘captain knows best’ an anachronism. Co-pilots are now encouraged to speak up and question, rather than sit silently while a bombastic captain confidently flies the plane into the side of a mountain. Medicine has become safer by adopting a team approach and applying checklists.
Another key learning point was for medicine to challenge its ‘blame and shame’ culture. Instead of hanging out to dry the poor anaesthesiologist who inadvertently hooked up the wrong tube, the aim is to develop a culture in which human error is expected. As part of this, medicine has had to move to a system that looks for the ultimate cause of error. Having identified it, the task is then to change how things are done in order to minimise the chance of repeat incidents.
A key part of improving aviation safety was the sector’s development of a system to openly report, share, analyse and learn from incidents and near misses. Crucially, however, aviation does this on a completely anonymised basis, something that has yet to be replicated in the practice of medicine. Aviation supports an open culture of learning and reporting in which staff who report concerns are protected.
Unfortunately, this is very different from the way internal reports or personal reflections have fed into recent manslaughter cases brought against doctors in the UK.
The British aviation investigation body functions independently of the system in which it operates, meaning that it can make recommendations to system leaders, national bodies and regulators without fear of censure. Aviation investigators are trained, full-time and independent of the organisations that they are investigating. They are not busy people expected to investigate while continuing with their regular day-to-day responsibilities.
The other aspect of comparing aviation and medicine that bothers me is the presumed equivalence of passengers and patients. A greater prevalence of older travellers notwithstanding, airline passengers are a significantly healthier lot than most hospital patients.
Writing in the BMJ last May, consultant geriatrician Prof David Oliver noted that acute healthcare deals with people whose condition is often already unstable, who are unwell, and who have life-limiting illnesses. “Many people die, deteriorate, or have complications, even when no serious failings occur in healthcare: It’s often just the natural history of disease. And no number of protocols or checklists can eliminate all common harms and incidents,” he says.
Prof Oliver points out another important difference: While airlines don’t have to schedule flights on routes that they no longer consider profitable, GPs and hospitals must deal with everyone who arrives. “We must continue to soak up relentless demand, no matter how bad the rota gaps are, how exhausted or demoralised the teams are, how short of beds or community support services we are, and no matter what key parts of logistics may be broken or what infection outbreaks the organisation may have.”
It’s the equivalent of a plane taking off with the co-pilot missing and two of the cabin crew absent. Which, let’s face it, puts a serious dent in safety comparisons between aviation and medicine.
Dr Torree McGowan, a US emergency physician and former military pilot, reckons trying to equate a machine to a complex biological system with free will is disingenuous at best.
Writing in Emergency Physician Monthly, she points out that, unlike pilots, doctors don’t get to pre-plan their missions:
“They calculate fuel, weight, distance, weather, a million variables. They are able to control for all of those things, and if things don’t look right, they stop the planning process and the mission is scrubbed. How often do we get to work to be told the CT scanner is down? Or that we’re out of saline? Would you ask a pilot to fly if the ground crew was out of hydraulic fluid for the plane? Of course not. That would be dangerous. We deal with supply shortages and maintenance failures every day and are expected to safely make do without,” she says.
There are obvious limits to what medicine can learn from aviation. But there is one element I would like to see put in place before we decide we have learnt all we can from aviation safety — we badly need a no-fault and firewalled incident analysis system as part of a robust open disclosure policy in the Irish health system.
Agree completely with Muiris. But I would have one inter-related addendum. I suggest we have something very important to learn from the airline industry, and it is related to his thesis. Pilots are restricted to (?) a maximum of 22 hours’ duty per week. Doctors (in my own personal experience) are expected to work, if the roster is unfilled, up to 22 hours per day, and the statutory inference is that it is their vocation. For six months as an NCHD I worked 104 hours/wk. Safe? Def not.
Great article Muiris.
The danger with the aviation vs medicine comparison occurs when it is made by non-medics or non-HFE psychologists. Jeremy Hunt et al. lecturing doctors on how to learn the lessons of aviation sticks in the craw when he (and other governments) fail to establish systems that facilitate that change.
For Human Factors to work, it needs management buy-in, top down. That involves a just culture, an anonymous electronic reporting system, stress- and fatigue-mitigating rosters (including part-time/job-share) and entertaining the concept that, outside of wilful non-compliance, all incidents are arguably system incidents.
Best regards, Ruth Little @Rxpeanutbutter