International | Patient safety

Hospitals are learning from industry how to cut medical errors

Industry and behavioural science may both have things to teach doctors

COVENTRY

AFTER a brain aneurysm in 2004, Mary McClinton was admitted to Virginia Mason Medical Center in Seattle. Preparing for an x-ray, the 69-year-old was injected not, as she should have been, with a dye that highlights blood vessels, but with chlorhexidine, an antiseptic. Both are colourless liquids. The dye is harmless; the antiseptic proved lethal. After kidney failure, a stroke and two cardiac arrests, McClinton died 19 days later.

In response, Virginia Mason committed itself to improving safety. It used an unlikely model: the Toyota Production System (TPS), the Japanese carmaker’s “lean” manufacturing techniques. Nearly every part of the hospital, from radiology to recruitment, was analysed and standardised. Staff were trained to raise safety concerns. Today Virginia Mason prides itself on its safety record—and sells its take on Toyota to hospitals across the world.

Among its recent customers are five in England’s National Health Service (NHS), including University Hospitals Coventry & Warwickshire. On a recent Thursday morning the hospital’s patient-safety team began its daily meeting by reviewing errors reported overnight. In one case, a surgeon had perforated a patient’s bowel during a laparoscopy. In another, a patient’s chest drain, a tube used to remove air, fluid or pus from the thorax, was dislodged.

Since the team was set up a year ago, reporting of such incidents has risen from 35 per 1,000 bed-days in October 2015 to 57 per 1,000 in April 2018. After the meeting, the safety team apologises to the patients involved. It also debriefs the relevant staff and sometimes, as in the case of the botched chest drain, recommends changes to procedures.

Error messages

“To Err Is Human”, a study published in 2000 by America’s National Academies of Sciences, Engineering and Medicine, estimated that medical errors were to blame for up to 98,000 deaths a year in American hospitals, twice the number killed in road accidents. A study published in 2016 by researchers from Johns Hopkins medical school in Baltimore put the number much higher, at 250,000 deaths per year.

That is probably an exaggeration. But a study in 2017 by the OECD estimated that 10% of patients are harmed at some point during their stay in hospital. It also found that unintended or unnecessary harm in a medical setting is the 14th leading cause of ill health globally—a burden akin to malaria. At the annual meeting in May of the World Health Organisation (WHO), the UN’s public-health body, delegates discussed “global action” on patient safety.

So policymakers are trying many ways to improve safety. Much is standard fare—tweaks to regulations, changes to training and new kit less likely to cause infection. But Virginia Mason is not alone in looking outside medicine—not just to industry but also, for example, to behavioural science. There is a growing sense that, to make patients safer, hospitals need to simplify the ever more complex world of health care.

Efforts to reduce the harm medics do have a long history. In the 20th century, doctors began systematically to compare how patients are treated in different settings. Take James Alison Glover, a doctor, who noted that, by 1938, 83% of new boys at Eton, England’s poshest public school, had no tonsils (perhaps so the silver spoons could fit). Yet just 2% of Basque refugee children fleeing the Spanish civil war then raging had their tonsils out, and were no worse off for it. So Glover urged an end to widespread tonsillectomies, which, given the rate of surgical infections at the time, spared English teenagers a lot of suffering.

Even so, until the 1990s, notes Ashish Jha of Harvard University, harm done to patients was often blamed on individual doctors rather than on defective health-care systems. “To Err Is Human” changed that by showing that most cases of harm resulted from dysfunctional ways of working. A lack of good historical data makes it impossible to know whether medical errors have become more common. But Dr Jha suspects that the increasing complexity of health care means they are more prevalent than in the 1960s. Back then a paediatrician, say, would need to know at most a few dozen drugs. Today the figure is over a thousand.

Evidence from developing countries supports the idea that errors are the side-effects of better, if more complex, health care. A study in 2010 for the WHO found that rates of hospital infections were higher in poor countries. But, since fewer drugs were doled out, less harm was done by incorrect prescriptions and side-effects.

To improve their hospitals, rich countries have borrowed heavily from two industries: manufacturing and aviation. “Lean”, one of the most popular industrial-management theories, suggests that hospitals should study a patient’s “flow” through the building much as a car is tracked along the production line. That way bottlenecks and other inefficiencies can be spotted. Virginia Mason, for example, also has a policy of “stop the line”: any member of staff is encouraged to halt a procedure deemed unsafe. And it practises genchi genbutsu, or “go and see for yourself”, a standardised way for executives to visit wards and speak to staff about safety risks.

Virginia Mason claims that since 2001 it has become more profitable as it has reduced liability claims. Yet there is little evidence that introducing manufacturing-based management to other hospitals has made much difference. A literature review published in 2016 found that just 19 of 207 articles on the effects of “lean” methodologies were peer-reviewed and had quantifiable results. These found no link between lean methods and health outcomes. Mary Dixon-Woods of Cambridge University notes that evangelists for the use of manufacturing methods can be loth to submit to rigorous, randomised studies.

As for aviation, over the past decade the use of checklists like those used by pilots has become commonplace. Before cutting a patient open, surgeons, anaesthetists and nurses go through a simple exercise to ensure they have the right equipment (and the right patient), know the operation to be performed and understand the risks.

In 2009 another study for the WHO suggested that a simple checklist used in eight hospitals in eight countries cut the rate of death during surgery from 1.5% to 0.8%, and that of complications from 11% to 7%. Since then checklists have become ubiquitous in Danish, French, Irish, Dutch and British hospitals, and are used about half the time in developing countries.

But, again, there are very few randomised studies to bear this out. And, often, medics know procedures are under evaluation, which may change behaviour. Some of the more rigorous studies are disappointing. One published in 2014, of 200,000 surgical procedures in 101 hospitals using checklists in Ontario, Canada, found no link to improved outcomes. A recent study of the use of checklists in obstetric care in India again found no firm link between their introduction and reduced deaths of infants or new mothers. The reasons for these disappointing results “are primarily social and cultural”, suggested an article in the Lancet medical journal co-authored by Charles Bosk, a medical sociologist. He argues that many surgeons feel that using a checklist infantilises them and undermines their expertise.

More promising, then, may be approaches that ask less of doctors themselves. Over the past few years behavioural scientists have begun to try to nudge doctors into better decisions by studying and acting upon their inherent biases. “Default bias”, the tendency to accept the status quo, is powerful in clinical settings. Most doctors, for example, follow the prescription dosages suggested by electronic medical-record (EMR) software. The same is true of the default settings on medical kit. Research in intensive-care units (ICUs) has shown that, on their standard settings, artificial ventilators can put huge pressure on the lungs, tearing tissue and provoking inflammation. Tweaking ventilators to a “low tidal volume” setting is often better, but many doctors do not have the time to make the necessary calculations. In a study published in 2016, doctors at the University of Bristol showed that simply switching the machines’ default settings gave patients safer ventilation.

Established in 2016, the Penn Medicine Nudge Unit, based at the University of Pennsylvania, is the first dedicated behavioural-science unit set up within a health system anywhere. It has shown that care can be safer when better practices are the default, so that doctors must actively opt out of them rather than opt in. For example, just 15% of patients who had suffered heart attacks were being referred for cardiac rehabilitation, because doctors had to opt in to the service and fill out a lengthy form. When referral became the default, with pre-filled forms, the rate rose to 85%.

Opioids offer another example. Many EMR systems are set by default to prescribe 30 pills to patients requiring pain relief, when ten may be sufficient. The consequences can be severe: the more pills in the first opioid prescription, the greater the chance of addiction. By changing the default setting of their EMR, the Penn team doubled the number of patients receiving ten-pill prescriptions.
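The mechanics of such a nudge are simple to sketch. The snippet below is a minimal, hypothetical illustration in Python of an opt-out default; the function name and figures are assumptions for the example, not a description of Penn Medicine’s actual EMR software.

```python
from typing import Optional

# Hypothetical sketch of a "default bias" nudge: the safer option is what
# a doctor gets by doing nothing, and prescribing more requires an active
# override. Names and figures are illustrative only.

SAFER_DEFAULT_PILLS = 10  # the nudged default (many EMRs defaulted to 30)

def opioid_prescription(pills: Optional[int] = None) -> int:
    """Return the number of pills to prescribe.

    Doing nothing yields the safer default; a larger supply requires an
    explicit opt-out rather than being the path of least resistance.
    """
    if pills is None:
        return SAFER_DEFAULT_PILLS
    return pills

assert opioid_prescription() == 10          # the default most doctors accept
assert opioid_prescription(pills=30) == 30  # overriding remains possible
```

The point of the design is that neither option is forbidden; only the effort required to reach each one changes, which is what makes it a nudge rather than a rule.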

Other researchers are exploring the power of design to improve safety. The Helix team, based at St Mary’s hospital in London, is a joint project of Imperial College London and the Royal College of Art. One of its projects involved prescription forms. The team noticed that when doctors had to write out the units of the drug to be prescribed they often made mistakes, confusing milligrams with micrograms, for example. The Helix team redrew the form so that doctors simply had to circle a pre-written unit.

Moving upstream

Perhaps the greatest potential for reducing medical errors, however, lies in new technology. Streams, an app developed by DeepMind, an artificial-intelligence company owned by Google’s parent, is on trial at the Royal Free hospital in London. It is currently being used to alert doctors and nurses more quickly to patients at risk of acute kidney injury, a potentially fatal condition often first detected by blood tests rather than by a patient’s feeling unwell. Instead of having to receive a pager message and then log on to a computer, the medics get an alert to the Streams app on their mobile phone, along with all the data needed to make a quick clinical decision.

In future, Streams may use machine learning to improve how it crunches data. But for now the researchers have focused on making the app useful for clinicians. One concern they are trying to tackle, for example, is “alarm fatigue”. A study of ICU wards found an average of 350 alerts per bed per day; one bed averaged 771. Other research has found that nurses are interrupted every five to six minutes. Little wonder, perhaps, that staff can come to ignore alerts, sometimes with fatal consequences.

Medical technology is saving ever more lives. But by expanding the range of what medicine can do, progress also brings with it new routes for harm. It is surely right that, to tackle these, medicine studies the advances other fields have made in dealing with complexity. But the profession has too often been oddly slapdash in implementing them. Such borrowings, too, need to be subjected to the scientific rigour—and exhaustive testing—that has served medicine so well. It might also help to remember that, for all health care’s dazzling progress, doctors are mere humans.

This article appeared in the International section of the print edition under the headline "Physician, heal thy systems"
