When she was a nurse, Karen Frush recalls, she was working with a child in the emergency room. The resident physician asked Frush to quickly grab some heparin flush, a blood thinner used to prevent blockages in IV lines. She ran to the nurses’ station, grabbed the vial and returned to the young patient’s bedside. Within moments, she had drawn up the medication, and the resident began injecting it.
Immediately, the child cried out in pain. The medical team stopped. They all realized what had happened: The heparin flush vial looked remarkably similar to another vial—potassium. In her rush, Karen had gotten the vials mixed up. The mistake would have been lethal had the resident injected the whole syringe. Fortunately, the patient was fine.
Karen was not. Guilt-ridden and fearful of the consequences of her mistake, she considered quitting. It was only years later, when a national recommendation to separate look-alike and sound-alike drugs was announced, that she realized she wasn’t the only one who had made that kind of error.
In the medical world, the saying “To err is human” is associated with a well-known report emphasizing that errors in health care are more often due to systemic, rather than individual, failings. But admitting that someone was hurt by a mistake you made, or one made on your watch, can still be personally daunting. For many, that fear is enough to put them off admitting errors altogether.
When no one talks about making mistakes, it can seem like you’re the only one struggling. And more importantly, when we aren’t talking about our mistakes, we aren’t fixing them.
Health Care Has a Trust Issue
An estimated 98,000 people die every year as a result of medical errors in hospitals. Despite progress in making it easier for medical professionals to disclose mistakes, many still hesitate. A 2016 report found that 7 percent of physicians think it’s acceptable to hide clinical errors that could harm a patient, and another 14 percent think it depends on the situation. And those are only the physicians who admit they would hide mistakes; the true number is almost certainly higher.
There are many reasons a physician might hide their own mistakes or cover for a colleague: fear of retaliation, losing the respect of peers or superiors, lacking the time to file the paperwork, and so on. Even when honest mistakes happen, being open about them can be hard. But the problem isn’t bad people in health care; it’s good people working in a system where they don’t feel safe reporting errors. And when trust in a system is lacking, problems escalate.
Behavioral Science Can Help
A key insight from behavioral science is that people tend to use the same heuristics, or rules of thumb, to help them make decisions. Most of the time, these rules are very beneficial. Sometimes, however, they can lead to predictable errors in our judgment.
Because we know that many of the errors people make are similar and systematic, applying insights from behavioral science to the design of health care systems can help medical professionals avoid mistakes. It can also be used to help increase trust between medical professionals and the institutions they work for, creating a culture in which doctors are comfortable with reporting the mistakes they make.
To increase the trust between physicians and the health care institutions they work for, several approaches can be taken. Here are a few:
Increase accountability. When our behaviors are being monitored or evaluated, we are less likely to misbehave. Several hospitals are starting to increase accountability, often through initiatives such as the “Just Culture” framework.
Even for less-consequential slights, peer accountability can be a powerful tool. Duke Health System has instituted a “cup of coffee” program, based on a similar system at Vanderbilt University: trained messengers invite colleagues for a brief chat to discuss an incident of poor behavior. Superiors are not informed; an informal, frank dialogue with a peer is impactful enough. The program at Duke is still new, but of the 85 people who have had a “cup of coffee” conversation, only three needed a second one. At Vanderbilt, 70 percent of the people contacted had no repeat complaints.
Make sure incentives are aligned. The way many systems are currently structured, employees who report their own mistakes are punished, while those who hide them go unpunished. An employee taking the blame may even serve the hospital’s interest, because the institution can then attribute the mistake to “human error” rather than to systemic failings it should address.
To increase trust, make sure that doing the right thing and reporting a mistake isn’t the same as taking the full blame for what happened. Reporting a mistake should be the start of a conversation in which both the physician and the institution review what they could have done better. In cases of public scrutiny, an institution often gains more trust by taking some of the responsibility itself rather than throwing physicians under the bus. In other words, institutions should positively reinforce the “good” behavior and show that they are on the same side as the physicians.
Have regular conversations about medical professionalism. Research shows that being reminded of our integrity helps us improve our behavior. In a series of experiments by Nina Mazar, On Amir and Dan Ariely, participants who were reminded of moral standards cheated less on a test. That experiment asked participants to recall the Ten Commandments, but the key was the reminder itself, not its religious content: even self-described atheists who recalled the Ten Commandments cheated less. Reminders of morality and ethics can take many forms.
We created the Medical Professionalism Project to help get these conversations started. A collaboration between behavioral scientist Dan Ariely, the Center for Advanced Hindsight and Academy Award–winning producer and director Yael Melamede, our film-based course brings together experts in behavioral science and medicine to address some of the pressing issues providers face today.
Set the example. Thankfully, Karen Frush did not quit her job after the incident with the mistaken potassium vial. She continued in her career, studied to become a doctor, and pursued a specialty in emergency pediatrics. She is now the chief patient safety officer at Duke University Health System, where she works to improve patient safety and the emergency care of children. Her story serves as an example both of how difficult it is to admit and accept error, and of how openness can improve the health care system.
By sharing her experiences with her students, Karen leads by example and continues a dialogue on the importance of these issues. Only by doing so can we bring about much-needed change in how mistakes are brought up and addressed.