To misdiagnose is human

Patient details have been changed to protect patient privacy.

A previously healthy man sees his primary care physician complaining of fatigue.  Laboratory studies show abnormalities that raise his physician’s concern for cancer.  Imaging reveals enlarged lymph nodes, and he is admitted to the hospital to determine the type of cancer and begin treatment.  Several biopsies later, no malignancy is found, although his medical team remains suspicious.

A rheumatology evaluation reveals that his history, physical exam, and laboratory studies strongly suggest an autoimmune disease.  Nevertheless, the medical team sends him for a fourth biopsy to conclusively rule out malignancy.  This, too, proves negative.

This case made me think about the biases that all of us have when evaluating patients.  What made this primary care physician, from the very start, believe that his patient had cancer?  Why was his medical team so committed to finding cancer, despite a more likely alternative diagnosis?  How can I avoid making these diagnostic errors in the future?

Much discussion has focused on medical errors since the 1999 Institute of Medicine report, which suggested that up to 98,000 patients die in American hospitals every year as a result of preventable medical errors.  The report describes many types of errors responsible for these excess deaths, including errors of diagnosis.  With my patient, I believe that cognitive errors in the diagnostic process delayed treatment and exposed him to unnecessary procedures.

It is difficult to estimate the burden of misdiagnosis in the United States.  A meta-analysis in a recent issue of BMJ Quality & Safety suggested that up to 5% of outpatient diagnoses may be incorrect, representing 12 million misdiagnoses per year.  Of these, up to 74% may be completely or partially due to cognitive errors.  A better understanding of why we make these errors may help us prevent misdiagnoses in the future.

I first learned about cognitive errors from an insightful book, Thinking, Fast and Slow, by Nobel laureate Daniel Kahneman (it should be required reading for all physicians).  In it, he describes the shortcuts that our brains take when making decisions under uncertainty, including medical diagnoses.  These shortcuts, which he calls System 1, help us manage our daily lives, allowing us to make quick, accurate predictions without much thought.  However, Kahneman found that System 1 thinking sometimes results in predictable errors.  These errors can lead us, and our patients, into trouble.

System 1 thinking is responsible for the error that Kahneman called “anchoring.”  This occurs when we fail to adjust our initial impressions, even after additional information is given.  For example, when people are asked, “Was Gandhi younger or older than 144 when he died?” they respond “younger.”  When then asked to estimate his age at death, they say he died in his 80s or 90s.  However, when others are asked, “Was Gandhi younger or older than 44 when he died?” they estimate his age at death at around 50.  In each case, the final response stays close, or “anchored,” to the initial value given.

A similar anchoring effect occurs in medicine, when early impressions of a patient limit the differential diagnosis.  In my patient with fatigue and swollen lymph nodes, cancer was appropriately considered during his initial visit, but all of the physicians who evaluated him afterwards were anchored to the idea that he had cancer.  Thus, his workup focused on figuring out which type of cancer he had, instead of whether he actually had cancer.

A second error that arises from System 1 thinking is the “availability” bias: our tendency to estimate the probability of an event by the ease with which a similar event is recalled.  For example, if people had been asked about the safety of flying before the disappearance of Malaysia Airlines Flight 370, they would have said that flying is relatively safe.  Asked the same question after the event, people are much more likely to give a negative response, despite the fact that flying remains one of the safest forms of travel.

Unfortunately, doctors are not exempt from the availability bias.  We are influenced by recent dramatic or unusual cases, and may wrongly attribute the same diagnosis to a new patient we encounter.  The availability bias probably becomes more pronounced when physicians sub-specialize, managing patients with only a limited number of diseases.  This bias of seeing the world as you are, instead of as it is, was dramatically illustrated in a classic 1994 study about the workup of back pain.  Physicians of various specialties were given identical case scenarios of a patient with back pain and were asked how they would arrive at a diagnosis.  Researchers found that physicians of different specialties had drastically different approaches to the same patient: “neurosurgeons and neurologists are much more likely to order imaging studies, rheumatologists are much more likely to order laboratory tests, and physiatrists and neurologists are much more likely to order electromyograms.”  Clearly, specialists were looking for different etiologies of back pain.  The authors called this effect “who you see is what you get.”  Kahneman referred to the same phenomenon as WYSIATI (what you see is all there is).

The availability and anchoring biases are only two of some 30 cognitive biases that physicians regularly encounter in daily practice.  As a result, we commit diagnostic errors even when we are confident about a diagnosis.

How can we, as doctors, improve our diagnostic skills?  Fortunately, Kahneman also describes System 2, a way of thinking that helps correct the biases that result from System 1.  Unfortunately, System 2 thinking takes hard work: it is effortful, slow, and deliberate.  We can invoke System 2 when we are forced to reconsider our initial impressions.  A recent study showed that, simply by reflecting on their reasoning, doctors can improve their diagnostic accuracy when confronted with complex cases.

Another way to overcome our biases is to use tools that help us stay in better touch with reality.  Statistics can help ensure that “when we hear hoofbeats, we think of horses, not zebras.”  Having (and using) data about the incidence of diseases can be effective in overcoming the availability bias.  If we know that only 5% of patients with lymph node enlargement have cancer, we can focus on evaluating the other, more common causes.

In addition, incorporating probability into the clinical encounter, especially when building the differential diagnosis and interpreting test results, may be important.  Unfortunately, people do not have an inherent understanding of probability.  Kahneman found this out during his experiments in the 1970s, and a study recently published in JAMA Internal Medicine showed that physicians are no exception.  A greater focus on statistics during premedical and undergraduate medical education can arm future physicians with the tools to make better clinical decisions.
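To make the point concrete, here is a toy Bayes’ theorem calculation.  All of the numbers (a 5% disease prevalence, 90% test sensitivity, 80% specificity) are invented for illustration and do not describe any real test:

```python
# Toy Bayes' theorem example: how likely is disease after a positive test?
# All numbers below are hypothetical, chosen only to illustrate base rates.
prevalence = 0.05    # P(disease): 5% of patients with this finding have the disease
sensitivity = 0.90   # P(positive test | disease)
specificity = 0.80   # P(negative test | no disease)

# Total probability of a positive test, across diseased and healthy patients:
# P(pos) = P(pos | disease) * P(disease) + P(pos | no disease) * P(no disease)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' theorem: P(disease | positive test)
posterior = sensitivity * prevalence / p_positive
print(round(posterior, 2))  # prints 0.19
```

Even with a fairly good test, the chance of disease after a positive result is only about 19%, because the disease is uncommon to begin with.  This is exactly the base-rate reasoning that System 1 tends to skip.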

Computers are also increasingly being used to help physicians make difficult diagnoses.  iPhone apps are being developed to help diagnose melanoma.  SimulConsult, a diagnostic decision-support program, has shown promise in diagnosing patients with genetic or neurologic disorders.  IBM’s Watson is already helping oncologists diagnose cancer and provide individualized treatment plans for patients.  I can only imagine that computers (and “big data”) will become more prominent members of the healthcare team, no longer used only for record-keeping but incorporated into the diagnostic evaluation and treatment decisions of patients.

Weather forecasts improve when meteorologists and computers work together to predict future conditions.  In chess, a human and a computer working together can beat any individual human or computer program.  Similarly, I believe our diagnostic skills will improve significantly when physician and computer work together to solve a patient’s diagnostic dilemma.

Fortunately, I don’t think our job as healers will be diminished by the growing role that computers will play in the examination room of the future.  Physicians will still be needed to lay hands on a patient, palpate an enlarged spleen, detect a swollen joint, and note the use of accessory muscles in a patient with respiratory distress.  Unlike computers, physicians can take into account aspects of a patient’s history or physical exam that are difficult to encode in an algorithm.  Physicians and computers, working together, have the potential to bring about the next great revolution in medicine.

Making an accurate diagnosis is only the first step in providing excellent patient care.  Caring for the patient, providing guidance, encouragement, comfort, and support will still require a human heart.

After his fourth negative biopsy for cancer, the patient began treatment for his autoimmune disorder.  Within a few weeks of starting treatment, his symptoms, laboratory abnormalities, and lymph node enlargement almost completely resolved.  He continues to be followed closely by the rheumatology department.
