Cognitive Biases as Contributing Factors to Medical Errors

“The human brain is a complex organ with the wonderful power of enabling man to find reasons for continuing to believe whatever it is that he wants to believe.” – Voltaire (as cited in O’Sullivan & Schofield, 2018)

Cognitive Biases Defined

Cognitive biases, defined as “flaws or distortions in judgment and decision-making” (Joint Commission, 2016, p. 1), have been identified as contributing factors to medical errors and sentinel events, including retained foreign objects, wrong-site surgeries, falls, and delays in care (Joint Commission, 2016). The literature suggests that diagnostic errors are associated with 6–17 percent of adverse events in the hospital setting, and that cognitive errors contribute to 28 percent of these diagnostic errors.

Human Problem Solving and Decision Making 

To better understand how cognitive biases arise, let’s review how humans solve problems and make decisions. There are two modes of processing information: fast thinking and slow thinking. Fast thinking is an intuitive, unconscious process based on pattern recognition; repeated experiences lead to quick conclusions about the situation at hand. This type of processing can be helpful and efficient, but it can also hinder the ability to consider alternatives and lead to wrong conclusions based on incomplete information. In contrast, slow thinking is more time-intensive and analytical (Joint Commission, 2016). It takes more brainpower and involves higher-level critical thinking. Both modes are used in decision making, and both are prone to cognitive biases under the right conditions: fatigue, distractions, complicated patient scenarios, and the like.

More than 100 cognitive biases have been identified. Many fall under the following general categories: errors based on the frequency with which events are encountered (availability bias), fixation on a specific diagnosis (anchoring), failure to consider other diagnoses (Sutton’s slip), failure to question how the situation has been framed or one’s own expectations (framing effect, ascertainment bias), errors related to patient characteristics (gender bias), and errors related to provider personality (visceral bias) (Fondahn et al., 2016). This article will focus on a few examples to give a general overview.

Case Scenario

Consider the following case of a patient who experienced a uterine rupture, and how such an event can give rise to availability bias.

A multiparous lady with one previous cesarean section at 39 + 4 weeks of gestation came with spontaneous rupture of membranes. Oxytocin augmentation was started the next day, and 2 h later, she was found to be 4 cm dilated with fore-waters intact, and hence, artificial rupture of membranes was performed. Some early decelerations were found on the CTG trace when she was 7 cm dilated. An hour later, bradycardia was noted to be 60 beats per minute. Clear liquor was noted. Emergency cesarean section was undertaken within 10 min, and a baby girl was found to be in the abdominal cavity; the entire scar rupture was noted. The arterial cord pH was 6.981 and the venous cord pH was 6.976. The Apgar scores were 4 at 1 min and 7 at 5 min. (Revicky et.al., 2012, p.667)

Experiencing a catastrophic event such as this can lead to availability bias: basing clinical decisions about future events on recent or more memorable ones. In this case, the experience of a severe uterine rupture could lead to an increase in cesarean deliveries. The literature supports this concept, including a study by Riddell et al. (2014) that evaluated hospitals’ cesarean delivery rates among patients undergoing a trial of labor after cesarean delivery, before and after each hospital cared for a patient with uterine rupture. An increase in the cesarean delivery rate was noted in the month following a severe uterine rupture.

Other Examples of Cognitive Biases

  • Availability bias – also refers to medical decision-making based on commonly encountered scenarios; for example, diagnosing a frequently seen condition such as the flu instead of considering alternatives.
  • Anchoring bias – jumping to conclusions based on first impressions, despite additional information emerging as the scenario unfolds.
  • Sutton’s slip – failing to consider other diagnoses because of a focus on the most obvious one.
  • Ascertainment bias – the influence of prior experiences and expectations on decision-making; for example, dismissing the repeat complaints of a patient labeled a “frequent flier.”
  • Framing effect – decision-making shaped by how information is presented; for example, an intern reports to a provider that the fetal tachycardia seen in a 37-week-gestation patient status post abdominal trauma is due to maternal anxiety, and the provider fails to consider other causes of fetal tachycardia.
  • Gender bias – when the gender of the patient or provider influences medical decision-making.
  • Visceral bias – when personal feelings toward a patient influence medical decision-making (Fondahn et al., 2016).

A number of factors can contribute to cognitive biases, including fatigue, complex patient presentations, distractions, and poor teamwork and communication. Organizations should consider raising awareness and developing systems that help mitigate cognitive biases, such as reducing distractions, improving information displays, and providing training that promotes critical thinking and teamwork (Joint Commission, 2016).

*If this article interests you, you may also enjoy my book, Obstetric and Neonatal Quality and Safety (C-ONQS) Study Guide: A Practical Resource for Perinatal Nurses, available on Amazon.

References

Fondahn, E., Lane, M. A., & Vannucci, A. (2016). The Washington Manual of patient safety and quality improvement. Wolters Kluwer.

Joint Commission (2016). Cognitive biases in healthcare. Quick Safety, Issue 28, 1–3. Retrieved from https://www.jointcommission.org/resources/news-and-multimedia/newsletters/newsletters/quick-safety/quick-safety-28/

O’Sullivan, E., & Schofield, S. (2018). Cognitive bias in clinical medicine. Journal of the Royal College of Physicians of Edinburgh, 48(3), 225–232. doi:10.4997/jrcpe.2018.306

Revicky, V., Muralidhar, A., Mukhopadhyay, S., & Mahmood, T. (2012). A case series of uterine rupture: Lessons to be learned for future clinical practice. The Journal of Obstetrics and Gynecology of India, 62(6), 665–673. doi:10.1007/s13224-012-0328-4

Riddell, C. A., Kaufman, J. S., Hutcheon, J. A., Strumpf, E. C., Teunissen, P. W., & Abenhaim, H. A. (2014). Effect of uterine rupture on a hospital’s future rate of vaginal birth after cesarean delivery. Obstetrics & Gynecology, 124(6), 1175–1181. doi:10.1097/aog.0000000000000545

Royce, C. S., Hayes, M. M., & Schwartzstein, R. M. (2019). Teaching critical thinking. Academic Medicine, 94(2), 187–194. doi:10.1097/acm.0000000000002518

Copyright by Jeanette Zocco MSN, RNC-OB, C-EFM, C-ONQS
