“The human brain is a complex organ with the wonderful power of enabling man to find reasons for continuing to believe whatever it is that he wants to believe.” – Voltaire (O’Sullivan & Schofield, 2018).
Cognitive biases, defined as “flaws or distortions in judgment and decision-making” (The Joint Commission, 2016, p. 1), have been identified as contributing factors to medical errors and sentinel events. According to The Joint Commission (2016), these include events such as retained foreign objects, wrong-site surgeries, falls, and delays in care. Moreover, the literature has shown that cognitive errors contribute to 28 percent of diagnostic errors in the hospital setting, and diagnostic errors are associated with 6 to 17 percent of all adverse events (The Joint Commission, 2016).
This article discusses how humans process information as it relates to the development of cognitive biases. Several types of cognitive biases are described, along with a case example of a uterine rupture that illustrates availability bias.
Human problem-solving and decision-making
Let’s review how humans solve problems and make decisions, and how these processes relate to cognitive biases. The human brain has two modes of processing information: fast thinking and slow thinking. Fast thinking is an intuitive, unconscious process based on pattern recognition; repeated experiences lead to quick conclusions. This type of processing is efficient, but it can hinder the consideration of alternatives and lead to wrong conclusions based on incomplete information (The Joint Commission, 2016).
In contrast, slow thinking is more time-intensive and analytical (The Joint Commission, 2016). It requires more brain power and involves higher-level critical thinking. Both fast and slow thinking modes can be used in decision-making, and both are prone to cognitive biases under certain conditions, such as fatigue, distractions, and complicated clinical scenarios.
Over 100 cognitive biases have been identified. Many of them fall into general categories: errors influenced by how frequently or recently similar cases have been encountered (availability bias), fixation on an initial impression or diagnosis (anchoring bias), failure to consider alternative diagnoses beyond the most obvious one (Sutton’s slip), errors shaped by how a situation is presented or by prior expectations (framing effect, ascertainment bias), errors related to patient characteristics (gender bias), and errors related to the provider’s feelings toward the patient (visceral bias) (Fondahn et al., 2016).
Case example
“A multiparous lady with one previous cesarean section at 39 + 4 weeks of gestation came with spontaneous rupture of membranes. Oxytocin augmentation was started the next day, and 2 h later, she was found to be 4 cm dilated with fore-waters intact, and hence, artificial rupture of membranes was performed. Some early decelerations were found on the CTG trace when she was 7 cm dilated. An hour later, bradycardia was noted to be 60 beats per minute. Clear liquor was noted. Emergency cesarean section was undertaken within 10 min, and a baby girl was found to be in the abdominal cavity; the entire scar rupture was noted. The arterial cord pH was 6.981 and the venous cord pH was 6.976. The Apgar scores were 4 at 1 min and 7 at 5 min” (Revicky et al., 2012, p. 667).
Catastrophic events such as this one can lead to availability bias in healthcare practitioners, in which recent adverse events unduly influence clinical decision-making. In this case, after caring for a patient with a severe uterine rupture, a practitioner may be more likely to recommend precautionary cesarean delivery for other patients.
Studies support the concept of availability bias. Riddell et al. (2014) evaluated cesarean delivery rates in patients undergoing a trial of labor after cesarean delivery, comparing rates before and after a hospital cared for a patient with uterine rupture. For one month after a hospital experienced a uterine rupture, its cesarean delivery rate increased.
Types of cognitive biases
Availability bias refers to medical decision-making based on scenarios commonly encountered, such as diagnosing a frequently seen condition like the flu instead of considering alternatives. It also refers to clinical decision-making driven by recent adverse events, as described above.
Anchoring bias is drawing conclusions based on first impressions and failing to adjust them even after additional information has been added to the scenario.
Sutton’s slip refers to lack of consideration of alternative diagnoses due to focusing on the most obvious one. The name is derived from a Brooklyn bank robber named Willie Sutton. When asked at his trial why he robbed banks, he responded, “Because that’s where the money is” (Croskerry, 2002, p. 1196).
Ascertainment bias describes the influence of prior expectations on decision-making and includes stereotyping and gender bias. For example, a provider assumes that an unconscious patient with a known history of drug use has overdosed, when in fact the patient is hypoglycemic.
Framing effect describes decision-making based on how information is presented. For example, an intern reports to a provider that fetal tachycardia in a patient at 37 weeks of gestation who experienced abdominal trauma is due to maternal anxiety; framed this way, the report leads the provider to overlook other causes of the fetal tachycardia.
Gender bias is when the gender of the patient or the provider influences medical decision-making.
Visceral bias occurs when personal feelings toward a patient influence medical decision-making (Fondahn et al., 2016).
Many factors can contribute to cognitive biases, including fatigue, complex patient presentations, distraction, and poor teamwork or communication. Organizations can raise awareness of cognitive biases and develop systems to mitigate them, such as reducing distractions, posting helpful informational displays, and providing training on critical thinking and teamwork (The Joint Commission, 2016).
*If this article interests you, you may also enjoy my book, Obstetric and Neonatal Quality and Safety (C-ONQS) Study Guide: A Practical Resource for Perinatal Nurses, available on Amazon.
Copyright by Jeanette Zocco RNC-OB, C-EFM, C-ONQS
References
Croskerry, P. (2002). Achieving quality in clinical decision making: Cognitive strategies and detection of bias. Academic Emergency Medicine, 9(11), 1184-1204. https://doi.org/10.1197/aemj.9.11.1184
Fondahn, E., Lane, M. A., & Vannucci, A. (2016). The Washington manual of patient safety and quality improvement. Wolters Kluwer.
O’Sullivan, E., & Schofield, S. (2018). Cognitive bias in clinical medicine. Journal of the Royal College of Physicians of Edinburgh, 48(3), 225-232. https://doi.org/10.4997/jrcpe.2018.306
Revicky, V., Muralidhar, A., Mukhopadhyay, S., & Mahmood, T. (2012). A case series of uterine rupture: Lessons to be learned for future clinical practice. The Journal of Obstetrics and Gynecology of India, 62(6), 665-673. https://doi.org/10.1007/s13224-012-0328-4
Riddell, C. A., Kaufman, J. S., Hutcheon, J. A., Strumpf, E. C., Teunissen, P. W., & Abenhaim, H. A. (2014). Effect of uterine rupture on a hospital’s future rate of vaginal birth after cesarean delivery. Obstetrics & Gynecology, 124(6), 1175-1181. https://doi.org/10.1097/aog.0000000000000545
Royce, C. S., Hayes, M. M., & Schwartzstein, R. M. (2019). Teaching critical thinking. Academic Medicine, 94(2), 187-194. https://doi.org/10.1097/acm.0000000000002518
The Joint Commission. (2016). Cognitive biases in healthcare. Quick Safety, Issue 28, 1-3. https://www.jointcommission.org/resources/news-and-multimedia/newsletters/newsletters/quick-safety/quick-safety-28