Why even smart people make bad decisions – without knowing it!
On 29 March 2005, Elaine Bromiley was scheduled for a simple, low-risk operation to clear the sinus problems she had been suffering for a couple of years. The ear-nose-throat (ENT) surgeon, let’s call him Dr E, had more than 30 years’ experience, and the anesthetist – let’s call him Dr A – had 16 years’ experience. And the hospital was a top-tier establishment.
At 8.30am, the nurse wheeled Elaine into the operating theatre where Dr A was waiting for her. He inserted a cannula into a vein at the back of her hand and administered the anesthetic at 8.35am. Anesthetics are very powerful drugs that don’t just knock a person out; they also disable many vital bodily functions, which must therefore be assisted. Breathing is assisted by what is called a laryngeal mask. But Dr A could not get the mask into Elaine’s mouth because her jaw muscles had tightened in response to the anesthetic – a familiar issue. So Dr A administered a few more doses of drugs to loosen the muscles and tried a smaller-sized mask. Neither worked. It was now 8.37am, two minutes after she had been put under, and Elaine was beginning to turn blue. Her oxygen level had fallen to 75% (anything below 90% is significantly low). At 8.39am, Dr A tried an oxygen facemask, which sits over the mouth and nose. It still did not do the trick. By 8.43am, Elaine’s oxygen saturation had dropped to 40% – the lowest level the monitor could register. Elaine was in danger of brain swelling and a declining heart rate. Indeed, her heart rate had fallen to 69 beats per minute, then to 50. At this stage, Dr E and another anesthetist joined Dr A and the nurses to help out, but try as they might, they could not get oxygen into the patient.
At 8.47am, the nurses anticipated the next move. The head nurse darted out to fetch the tracheostomy kit, which is used to cut a hole directly into the windpipe – a high-risk option reserved for exactly such emergencies. When she returned, she informed the doctors that the kit was ready to be used. They looked up at her but ignored her suggestion, continuing their attempts to intubate the patient. In the end, the doctors did not use the kit but managed to force the tube down the back of Elaine’s throat. By the time they had finally done so, it was 8.55am; in all, Elaine had been starved of oxygen for more than 20 minutes. She had fallen into a coma.
Dr E took Elaine’s husband aside and explained, “There was a problem with anesthesia. These things do happen sometimes. We don’t know why. The anesthetists did their very best, but it just didn’t work out. It was a one-off. I am so sorry.” There was no mention of the futile attempts at intubation. No mention of the nurse’s attempt to alert them to the growing disaster. No mention of the tracheostomy kit on standby.
On 11 April 2005, Elaine Bromiley died. She was just 37 years old.
How was it that a healthy mother of two who had gone in for a simple, low-risk procedure ended up dying 13 days later? Was it the doctors’ fault? Was it the nurses’ fault? Or was it, as they said, a one-off, and hence no one’s fault? And how was it that the doctors failed to perform the tracheostomy that was so apparent to those watching the failed attempts at intubation?
In the course of the investigations, the following issues were identified as having caused Elaine Bromiley’s death:
1. Hierarchy

Had the nurse been more forceful in suggesting that the doctors switch to a tracheostomy, Elaine Bromiley might still be alive today. However, in the culture of the hospital then – as in many healthcare facilities even today – doctors outrank nurses. It was therefore not seen as the nurses’ place even to suggest a course of action, however right they might be. In hindsight, the tracheostomy was the right course of action at the time it was suggested, but because of hierarchy, the nurse could not make a stand for it, even as the patient was losing her life!

2. Infallibility of doctors

Doctors, especially specialists like those working on Elaine Bromiley, spend many years learning and training to become doctors. Hence, in the operating theatre, they are treated as infallible. Nurses are there to assist; it is not their place to question or to suggest. Indeed, there had been prior instances of nurses losing their jobs for questioning doctors, even when the questioning was legitimate. And this does not happen only in healthcare; it happens in politics, in the military, even in schools! Whenever there is a culture of infallibility, where one group cannot be questioned because they are seen as “unable to be wrong”, there will be a problem of overconfidence, leading to misjudgements or, as in this unfortunate incident, death!

3. Poor situational awareness
When Dr A finally completed the intubation and Elaine’s oxygen saturation had climbed back up to 90% (but not before irreparable brain damage had been done), he was surprised, and in disbelief, that so much time had slipped by. The fact of the matter is that in high-stress situations, our perception of time is warped. We are unaware of time slipping by while we are trying to fix the problem. And in time-sensitive situations such as these, we must always be aware of the whole setting, not just the one element in front of us. Let me give you an example most of you will recognize – a final-year examination. I am sure many of you have had the experience of spending so much time on one challenging math question (out of 50), not wanting to leave any stone unturned, that you no longer had enough time to solve the remaining questions! In such situations, you would have said the same thing: “I was unaware so much time had gone by!” This is akin to winning the battle but losing the war!

4. Inability to learn from failure

Consider what Dr E told Elaine’s husband: “It was a one-off.” This is often cited by doctors to explain away unexpected events. Yet, was it really a one-off? The case may be rare, but the circumstances surrounding Elaine Bromiley’s death were certainly not a one-off! In fact, there have been many cases where doctors’ arrogance, their lack of respect for nurses, or their failure to appreciate the situation cost the life of a patient. But are they investigated? No. With a wave of the hand and a dismissive attitude, they are reported as one-offs! There have not been nearly as many formal failure analyses of such cases in the US healthcare system as there would have been for an air crash. Sure, you might say, in an air crash many people die, so of course there will be an investigation. Yet the US NTSB investigates all accidents and near-misses, even when there is no injury.
And the results are published without ascribing blame. Compare this with healthcare, where incidents and accidents are kept locked away for fear of lawsuits from the victims or their families. This prevents anyone from learning where things went wrong, allowing the same errors to perpetuate.
5. Poor operational procedures

While most medical operations proceed without incident, there should be a procedure for handling exceptions. There was none. Instead, the doctors were left to handle the situation as they saw fit. Indeed, even when the best course of action at the most dire moment was correctly identified by the nurses, the doctors were not obliged to accede to it. The deference given to doctors in acknowledgement of their learning and training had left the hospital with just one procedure – listen to the head doctor in the room! As we shall see in the next point, everyone is subject to biases; that is human nature. And if we do not have an established process that acknowledges these biases and institutes ways to overcome them, then we are bound to make these mistakes again.

6. Biases

According to Rolf Dobelli, author of The Art of Thinking Clearly, there are 99 different cognitive biases (although one suspects the number is even higher than that!). From Dobelli’s list, we can identify survivorship bias, clustering illusion, authority bias, availability bias, story bias, hindsight bias, illusion of control, regression to the mean, groupthink, loss aversion, fundamental attribution error, false causality, halo effect, forecast illusion, action bias, omission bias, self-serving bias, association bias, decision fatigue, effort justification, affect heuristic, alternative blindness, false-consensus effect, ambiguity aversion, in-group out-group bias, illusion of attention, déformation professionnelle, illusion of skill, and the fallacy of the single cause, to name a few. We are not here to uncover these biases (we can do that at a later date), but suffice it to say that the human mind is subject to so many errors and biases that we must guard against being improperly framed by our own thinking – and do the right thing! But of course that is easier said than done, since it means going against our human nature!