Second-guessing in veterinary medicine: pitfalls and problems

Charles O. Cummings, DVM, Tufts Clinical and Translational Science Institute, Tufts University, Boston, MA 02111.

Introduction

Second-guessing refers to the all-too-human tendency to question or criticize someone's—or one's own—actions or decisions after the results of those actions or decisions are already known. In the sports world, second-guessing is frequently known as Monday morning quarterbacking and is a generally harmless pastime. In the health-care field, however, second-guessing can be detrimental, shaking the confidence of clinicians, particularly those early in their careers.

Of course, there are times when retrospective analysis of case-management decisions is appropriate. Structured morbidity and mortality rounds and even informal conversations with mentors and colleagues can help in identifying and correcting deficiencies in clinical care. Yet, for many early-career veterinarians, particularly those in internship or residency programs, second-guessing of their clinical decisions can seem almost constant. Although some of this questioning is reasonable, especially for those in training programs, much of it is not.

A major problem with second-guessing is that it is subject to 2 particular types of bias: hindsight bias and outcome bias. Hindsight bias is the tendency for people who know the outcome of an event to exaggerate the probability that they would have correctly predicted the outcome beforehand.1,2 For example, hindsight bias means that when reviewing the management of a particularly difficult case, individuals who know the diagnosis are likely to overestimate the probability that they would have come to the correct diagnosis had they handled the case themselves. Outcome bias is like hindsight bias but refers to the influence that knowledge of the outcome has on evaluations of decision quality.2 In other words, when you know the outcome was bad, it is only natural to think a different decision would have been more appropriate.

Psychological research into medical decision-making has repeatedly borne out the existence of hindsight and outcome bias.2–5 For example, in 1 study,3 5 groups of 15 physicians each were given a patient's clinical history, examination findings, and laboratory test results and asked to assign a probability estimate to each of 4 possible diagnoses. One group, the foresight group, was given only the clinical information. The 4 other groups, the hindsight groups, were given the same information, but each was also told that a different 1 of the 4 possible diagnoses was correct. For the 2 rarer diagnoses, physicians in the hindsight groups thought themselves 2 to 3 times as likely to have made the correct diagnosis as did those in the foresight group.

Another study4 involved 160 physicians and trainees attending case presentations during 4 clinicopathologic conferences. Half the attendees were asked, before the correct diagnosis was announced (foresight group), to estimate the probability that each of 5 possible diagnoses was correct. The other half were asked, after the correct diagnosis was announced (hindsight group), to estimate the probability they would have assigned to each of the 5 possible diagnoses had they been making the initial diagnosis. Again, the mean probability assigned to the correct diagnosis was significantly higher for the hindsight group than for the foresight group, and the authors concluded that physicians in the hindsight group may have had unfairly negative perceptions of the wisdom of choices that had been made prospectively.4 Put another way, “knowledge of the outcome has enabled mediocre processes to be evaluated as good, and sound processes as poor.”2

In short, second-guessing is easier than first guessing.

What is worse, early-career veterinarians are already likely to second-guess themselves because, rightly or wrongly, they often feel their knowledge does not match that of their more experienced peers. A study6 of the imposter phenomenon at a college of veterinary medicine in the United States found that 50% (74/148) of participating students had frequent or intense imposter feelings. Among participating veterinarians (faculty, residents, and interns), 72% (47/65) of those with 0 to 5 years of experience had frequent or intense imposter feelings, compared with 60% (12/20), 44% (8/18), 50% (3/6), and 15% (5/33) of those with 6 to 10, 11 to 15, 16 to 20, and ≥ 21 years of experience, respectively.

Although research on the effects of second-guessing is lacking, it seems reasonable to suppose that second-guessing by colleagues, whether peers or mentors, could erode the self-confidence of many early-career veterinarians. This could have substantial adverse long-term consequences, both personal and professional. One review,7 for example, found that nurses involved in medical errors could experience constructive change, but could also suffer burnout and moral distress and develop a desire to leave the field of nursing.

Given the potential adverse effects of second-guessing and the likelihood of hindsight and outcome bias, careful consideration should be given before second-guessing clinical decisions made by others. In deciding whether second-guessing is appropriate, I recommend that, first and foremost, you consider whether the patient was harmed. If the patient was not harmed, consider next whether the way you perform your own job was affected (eg, having to deal with patients inappropriately transferred or owners given markedly inaccurate information on prognosis or cost). If the answer is again no, then it might be best not to comment on whatever decision was made. If the patient was not harmed but the way you perform your own job was affected, consider your role in the case. Would the same decision have been made if you had been available for a consultation when needed? If not, then consider accepting the decision that was made and making yourself more available for consultations in the future. If the decision affected your job and your availability was not a factor, a discussion with the individual who made the decision is likely in order. Still, that discussion should be a dialogue, not a lecture or admonishment.

If a patient was harmed, you should consider whether a particularly egregious error was made (eg, administration of a drug that was absolutely contraindicated). If so, an immediate discussion with the person in question, with or without notification of hospital administrators, is in order. Doing so will allow the hospital team to develop a plan for the patient and client as well as take steps to prevent similar mistakes in the future. If no egregious error was made, consider whether you might have made a similar decision without the benefit of outcome knowledge. This is not easy, as it has been shown that an awareness of the existence of hindsight bias does not necessarily negate its effect.4 What has been effective, however, in limiting the effects of hindsight and outcome bias is to explicitly state one or more reasons why an alternative diagnosis or decision might have been made.5 Henriksen and Kaplan2 wrote, “when individuals are forced to write down their own views, their own assumptions, their own uncertainties, and their own tradeoffs, contingencies, and options, they are better able to appreciate the complexity of the decision process which is messy and riddled with unknowns.” If you believe you might have made a similar decision prospectively, acknowledge that to the individual in question and make a plan for the patient. If, after writing down reasons, you do not believe you would have made a similar decision, then discuss the case with the individual. They may have some insight or explanation of which you were unaware. If you still think you would not have made a similar decision, acknowledge the challenges of the case and explain the decisions you would have made instead and the rationales for those decisions. Then, as always, make a plan for the patient.

Notably, there is no place for second-guessing veterinarians, be they colleagues, trainees, or referring veterinarians, to third parties such as clients. Doing so is not productive: it reduces clients' confidence in the veterinarians being second-guessed and the veterinarians' confidence in themselves, all without offering strategies for improvement.

In conclusion, veterinarians should formally recognize the roles of hindsight and outcome bias when considering how to respond to case management decisions made by others. Hindsight and outcome bias make us believe we would have done things differently and suggest the prospective decisions of others were somehow erroneous, when in fact they may have been completely reasonable given the available information. Publicly or privately second-guessing the decisions of others is often inappropriate for reasons both factual and sociological. Early-career veterinarians already tend to have low self-confidence, which may be further eroded by second-guessing by colleagues. There are ways to discuss poor decisions that result in bad outcomes, but this requires first employing strategies to mitigate hindsight and outcome bias and then having a discussion with the individual in question. Doing so might help prevent early-career veterinarians from experiencing burnout and moral distress and leaving the profession.

Acknowledgments

This work was supported by the National Center for Advancing Translational Sciences, National Institutes of Health, Award Number TL1TR002546.

The content is solely the responsibility of the author and does not necessarily represent the official views of the NIH.

References

1. Bornstein BH, Emler AC. Rationality in medical decision making: a review of the literature on doctors' decision-making biases. J Eval Clin Pract 2001;7:97–107.

2. Henriksen K, Kaplan H. Hindsight bias, outcome knowledge and adaptive learning. Qual Saf Health Care 2003;12:ii46–ii50.

3. Arkes HR, Wortmann RL. Hindsight bias among physicians weighing the likelihood of diagnoses. J Appl Psychol 1981;66:252–254.

4. Dawson NV, Arkes HR, Siciliano C, et al. Hindsight bias: an impediment to accurate probability estimation in clinicopathologic conferences. Med Decis Making 1988;8:259–264.

5. Arkes HR, Faust D, Guilmette TJ, et al. Eliminating the hindsight bias. J Appl Psychol 1988;73:305–307.

6. Appleby R, Evola M, Royal K. Impostor phenomenon in veterinary medicine. Educ Health Prof 2020;3:105.

7. Lewis EJ, Baernholdt M, Hamric AB. Nurses' experience of medical errors: an integrative literature review. J Nurs Care Qual 2013;28:153–161.