Cover Page

Errors in Veterinary Anesthesia

John W. Ludders, DVM, DipACVAA

Professor Emeritus, College of Veterinary Medicine,
Cornell University, Ithaca, NY, USA

Matthew McMillan, BVM&S, DipECVAA, MRCVS

Clinical Anaesthetist, Department of Veterinary Medicine,
University of Cambridge, Cambridge, UK

To veterinary anesthetists who err and wonder why

Preface

It’s a busy night of emergencies. A puppy, having eaten its owner’s socks, is undergoing exploratory abdominal surgery for gastrointestinal obstruction. A young tomcat is being treated for urinary tract obstruction, and a Pomeranian with a prolapsed eye has just been admitted, as has a dog with multiple lacerations from a dog fight. A German shepherd with gastric dilatation and volvulus is being treated in one of the two bays in the emergency treatment room. This evening is also a new employee’s first night on duty; she and two other staff members are assisting with the German shepherd. After initial stabilization, the dog is anesthetized with fentanyl and propofol and then intubated to facilitate passing a stomach tube to decompress its stomach. The new employee, who is unfamiliar with the emergency practice’s standard operating procedures, facilities, and equipment, is told to attach an oxygen insufflation hose to the endotracheal tube. Rather than attaching the hose to a flow-by device, a small device located out of sight at the other treatment bay, she inserts it directly into the endotracheal tube, partially obstructing the patient’s airway; the oxygen flow is set at 5 L min−1 (Figure 1). No one notices the error because the rest of the team is focused on inserting the stomach tube; within a few minutes the dog has a cardiac arrest. During CPR, which is ultimately unsuccessful, the team recognizes that the dog has a pneumothorax, and its source is quickly identified.


Figure 1 a) Insufflation hose inserted into an endotracheal tube, almost completely occluding it. b) Flow-by device with the connector to which the oxygen insufflation hose is supposed to attach. The flow-by device is nothing more than an endotracheal tube (ETT) adaptor fitted with a connector normally used with a gas analyzer for sampling and analyzing airway gases from an anesthetized, intubated animal. Using it to insufflate oxygen is an unusual application of the device, and probably not one familiar to the practice’s new employee.

Why do well-trained and caring professionals make errors such as this? How should the veterinarian in charge of the emergency respond to this accident? How can a veterinarian or practice anticipate an error or accident such as this so that it can be avoided, or prevented from occurring again? Both of us have thought about and explored the hows and whys of errors that occur during anesthesia. Based on our experiences as anesthetists and as teachers of anesthesia to veterinary students and residents, it is our opinion that the answers lie in the reality that we can and must learn from errors; they are learning opportunities, not personal or professional stigmata highlighting our failings for all to see. How those of us involved in veterinary medicine, specifically those of us doing and teaching veterinary anesthesia, can learn from errors is the purpose of this text.

John W. Ludders
Matthew McMillan

Acknowledgments

We cannot thank enough those who have supported and encouraged us in the writing of this book. A number of colleagues kindly read various drafts, and their comments helped us clarify our initial thoughts as to how the topic of errors in veterinary anesthesia should be approached. Most important was their encouragement to push on with the writing.

Some colleagues expended a great deal of time in reviewing a first draft. Looking back at that draft makes us appreciate their efforts even more. Two individuals deserve special mention. Dr Erik Hofmeister, at the University of Georgia, pointed out details we came to realize were not important to the subject matter; his comments were especially helpful in structuring and writing what is now Chapter 2. Dr Daniel Pang, at the University of Calgary, made extensive comments throughout the draft, comments that encouraged us to use terminology more consistently, to develop concepts more thoroughly, and to link them more effectively to the cases we have included.

When faced with questions about various concepts in patient safety, concepts that were evolving even as this book was being written, we sought advice from individuals who are experts in various aspects of this field. We were pleasantly surprised to find that these individuals were approachable and willing to give of their time and share their knowledge. Dr Marjorie Steigler, of the University of North Carolina-Chapel Hill and Director of the Consortium of Anesthesiology Patient Safety and Experiential Learning, provided references and her thoughts concerning effective training strategies for residents. Dr Allisa Russ, of the Roudebush Veterans Administration Medical Center and the Regenstrief Institute, who is involved in developing processes for reducing medication errors, generously shared her knowledge about human factors analysis, what it is and what it is not.

John Wiley & Sons, our publisher, have also been very supportive since we first approached them. We were both surprised and thankful that the concept of writing a book about error in veterinary anesthesia, something we feared might have been perceived as too much of a niche subject, was accepted in such an enthusiastic manner.

Throughout the writing process our families have encouraged and assisted us in our efforts. In particular, our wives, Kathy (J.W.L.) and Sam (M.W.M.), have tolerated a huge amount of inconvenience and disruption of the routines of family life. Without their support, encouragement, and sacrifices this book would not have been possible. We are as always in their debt.

Finally, as with any book, there will be errors, and they are ours alone; we are, after all, only human.

Introduction

Knowledge and error flow from the same mental sources, only success can tell the one from the other.

Ernst Mach, 1905

There are many veterinary anesthesia texts on how to anesthetize a variety of animal patients; such is not the purpose of this text. It is, however, very much concerned with the processes involved in anesthetizing animal patients, from pre-anesthetic assessment to recovery, and it approaches those processes by seeking answers to how and why errors occur during anesthesia. In this text we define an error as a failure to carry out a planned action as intended (error of execution), or the use of an incorrect or inappropriate plan (error of planning), while an adverse incident is a situation where harm has occurred to a patient or a healthcare provider as a result of some action or event. How can those who are responsible for the anesthetic management of patients detect and manage unexpected errors and accidents during anesthesia? How can we learn from errors and accidents?

In the heat of the moment when a patient under our care suffers a life-threatening injury or dies, it is natural to look for something or someone to blame; usually the person who “made the mistake.” This is a normal response. Subsequently we may reprimand and chastise the individual who caused the accident and, by so doing, assume we’ve identified the source of the problem and prevented it from ever occurring again. Unfortunately, such is not the case because this approach fails to take into account two realities: (1) all humans, without exception, make errors (Allnutt 1987); and (2) errors are often due to latent conditions within the organization, conditions that set the stage for the error or accident and that were present long before the person who erred was hired. We can either acknowledge these realities and take steps to learn from errors and accidents, or we can deny them, for whatever reasons, be they fear of criticism or litigation, and condemn ourselves to make the same or similar errors over and over again (Adams 2005; Allnutt 1987; Edmondson 2004; Leape 1994, 2002; Reason 2000, 2004; Woods 2005).

In general there are two approaches to studying and solving the problem of human fallibility and the making of errors: the person approach (also called proximate cause analysis) and the systems approach (Reason 2000). The person approach focuses on individuals and their errors, blaming them for forgetfulness, inattention, or moral weakness; it sees errors as arising primarily from aberrant mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness (Reason 2000). Those who follow this approach may use countermeasures such as poster campaigns that appeal to people’s sense of fear, develop new procedures or add to existing ones, discipline the individual who made the error, threaten litigation, or name, blame, and shame the individual who erred (Reason 2000). It is an approach that tends to treat errors as moral issues because it assumes bad things happen to bad people—what psychologists call the “just world hypothesis” (Reason 2000).

In contrast, the systems approach recognizes the fundamental reality that humans always have and always will make errors, a reality we cannot change. But we can change the conditions under which people work so as to build defenses within the system, defenses designed to avert errors or mitigate their effects (Diller et al. 2014; Reason 2000; Russ et al. 2013). Proponents of the systems approach strive for a comprehensive error management program that considers the multitude of factors that lead to errors, including organizational, environmental, technological, and other system factors.

Some, however, have misgivings about these two approaches as means of preventing errors in medical practice. A prevalent view is that clinicians are personally responsible for ensuring the safe care of their patients, and that a systems or human factors analysis approach will lead clinicians to behave irresponsibly, that is, to blame errors on the system and not take personal responsibility for them (Leape 2001). Dr Lucian Leape, an advocate of the systems approach, points out that such thinking only perpetuates the culture of blame that permeates healthcare (Leape 2001). The essence of systems theory is that human errors are caused by system failures that can be prevented by redesigning work environments so that it is difficult or impossible to make errors that harm patients (Leape 2001). Leape contends that this approach does not lessen a clinician’s responsibility, but deepens and broadens it; when an error does occur the clinician has a responsibility—an obligation—to future patients to ask how the error could have been prevented, thus questioning the system with all of its component parts. Leape goes on to say that fears about “blameless” medicine are unfounded and are related to the universal tendency to confuse the making of an error with misconduct (Leape 2001). Misconduct, the willful intent to mislead or cause harm, is never to be tolerated in healthcare. Multiple studies in many different types of environments, including healthcare, have shown that the majority of errors—95% or more—are made by well-trained, well-meaning, conscientious people who are trying to do their job well, but who are caught in faulty systems that set them up to make mistakes and who become “second victims” (Leape 2001). People do not go to work with the intent of making errors or causing harm.

This text is written with a bias toward the systems approach, a bias that has grown out of our experiences as anesthetists, as teachers of anesthesia to veterinary students, residents, and technicians, and as individuals who believe in the principles and practices underlying continuous quality improvement. This latter stance is not unique and reflects a movement toward the systems approach in the larger world of healthcare (Chang et al. 2005).

No part of this book is written as a criticism of others. Far from it. Many of the errors described herein are our own or those for which we feel fully responsible. Our desire is to understand how and why we make errors in anesthesia so as to discover how they can be prevented, or more quickly recognized and managed. We believe that the systems approach allows us to do just that. It is also an approach that can be used to help teach the principles of good anesthetic management to those involved in veterinary anesthesia. This approach also has broader applicability to the larger world of veterinary medicine.

This text consists of eight chapters. The first chapter is divided into two sections, the first of which briefly discusses terminology and the use of terms within the domain of patient safety. The reader is strongly encouraged to read the brief section on terminology because it defines the terms we use throughout this book. Terms, in and of themselves, do not explain why or how errors occur; that is the purpose of the second section, which provides some answers to the “whys” and “hows” of error genesis. This discussion draws upon a large body of literature representing the results of studies into the causes and management of errors and accidents; a body of literature spanning the fields of psychology, human systems engineering, medicine, and the aviation, nuclear, and petrochemical industries. This section is not an exhaustive review of the literature, but is meant to acquaint the reader with error concepts and terminology that are the basis for understanding why and how errors happen.

Terminology, especially abbreviations, can be a source of error. In the medical literature many terms are abbreviated under the assumption that they are so common their meanings are fully recognized and understood by all readers. For example, ECG is the abbreviation for electrocardiogram unless, of course, you are accustomed to EKG, which derives from the German term. It is assumed that every reader knows that “bpm” signifies “beats per minute” for heart rate. But wait a minute! Could that abbreviation also be used for breaths per minute? Or what about blood pressure monitoring? And therein lies the problem. A number of studies have clearly shown that abbreviations, although their use is well intentioned and meant to reduce verbiage, can be confusing, and out of that confusion misunderstandings and errors arise (Brunetti 2007; Kilshaw et al. 2010; Parvaiz et al. 2008; Sinha et al. 2011). This reality has led us to avoid abbreviations as much as possible throughout the book. In the few instances where we do use them, primarily in the chapters describing cases and near misses, we spell the terms out in full and include in parentheses the abbreviations that will be used in that particular case or near miss vignette. It seems like such a minor detail in the realm of error prevention, but the devil is in the details.

The second chapter presents the multiple factors that cause errors, including organizational, supervisory, environmental, personnel, and individual factors. At the organizational level the discussion focuses on features that are the hallmarks of “learning organizations” or “high reliability organizations,” organizations with a culture attuned to error prevention and a willingness and ability to learn from errors. Because individuals are at the forefront—at the sharp end—of systems where errors occur, this chapter also discusses the cognitive factors that can lead to error generation.

The third chapter focuses on strategies by which we can proactively deal with errors. To be proactive, an individual or organization has to be knowledgeable about the environment within which work is performed and errors occur. This knowledge can only come from collecting and analyzing data about patient safety incidents. To act, there have to be reporting systems in place that provide information accurately reflecting the workings of the organization, including its culture, policies, and procedures, and, of course, the people who work within it. This chapter focuses especially on voluntary reporting systems and the key features that make such systems successful. Reporting an incident is critical, but so too is the process of analysis, and this chapter presents some strategies and techniques for analyzing errors and accidents using a systems approach, including root cause analysis and Ishikawa diagrams (fishbone diagrams). It also presents a process by which accountability for an error can be determined, so as to distinguish the healthcare provider who intentionally causes harm (misconduct) from the individual who is the unfortunate victim of a faulty system.

Chapters 4 through 7 present and discuss cases and near misses that have occurred in veterinary anesthesia. Each chapter has an error theme: Chapter 4 presents cases and near miss vignettes involving technical and equipment errors; Chapter 5, medication errors; Chapter 6, clinical decision-making and diagnostic errors; and Chapter 7, communication errors. After reading these chapters some readers may object to our classification scheme. Indeed, we created the chapters and grouped the cases and near misses according to our assessment of the final act or proximate cause of each error, not its root causes. Although this is contrary to the approach we advocate throughout the book for dealing with errors, it enabled us to resolve two issues with which we had to contend while developing these chapters. Firstly, not all cases underwent a thorough analysis at the time they occurred, making it difficult to retrospectively establish with certainty the root causes of a number of the errors and near misses. Secondly, the chapter themes allow us to group cases and near misses that share common features even though they may seem dissimilar because of the contexts in which they occurred.

Some of the cases involve patients that many veterinarians will never see in practice, such as the polar bear (see Case 6.1). Superficially, such unusual cases may seem of limited value for understanding how errors occur. Although the error itself may be unique, involving an exotic species or an unfamiliar drug combination, the many factors involved in the evolution of the incident can occur anywhere and with any patient, regardless of species, anesthetics used, or procedures performed. We need to recognize the multitude of factors that predispose to making errors in any situation, and also embrace the problem-solving processes that can be applied to manage them.

A word of caution to our readers: while reading these cases a natural response is to think, “What was the anesthetist thinking?!?! It’s so obvious, why didn’t the anesthetist see the problem?” In the retelling of these cases all too often clues are given that were not apparent at the time of the error. Indeed, these cases are retold with full use of the “retrospective scope,” which, with its hindsight bias, influences how one perceives and judges the described events (see “Pattern-matching and biases” in Chapter 2, and Table 2.3). Remember, the view was not as clear to the anesthetist involved at the time of the error as it is in these pages.

The near miss vignettes represent errors that occur in veterinary anesthesia but that did not cause patient harm, only because they were caught and corrected early. These types of errors are also called “harmless hits” or “harmless incidents.” Although we can learn a great deal from adverse incidents, such as the cases described in these four chapters, they are rare and the knowledge gained is often at the expense of a patient’s well-being. Near misses, on the other hand, occur frequently and serve as indicators of problems or conditions within the system that have the potential to cause patient harm (Wu 2004).

The eighth and final chapter presents general and specific ideas and strategies for creating a patient safety organization, one in which patient safety as a cultural norm is paramount and permeates the organization. Training is an essential component of such a program. Throughout this chapter we present and discuss, in varying detail, some strategies and techniques that can be incorporated into training programs so that trainees have a proactive view of errors rather than a negative view (i.e., we all make errors, so let’s learn from them), and are better prepared to identify and neutralize errors before they cause patient harm, or to mitigate their effects once identified.

The Appendices contain supplemental material supporting various concepts discussed in the book, such as guidelines and checklists.

This book is an introduction to error in veterinary anesthesia; it is not a definitive text on the subject. We hope it contributes to changing the perception that errors and mistakes happen only to bad or incompetent anesthetists or veterinarians, and that it helps move the veterinary profession, and the various regulatory agencies that monitor the profession, to recognize and accept that errors happen despite our best intentions and efforts. We need to move beyond the “name, blame, and shame” mentality and direct our energies toward helping ourselves and others learn from our errors, fundamental steps that we can and must take if we are to reduce error and improve the safety of veterinary anesthesia. Our hope is that this book contributes to that journey.

References

  1. Adams, H. (2005) 'Where there is error, may we bring truth.' A misquote by Margaret Thatcher as she entered No. 10, Downing Street in 1979. Anaesthesia 60(3): 274–277.
  2. Allnutt, M.F. (1987) Human factors in accidents. British Journal of Anaesthesia 59(7): 856–864.
  3. Brunetti, L. (2007) Abbreviations formally linked to medication errors. Healthcare Benchmarks and Quality Improvement 14(11): 126–128.
  4. Chang, A., et al. (2005) The JCAHO patient safety event taxonomy: A standardized terminology and classification schema for near misses and adverse events. International Journal for Quality in Health Care 17(2): 95–105.
  5. Diller, T., et al. (2014) The human factors analysis classification system (HFACS) applied to health care. American Journal of Medical Quality 29(3): 181–190.
  6. Edmondson, A.C. (2004) Learning from failure in health care: Frequent opportunities, pervasive barriers. Quality & Safety in Health Care 13(Suppl. 2): ii3–9.
  7. Kilshaw, M.J., et al. (2010) The use and abuse of abbreviations in orthopaedic literature. Annals of the Royal College of Surgeons of England 92(3): 250–252.
  8. Leape, L.L. (1994) Error in medicine. Journal of the American Medical Association 272(23): 1851–1857.
  9. Leape, L.L. (2001) Foreword: Preventing medical accidents: Is “systems analysis” the answer? American Journal of Law & Medicine 27(2–3): 145–148.
  10. Leape, L.L. (2002) Reporting of adverse events. New England Journal of Medicine 347(20): 1633–1638.
  11. Parvaiz, M.A., et al. (2008) The use of abbreviations in medical records in a multidisciplinary world–an imminent disaster. Communication & Medicine 5(1): 25–33.
  12. Reason, J.T. (2000) Human error: Models and management. British Medical Journal 320(7237): 768–770.
  13. Reason, J.T. (2004) Beyond the organisational accident: The need for “error wisdom” on the frontline. Quality and Safety in Health Care 13(Suppl. 2): ii28–ii33.
  14. Russ, A.L., et al. (2013) The science of human factors: Separating fact from fiction. BMJ Quality & Safety 22(10): 802–808.
  15. Sinha, S., et al. (2011) Use of abbreviations by healthcare professionals: What is the way forward? Postgraduate Medical Journal 87(1029): 450–452.
  16. Woods, I. (2005) Making errors: Admitting them and learning from them. Anaesthesia 60(3): 215–217.
  17. Wu, A.W. (2004) Is there an obligation to disclose near-misses in medical care? In: Accountability – Patient Safety and Policy Reform (ed. V.A. Sharpe). Washington, DC: Georgetown University Press, pp. 135–142.