To Err is Human

To Err is Human: Building a Safer Health System is a classic report in the field of patient safety, published in 2000 by the Committee on Quality of Health Care in America (Institute of Medicine). My notes from the book are presented below; they focus mostly on the first few chapters, which are about errors and adverse events, as I'm currently looking for the best way to measure these in my evaluation of a healthcare IT project.

Definitions

Error

    • safety = “freedom from accidental injury” (p. 4); “Safety is a characteristic of systems and not of their components. Safety is an emergent property of systems” (p. 157)
    • accident = “an event that involves damage to a defined system that disrupts the ongoing or future output of that system” (p. 52)
    • error = “a failure of a planned action to be completed as intended or the use of a wrong plan to achieve an aim” (p. 4) – not all errors cause harm
      • errors depend on 2 kinds of failures:
        • error of execution: “correct action does not proceed as intended” (p. 4) – intended outcome may or may not occur
        • error of planning: “original intended action is not correct” (p. 4) – desired outcome cannot be achieved
      • errors can happen in any stage of care (e.g., prevention, diagnosis, treatment)
    • slip – “occurs when the action conducted is not what was intended (an error of execution)” (p. 54) and is observable (e.g., turning the wrong knob on a piece of equipment)
    • lapse – “occurs when the action conducted is not what was intended (an error of execution)” (p. 54) and is not observable (e.g., not being able to recall something from memory)
    • mistake – “the action proceeds as planned but fails to achieve its intended outcome because the planned action was wrong. The situation might have been assessed incorrectly, and/or there could have been a lack of knowledge of the situation […] the original intention is inadequate” (p. 54) (error of planning) (see the sketch after this list)
  • preventable adverse events = errors that cause injury
  • adverse event (AE) = “an injury resulting from a medical intervention”, i.e., “not due to the underlying condition of the patient”
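Since I'll need to operationalize these distinctions in my evaluation, here's a minimal sketch (Python) of a decision rule for the slip / lapse / mistake distinction above. The function and parameter names are my own, not terminology from the report.

    def classify_error(planned_action_correct: bool, executed_as_intended: bool, observable: bool) -> str:
        # Rough decision rule based on the definitions above (my formulation, not the book's)
        if not planned_action_correct:
            return "mistake (error of planning)"
        if not executed_as_intended:
            # errors of execution are slips if observable, lapses if not
            return "slip (error of execution)" if observable else "lapse (error of execution)"
        return "no error"

    print(classify_error(True, False, True))    # turning the wrong knob -> slip
    print(classify_error(True, False, False))   # failing to recall something -> lapse
    print(classify_error(False, True, True))    # wrong plan carried out correctly -> mistake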

 

  • not all AEs are preventable
    • if a surgical patient dies from post-op pneumonia, it’s an AE
    • if that pneumonia was due to poor handwashing or poor instrument cleaning, it’s preventable (i.e., due to an error of execution)
    • if the pneumonia developed but there was no error (e.g., it was just a poor recovery), it's not preventable (see the sketch after this list)
  • analysis of errors allows us to see if there are opportunities to make our care delivery system better
    • if we blame people for errors:
      • we might be able to prevent that one person from making a similar error in the future
      • but we also make it less likely that people will report errors (and we can’t learn from them if we don’t know about them)
    • if we look at what in the system made the error possible, we might be able to prevent everyone from making a similar error in the future
  • to improve patient safety, we need to be able to identify errors
  • mandatory reporting systems are used to hold healthcare organizations accountable – usually focused on serious harm/death and are punitive
  • voluntary, confidential reporting systems focus on errors more broadly and are intended to focus on learning about where there are weaknesses in the system so that we can fix them before an error leads to serious harm/death
  • with either type of reporting system, there need to be resources dedicated to following up on the reports – the reporting systems are only useful if we do something with that information
  • since the two types of reporting systems serve different purposes, they should be done separately
  • data collected for the purposes of learning (those not related to serious harm/death) should be protected because fear of legal discovery will discourage people from reporting voluntarily and thus will hamper the efforts to improve the system
  • medication errors tend to be studied because:
    • they are a common type of error
    • they result in significant healthcare costs
    • lots of people are prescribed drugs, so you can get a good sample size
    • the drug prescribing process requires good documentation of medical decisions (much of which is in databases we can search)
    • deaths due to med errors are recorded on death certificates [Beth’s note: not sure if this is true in Canada too. Need to look into this.]
  • other types of errors may also offer opportunity to improve, but aren’t as studied [Beth’s note: and even for med errors, we don’t have a lot of good numbers]
  • focus tends to be more on hospitalized patients than on other areas of the healthcare system
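Coming back to the adverse event vs. preventable adverse event distinction above (the post-op pneumonia example), here's a minimal sketch (Python) of how those definitions might be applied when classifying a reviewed case. The field and function names are my own invention, not a scheme from the report.

    from dataclasses import dataclass

    @dataclass
    class ReviewedCase:
        # Hypothetical fields for a chart-reviewed case (my naming)
        injured: bool               # did the patient suffer an injury?
        due_to_medical_care: bool   # injury caused by the intervention, not the underlying condition
        error_involved: bool        # was an error of planning or execution identified?

    def classify(case: ReviewedCase) -> str:
        # AE = injury resulting from a medical intervention;
        # preventable AE = an AE attributable to an error
        if not (case.injured and case.due_to_medical_care):
            return "no adverse event"
        if case.error_involved:
            return "preventable adverse event"
        return "adverse event (not preventable)"

    # The post-op pneumonia examples from above:
    print(classify(ReviewedCase(True, True, True)))    # pneumonia traced to poor handwashing -> preventable adverse event
    print(classify(ReviewedCase(True, True, False)))   # pneumonia with no identifiable error -> adverse event (not preventable)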

Types of Errors

  • Leape et al. (1993) classified types of errors:
    • Diagnostic
      • Error or delay in diagnosis
      • Failure to employ indicated tests
      • Use of outmoded tests or therapy
      • Failure to act on results of monitoring or testing
    • Treatment
      • Error in the performance of an operation, procedure, or test
      • Error in administering the treatment
      • Error in the dose or method of using a drug
      • Avoidable delay in treatment or in responding to an abnormal test
      • Inappropriate (not indicated) care
    • Preventive
      • Failure to provide prophylactic treatment
      • Inadequate monitoring or follow-up of treatment
    • Other
      • Failure of communication
      • Equipment failure
      • Other system failure
  • Medication Use Processes (there are many processes during which an error – or errors – can be made; see the sketch after this list):
    • Prescribing
      • Assessing the need for and selecting the correct drug
      • Individualizing the therapeutic regimen
      • Designating the desired therapeutic response
    • Dispensing
      • Reviewing the order
      • Processing the order
      • Compounding and preparing the drug
      • Dispensing the drug in a timely manner
    • Administering
      • Administering the right medication to the right patient
      • Administering medication when indicated
      • Informing the patient about the medication
      • Including the patient in administration
    • Monitoring
      • Monitoring and documenting patient's response
      • Identifying and reporting adverse drug events
      • Reevaluating drug selection, regimen, frequency and duration
    • Systems and Management Control
      • Collaborating and communicating amongst caregivers
      • Reviewing and managing patient’s complete therapeutic drug regimen
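If I end up coding reported medication errors for my evaluation, the stages above suggest a simple tagging scheme. A rough sketch, with enum names of my own choosing (not terminology from the report):

    from enum import Enum

    class MedUseStage(Enum):
        # Stages of the medication use process (labels adapted from the list above)
        PRESCRIBING = "prescribing"
        DISPENSING = "dispensing"
        ADMINISTERING = "administering"
        MONITORING = "monitoring"
        SYSTEMS_MANAGEMENT = "systems and management control"

    # Example: tag a (made-up) reported error with the stage where it occurred
    report = {"description": "wrong dose calculated for patient's weight", "stage": MedUseStage.PRESCRIBING}
    print(report["stage"].value)  # -> prescribing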

 

  • some important differences between accidents in healthcare vs. other industries:
    • in other industries, accidents usually affect worker & company directly (“the pilot is always the first at the scene of an airline accident”), but in healthcare, the damage happens to a third party: the patient
    • in other industries (e.g., airline), large groups can be affected, but in healthcare, it’s usually only one patient being affected at a time (so accidents are less likely to be reported in the media)
  • human error is a big contributor to accidents, but:
      • saying an accident is due to human error ≠ blaming them
    • when equipment fails, human error can exacerbate the accident
  • “Active errors occur at the level of the frontline operator, and their effects are felt almost immediately” (p. 55) (a.k.a. the sharp end)
  • “Latent errors tend to be removed from the direct control of the operator and include things such as poor design, incorrect installation, faulty maintenance, bad management decisions, and poorly structured organizations.” (p. 55) (a.k.a. the blunt end)
  • e.g., “The active error is that the pilot crashed the plane. The latent error is that a previously undiscovered design malfunction caused the plane to roll unexpectedly in a way the pilot could not control and the plane crashed.”
  • latent errors = bigger threat to safety in a complex system because:
    • often unrecognized
    • can lead to many types of errors
  • we often focus on active errors (e.g., fire the person who made the error; retrain the person who made the error), but focusing on fixing the latent errors would have more of an impact on increasing safety
  • “High reliability theory believes that accidents can be prevented through good organizational design and management. Characteristics of highly reliable industries include an organizational commitment to safety, high levels of redundancy in personnel and safety measures, and a strong organizational culture for continuous learning and willingness to change.” (p. 57)
  • systems are more prone to accidents if they are:
    • complex – since 1 component can interact with multiple other components, if that 1 component fails, all dependent functions also fail; as well, complex systems have multiple feedback loops, so it’s often difficult to predict what’s going to happen if 1 component fails
    • tightly coupled – coupling = “no slack or buffer between two items” (p. 59) – because this usually means there’s only one way to reach a goal and sequences are fixed, can’t “tolerate processing delays, […] reorder[ing of] the sequence of production, […or…] employ alternative methods or resources” (p. 59)
  • healthcare is a complex, tightly coupled system (and thus is prone to accidents)
  • “Complex, tightly coupled systems have to be made more reliable. One of the advantages of having systems is that it is possible to build in more defenses against failure. Systems that are more complex, tightly coupled, and are more prone to accidents can reduce the likelihood of accidents by simplifying and standardizing processes, building in redundancy, developing backup systems,” etc. (p. 60)

Human Factors

  • Human factors is defined as “the study of the interrelationships between humans, the tools they use, and the environment in which they live and work.” (p. 63)
  • two types of human factors analysis:
    • “Critical incident analysis examines a significant or pivotal occurrence to understand where the system broke down, why the incident occurred, and the circumstances surrounding the incident. Analyzing critical incidents, whether or not the event actually leads to a bad outcome, provides an understanding of the conditions that produced an actual error or the risk of error and contributing factors.” (p. 63-4)
    • “Naturalistic decision making […] examines the way people make decisions in their natural work settings. […] the researcher goes out with workers in various fields, such as firefighters or nurses, observes them in practice, and then walks them through to reconstruct various incidents. The analysis uncovers the factors weighed and the processes used in making decisions when faced with ambiguous information under time pressure” (p. 64)

Error Reporting Systems

  • mandatory reporting systems = for “errors that result in serious patient harm or death (i.e., preventable adverse events)” (p. 87)
  • voluntary reporting systems = for errors that cause no harm/minor harm and “near misses”
  • reporting systems must dedicate sufficient resources to follow up on reports of errors (because the point of reporting systems is to learn from the errors and improve the system to make it safer)
  • reporting systems are known to greatly underreport errors (because most people don’t report errors), so they are not intended to be a “count” of errors
  • voluntary reporting systems need to be non-punitive/protected from legal discovery (as no one will report errors voluntarily if they fear punishment or lawsuits) – a rough sketch of what a voluntary report record might capture follows
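Putting those points together, a learning-oriented voluntary report record might look something like the sketch below (Python). The field names and severity levels are my own guesses at what would be useful, not a schema from the book.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import Optional

    class Severity(Enum):
        NEAR_MISS = "near miss"          # error caught before it reached the patient
        NO_HARM = "no harm"
        MINOR_HARM = "minor harm"
        SERIOUS_HARM = "serious harm"    # would also fall under mandatory reporting
        DEATH = "death"

    @dataclass
    class VoluntaryReport:
        description: str
        severity: Severity
        contributing_factors: list = field(default_factory=list)   # latent/system factors identified
        follow_up_actions: list = field(default_factory=list)      # what was done with the report (follow-up matters)
        reporter_id: Optional[str] = None                          # optional, to keep reporting confidential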

“Unsafe acts are like mosquitoes. You can try to swat them one at a time, but there will always be others to take their place. The only effective remedy is to drain the swamps in which they breed. In the case of errors and violations, the “swamps” are equipment designs that promote operator error, bad communications, high workloads, budgetary and commercial pressures, procedures that necessitate their violation in order to get the job done, inadequate organization, missing barriers, and safeguards . . . the list is potentially long but all of these latent factors are, in theory, detectable and correctable before a mishap occurs.” (cited on page 155)

Image Credits
  • multiple errors image – posted on Flickr with a Creative Commons licence
  • safety – posted on Flickr with a Creative Commons license