The Human Factors Analysis and Classification System—HFACS

Cover and Documentation
Introduction
1. Unsafe Acts
2. Preconditions for Unsafe Acts
3. Unsafe Supervision
4. Organizational Influences
Conclusion
References





Drawing upon Reason’s (1990) concept of latent and active failures, HFACS describes four levels of failure: 1) Unsafe Acts, 2) Preconditions for Unsafe Acts, 3) Unsafe Supervision, and 4) Organizational Influences. A brief description of the major components and causal categories follows, beginning with the level most closely tied to the accident, i.e. unsafe acts.

1. Unsafe Acts

The unsafe acts of aircrew can be loosely classified into two categories: errors and violations (Reason, 1990). In general, errors represent the mental or physical activities of individuals that fail to achieve their intended outcome. Not surprisingly, given that human beings by their very nature make errors, these unsafe acts dominate most accident databases. Violations, on the other hand, refer to the willful disregard for the rules and regulations that govern the safety of flight. The bane of many organizations, the prediction and prevention of these appalling and purely “preventable” unsafe acts continue to elude managers and researchers alike.

Still, distinguishing between errors and violations does not provide the level of granularity required of most accident investigations. Therefore, the categories of errors and violations were expanded here (Figure 2), as elsewhere (Reason, 1990; Rasmussen, 1982), to include three basic error types (skill-based, decision, and perceptual) and two forms of violations (routine and exceptional).


Figure 2. Categories of unsafe acts committed by aircrews.
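
For readers who find it easier to think in data structures, the unsafe-acts branch of Figure 2 can be written out as a small two-level hierarchy. The Python sketch below is purely illustrative: the variable name is invented, and the example acts are taken from Table 1 below. Nothing here is part of HFACS itself beyond the category names.

    # Illustrative only, not part of the HFACS framework: the unsafe-acts
    # branch of Figure 2 as a nested dictionary, with selected example
    # acts drawn from Table 1.
    UNSAFE_ACTS = {
        "errors": {
            "skill-based": ["breakdown in visual scan", "omitted step in procedure"],
            "decision": ["misdiagnosed emergency", "inappropriate maneuver"],
            "perceptual": ["misjudged distance/altitude/airspeed", "visual illusion"],
        },
        "violations": {
            "routine": ["habitually drove 64 mph in a 55 mph zone"],
            "exceptional": ["unauthorized low-altitude canyon running"],
        },
    }

    # Print the two categories and their subdivisions.
    for category, subtypes in UNSAFE_ACTS.items():
        print(category, "->", ", ".join(subtypes))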

Errors

Skill-based errors. Skill-based behavior within the context of aviation is best described as “stick-and-rudder” and other basic flight skills that occur without significant conscious thought. As a result, these skill-based actions are particularly vulnerable to failures of attention and/or memory. In fact, attention failures have been linked to many skill-based errors, such as the breakdown in visual scan patterns, task fixation, the inadvertent activation of controls, and the misordering of steps in a procedure, among others (Table 1). A classic example is an aircraft crew that becomes so fixated on troubleshooting a burned-out warning light that they do not notice their fatal descent into the terrain. Perhaps a bit closer to home, consider the hapless soul who locks himself out of the car or misses his exit because he was either distracted, in a hurry, or daydreaming. These are both examples of attention failures that commonly occur during highly automatized behavior. Unfortunately, while at home or driving around town these attention/memory failures may be frustrating, in the air they can become catastrophic.

TABLE 1. Selected examples of Unsafe Acts of Pilot Operators (Note: This is not a complete listing)

ERRORS

Skill-based Errors

  • Breakdown in visual scan
  • Failed to prioritize attention
  • Inadvertent use of flight controls
  • Omitted step in procedure
  • Omitted checklist item
  • Poor technique
  • Overcontrolled the aircraft

Decision Errors

  • Improper procedure
  • Misdiagnosed emergency
  • Wrong response to emergency
  • Exceeded ability
  • Inappropriate maneuver
  • Poor decision

Perceptual Errors (due to)

  • Misjudged distance/altitude/airspeed
  • Spatial disorientation
  • Visual illusion

VIOLATIONS

  • Failed to adhere to brief
  • Failed to use the radar altimeter
  • Flew an unauthorized approach
  • Violated training rules
  • Flew an overaggressive maneuver
  • Failed to properly prepare for the flight
  • Briefed unauthorized flight
  • Not current/qualified for the mission
  • Intentionally exceeded the limits of the aircraft
  • Continued low-altitude flight in VMC
  • Unauthorized low-altitude canyon running

In contrast to attention failures, memory failures often appear as omitted items in a checklist, place losing, or forgotten intentions. For example, most of us have experienced going to the refrigerator only to forget what we went for. Likewise, it is not difficult to imagine that when under stress during inflight emergencies, critical steps in emergency procedures can be missed. However, even when not particularly stressed, individuals have forgotten to set the flaps on approach or lower the landing gear – at a minimum, an embarrassing gaffe.

The third, and final, type of skill-based errors identified in many accident investigations involves technique errors. Regardless of one’s training, experience, and educational background, the manner in which one carries out a specific sequence of events may vary greatly. That is, two pilots with identical training, flight grades, and experience may differ significantly in the manner in which they maneuver their aircraft. While one pilot may fly smoothly with the grace of a soaring eagle, others may fly with the darting, rough transitions of a sparrow. Nevertheless, while both may be safe and equally adept at flying, the techniques they employ could set them up for specific failure modes. In fact, such techniques are as much a factor of innate ability and aptitude as they are an overt expression of one’s own personality, making efforts at the prevention and mitigation of technique errors difficult, at best.

Decision errors. The second error form, decision errors, represents intentional behavior that proceeds as intended, yet the plan proves inadequate or inappropriate for the situation. Often referred to as “honest mistakes,” these unsafe acts represent the actions or inactions of individuals whose “hearts are in the right place,” but they either did not have the appropriate knowledge or just simply chose poorly.

Perhaps the most heavily investigated of all error forms, decision errors can be grouped into three general categories: procedural errors, poor choices, and problem-solving errors (Table 1). Procedural decision errors (Orasanu, 1993), or rule-based mistakes as described by Rasmussen (1982), occur during highly structured tasks of the sort: if X, then do Y. Aviation, particularly within the military and commercial sectors, is by its very nature highly structured, and consequently, much of pilot decision making is procedural. There are very explicit procedures to be performed at virtually all phases of flight. Still, errors can, and often do, occur when a situation is either not recognized or misdiagnosed, and the wrong procedure is applied. This is particularly true when pilots are placed in highly time-critical emergencies like an engine malfunction on takeoff.
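
As a loose illustration of that rule-based structure, consider the hypothetical Python sketch below. The situations and responses are invented for this example and are not drawn from any actual flight manual. Note that the error described above lives outside the table: the lookup itself works perfectly, but a misdiagnosed situation feeds it the wrong X, so the wrong Y comes back.

    # Hypothetical sketch: procedural ("if X, then do Y") decision making
    # as a lookup table. The situations and responses are invented.
    PROCEDURES = {
        "engine fire on takeoff": "abort takeoff, run the engine-fire checklist",
        "compressor stall": "reduce throttle, monitor engine instruments",
    }

    def respond(diagnosed_situation: str) -> str:
        """Apply the trained rule for whatever situation was diagnosed."""
        return PROCEDURES[diagnosed_situation]

    # A procedural decision error: the actual problem is a compressor
    # stall, but the crew diagnoses an engine fire. The rule is applied
    # faithfully, only to the wrong situation.
    print(respond("engine fire on takeoff"))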

However, even in aviation, not all situations have corresponding procedures to deal with them. Therefore, many situations require a choice to be made among multiple response options. Consider the pilot flying home after a long week away from the family who unexpectedly confronts a line of thunderstorms directly in his path. He can choose to fly around the weather, divert to another field until the weather passes, or penetrate the weather hoping to quickly transition through it. Confronted with situations such as this, choice decision errors (Orasanu, 1993), or knowledge-based mistakes as they are otherwise known (Rasmussen, 1986), may occur. This is particularly true when there is insufficient experience or time, or when other outside pressures preclude correct decisions. Put simply, sometimes we choose well, and sometimes we don’t.

Finally, there are occasions when a problem is not well understood, and formal procedures and response options are not available. It is during these ill-defined situations that the invention of a novel solution is required. In a sense, individuals find themselves where no one has been before, and in many ways, must literally fly by the seats of their pants. Individuals placed in this situation must resort to slow and effortful reasoning processes where time is a luxury rarely afforded. Not surprisingly, while this type of decision making is less frequent than other forms, the relative proportion of problem-solving errors committed is markedly higher.

Perceptual errors. Not unexpectedly, when one’s perception of the world differs from reality, errors can, and often do, occur. Typically, perceptual errors occur when sensory input is degraded or “unusual,” as is the case with visual illusions and spatial disorientation or when aircrew simply misjudge the aircraft’s altitude, attitude, or airspeed (Table 1). Visual illusions, for example, occur when the brain tries to “fill in the gaps” with what it feels belongs in a visually impoverished environment, like that seen at night or when flying in adverse weather. Likewise, spatial disorientation occurs when the vestibular system cannot resolve one’s orientation in space and therefore makes a “best guess” — typically when visual (horizon) cues are absent at night or when flying in adverse weather. In either event, the unsuspecting individual often is left to make a decision that is based on faulty information and the potential for committing an error is elevated.

It is important to note, however, that it is not the illusion or disorientation that is classified as a perceptual error. Rather, it is the pilot’s erroneous response to the illusion or disorientation. For example, many unsuspecting pilots have experienced “black-hole” approaches, only to fly a perfectly good aircraft into the terrain or water. This continues to occur, even though it is well known that flying at night over dark, featureless terrain (e.g., a lake or field devoid of trees), will produce the illusion that the aircraft is actually higher than it is. As a result, pilots are taught to rely on their primary instruments, rather than the outside world, particularly during the approach phase of flight. Even so, some pilots fail to monitor their instruments when flying at night. Tragically, these aircrew and others who have been fooled by illusions and other disorientating flight regimes may end up involved in a fatal aircraft accident.

Violations

By definition, errors occur within the rules and regulations espoused by an organization and typically dominate most accident databases. In contrast, violations represent a willful disregard for the rules and regulations that govern safe flight and, fortunately, occur much less frequently, since they often involve fatalities (Shappell et al., 1999b).

While there are many ways to distinguish between types of violations, two distinct forms have been identified, based on their etiology, that will help the safety professional when identifying accident causal factors. The first, routine violations, tend to be habitual by nature and are often tolerated by governing authority (Reason, 1990). Consider, for example, the individual who consistently drives 5-10 mph faster than allowed by law, or someone who routinely flies in marginal weather when authorized for visual meteorological conditions only. While both are certainly against the governing regulations, many others do the same thing. Furthermore, individuals who drive 64 mph in a 55 mph zone almost always drive 64 in a 55 mph zone. That is, they “routinely” violate the speed limit. The same can typically be said of the pilot who routinely flies into marginal weather.

What makes matters worse, these violations (commonly referred to as “bending” the rules) are often tolerated and, in effect, sanctioned by supervisory authority (i.e., you’re not likely to get a traffic citation until you exceed the posted speed limit by more than 10 mph). If, however, the local authorities started handing out traffic citations for exceeding the speed limit on the highway by 9 mph or less (as is often done on military installations), then it is less likely that individuals would violate the rules. Therefore, by definition, if a routine violation is identified, one must look further up the supervisory chain to identify those individuals in authority who are not enforcing the rules.

On the other hand, unlike routine violations, exceptional violations appear as isolated departures from authority, not necessarily indicative of an individual’s typical behavior pattern nor condoned by management (Reason, 1990). For example, an isolated instance of driving 105 mph in a 55 mph zone is considered an exceptional violation. Likewise, flying under a bridge or engaging in other prohibited maneuvers, like low-level canyon running, would constitute an exceptional violation. However, it is important to note that, while most exceptional violations are appalling, they are not considered “exceptional” because of their extreme nature. Rather, they are considered exceptional because they are neither typical of the individual nor condoned by authority. Still, what makes exceptional violations particularly difficult for any organization to deal with is that they are not indicative of an individual’s behavioral repertoire and, as such, are particularly difficult to predict. In fact, when individuals are confronted with evidence of their dreadful behavior and asked to explain it, they are often left with little explanation. Indeed, those individuals who survived such excursions from the norm clearly knew that, if caught, dire consequences would follow. Still, defying all logic, many otherwise model citizens have been down this potentially tragic road.
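
Taken together, the two forms reduce to a simple decision rule over the two properties used to define them above: is the act typical of the individual, and does authority tolerate it? The Python sketch below expresses that rule; the function name and the returned investigative hints are our own illustration, not an official HFACS procedure.

    # Illustrative decision rule, not an official HFACS procedure:
    # separating the two forms of violation by their defining properties.
    def classify_violation(typical_of_individual: bool,
                           tolerated_by_authority: bool) -> str:
        if typical_of_individual and tolerated_by_authority:
            # Routine: per the text, the investigation should also look
            # up the supervisory chain for lax enforcement of the rules.
            return "routine violation: examine the supervisory chain as well"
        return "exceptional violation: atypical of the individual, not condoned"

    print(classify_violation(True, True))    # e.g., habitually driving 64 in a 55 zone
    print(classify_violation(False, False))  # e.g., driving 105 in a 55 zone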

<<< continue reading HFACS, Preconditions for Unsafe Acts >>>

 


©2005-2006 Colorado Firecamp, Inc.