To build resilience into our systems and success into our safety toolkit, a “student of fire” (and our firefighting systems) must embrace error as an opportunity to learn.
Accident investigations have changed as new research and understanding of complex systems have emerged. The US Forest Service has replaced accident investigation with the Learning Review process, which is designed to understand the relationship between actions and conditions. This transformative change was the result of three major observations.
- Accident prevention results from systemic change coupled with learning, and learning was undervalued in traditional processes.
- Learning continues after the reports are released, and this learning has multiple audiences, which must be recognized and addressed to maximize the opportunity to learn.
- Traditional investigations have the potential to construct causal relationships that result in blame, which can undermine efforts to learn from events.
The result of this transformation: if we recognize error, we can use that information to develop learning products and processes designed to support safer operations for a wide variety of audiences.
Here, one of the leaders in this transformation, Ivan Pupulidy, discusses the bedrock concepts and application of “sensemaking,” error recognition, and learning from errors.
Investigations and reviews of accidents and incidents struggle with what to do with error. Some investigations use error as the principal cause of accidents, while other styles avoid the mention of error altogether. In either case, the value of error will not be fully realized as a method to learn from events. It is important to begin to understand the value of error; however, it is equally important to recognize that error can be a judgment based on individual values and perceptions. To leverage error into learning, we must explore error types and the rich context that surrounded the errors.
In many cases, “error” is a perception by the reviewer, investigator, or the individuals involved. Error can also be a judgment based on outcome and made after the fact. There may also be times when error is normative (based on deviation from accepted practice or direction). Recognizing the type of error is important; however, the key to learning from error is using error as the starting point, not the ending point, of any study being performed (Dekker, S., Field Guide to Understanding Human Error). The control of biases is important, and it can be very valuable to avoid judgment. One effective way to do this is to handle error as a set of actions and decisions that are influenced by the conditions in which they occurred, and thus to focus the study on the conditions.
There is a difference between blame and error, which can be associated with the perception of taking unnecessary risk. Of course, these are all grey areas; after an incident, their boundaries can often seem easy to draw. The job of the review is to ensure that the degree of grey is represented so that people outside the incident can understand how it appeared to those involved.
If we never acknowledge the challenges faced by people performing the mission, there is very little learning that can result for anyone other than, perhaps, those with the most intimate information. The question becomes, “Will recognition of the perceived error help people to learn from the event?” Answering this question should establish tension through honest inquiry, which could not only help us learn, it may also help us to heal. We should consider this as an opportunity to engage in a helping relationship: helping others to avoid the near miss by understanding what occurred and why. This may require facing what the USFS has come to call “hard truths.”
Risks and mistakes may appear to have lesser value when associated with positive outcomes, and this may result in the loss of learning opportunities. For example, we may judge an outcome as positive, and in this context, we may unintentionally reduce the importance of the mistakes we made or the risks we took.
This may be more common than we think, and may contribute to an inflated belief in the safety of the system. Think about the old Roadrunner cartoon with “Wile E. Coyote”; the Coyote would run off a cliff and keep right on running and it wasn’t until he looked down that he realized his error and began to fall. We are often like that Coyote, in that we don’t recognize the “error” until we actually start to see the peril. The challenge inherent in preventing accidents is to avoid doing the same things again and again; this may require that we focus on what got us to the dangerous place. To do this we may need to develop a strength that allows us as individuals, and as an organization, to be vulnerable.
The question now shifts to, “How could we learn and what practices do we need to consider to make learning possible?”
From judgment to honest inquiry
No doubt learning occurs in many ways and at many levels. The only seemingly universal starting point, or requirement, is that we are in the mode of honest inquiry instead of judgment. This means that we have to pose a question and be ready for the answer, even if we don’t like what might be revealed. Sometimes this means that our stories have to establish enough tension to allow the listeners to form their own questions. Tension is often developed by not constructing ‘the’ answer, and tension alone can demand the creation of questions (Gergen, K., An Invitation to Social Construction). These questions can be introspective and shared in groups and should inspire sensemaking: “Would I have done the same thing?” or, “How could it have made sense for someone to have done what they did?” (Dekker, S., Field Guide; Weick, K., Sensemaking in Organizations). The product(s) of the Learning Review¹ should establish this tension.
The creation of tension in a narrative may be difficult to separate from the cultural predisposition to default to judgment. Judgment of actions and decisions as errors is a dominant social construct, which can lead to the criminalization of error. This is inconsistent with a nation that espouses the presumption of innocence until there is proof of guilt. Perhaps we must learn that there is no shame in error. We all make mistakes, and the expectation of perfect performance is an illusion. When errors are considered to be shameful acts, the result can be the suppression or omission of actions and decisions from the study or review being conducted. This decision is as judgmental as any traditional cause-effect based report that blames humans for the accident.
To learn, we must embrace error as an opportunity! Real strength may be found in our ability to be vulnerable. There are many examples of people who took chances and failed, pointed to clear errors they made, and used what they learned to transform. Many of these people are the heroes of legend – our heroes. Look at how Joseph Campbell’s “Hero’s Journey” depicts the way that heroes have been created in myth for generations.
The hero has to make a terrible mistake; this happens in the “abyss, death and rebirth” stage. The hero reflects on the mistake/error and emerges transformed because of what he/she has learned. It is in the humility and recognition of error that the hero is created.
A narrative of error
One excellent example of this took place in a recent fire refresher training session where a senior firefighter told a story about a time that he made a mistake as a Division Group Supervisor. He prefaced the story by saying that it had happened years before and that he had never talked about it because he was ashamed. He “spiked out” his division of four crews on a ridge above a smoldering fire. “Our mission was to construct indirect line up on a major ridge in the wilderness area, prep it for burnout, and eventually burn out to establish a blackline along this indirect fireline. The main fire was down in a major drainage bottom below us in what seemed like a long distance away” (Firefighter’s Account). They spent several days and nights doing lots of work with seemingly little effect, all the while the fire was smoldering below, held in check by an inversion. They attempted to create black line between them and the fire but could not get the fuel to carry fire. One night the inversion broke and the fire made a run, placing all the crews in jeopardy. “The main fire had finally come out of the canyon during the night, made a major uphill run towards our location, and spotted over the main ridge starting a spot fire below us. We now had fire on both sides of our ridge-top location – the main fire on one side and the spot fire on the other side, and us in the middle” (Firefighter’s Account).
By luck, one firefighter was awake and noticed what was happening in time to warn the rest. During his presentation he acknowledged:
Not long after I had been abruptly awoken from my sleep and we were responding to the spot fire, the thought struck me of the mistake I made. I started beating myself up and asking myself: “how could you have missed that?? How could you have been so stupid!?” It became clear to me then that I should have kept someone awake all night to act as a lookout in case something like this happened. I was very embarrassed for not having done that. I know that Fire Order #3 states: “Base all actions on current and expected fire behavior,” and not: “Base all actions on previous fire behavior.”
I didn’t see the potential for changing conditions ahead of time to anticipate the risk of what happened that night. By not doing so, I put myself and 60 other firefighters in danger. I didn’t believe the fire would make an uphill run that night. I missed it. Fortunately, not one got hurt and we caught the spot fire. I made a mistake and got away with it.
When he was done, all 300 participants rose to their feet and gave him a standing ovation. In his story, they could see themselves because he provided the context of his actions. Our challenge is that we have developed an organizational response that places shame on actions and decisions, often separating them from the context in which they occurred. Given the context, our people often react the way this group did, with understanding and empathy. In the weeks following his presentation, he reflected and sent me this note:
I had pretty much forgotten about this incident, and my mistake, over the 13 years since it happened, since fortunately, no one got hurt and we recovered from it. However, preparing to tell it for these sessions brought back all the memories and feelings of guilt over this that I had gotten over and moved on from. I started judging myself again after dredging up a past mistake that I had forgotten about. It reminded me that we are often harder on ourselves than anyone else would be, even after 13 years.
After deciding on which story to tell, I then began to worry about two aspects related to my sharing the story: 1. People would think I was stupid for the mistake I had made; 2. People would think that it really wasn’t that big a deal and I was making much ado about nothing. Since I have worked in the Pacific Northwest for over 30 years, and the three sessions where I would be sharing this story were all being held in the PNW, I knew there would be a number of folks in the audience who I knew. I felt a little more pressure because of that since you don’t want to embarrass yourself in front of people you know and will see again. I worried about what effect it would have on my reputation in the Region after the three sessions were over.
We have created a society that avoids, or even suppresses, information as a result of the social construction of shame and blame. Currently it takes heroic resolve to tell stories of error, mainly because we have socially created this judgment of actions. In this culture, the people who have the strength to tell their stories should be regarded as heroes. The responsibility of the organization is to create an environment and culture where sharing this information can happen easily.
The individual, the organization, and error
Learning about “error” affords us an opportunity to understand more than just what to do and what not to do. It encourages us to begin to learn about the system’s ability to do harm. Think about how expertise affects error. People can, with expertise, better predict what the system may deliver with regard to danger and then take steps to avoid the danger (Adams, J., Risk). This may appear as a lower error rate; what it may actually be, though, is a lower exposure rate. In a nontechnical context, experience may not mean fewer errors; it may mean an increased ability to recognize the potential of the system to do harm and the ability of our people to naturally build margin to minimize or reduce exposure (Adams, J., Risk). This may be information that is useful to a wide range of operators.
Both the individual and the organization share the responsibility for prevention. From an organizational perspective, error can be used to understand the goal conflicts, ambiguities, and tensions extant in the system. Leadership can use this information to better prepare the worker for the environment and, perhaps more importantly, this information can be used to reduce the ambiguities and goal conflicts in the system. This recognizes error as an artifact of the system, not a failure of the individual. This is less about error control and more about managing margin in the system and creating an environment where workers can be successful by unraveling the goal conflicts and tensions that are inherent in every system.
Normative errors may well be indicators of a system that is clogged with rules. Rules are created for a variety of reasons, from protection against litigation to knee-jerk reactions to specific incidents whose recurrence is likely nil. Fixed rules can only take us so far in complex environments. These environments characteristically deliver the unexpected, and fixed rules only apply to that which can be predicted. So when we recognize that our system-wide base rates for failure have plateaued, what we may be seeing is that the current methods of prevention have reached their limit of effectiveness and something else, some new approach or analysis, may be needed.
Cognitive “slips, trips, and lapses” are examples of a type of error that is often confused with complacency. These are errors like leaving your glasses in a restaurant, forgetting to pack your toothbrush for a trip, or locking your keys in the car. These errors are completely unintentional, and they do not respond well to regulation, punitive actions, or training. For example, writing a rule that says “do not forget your glasses,” punishing the driver who locked his or her keys in the car, or sending the traveler to a mandatory safety class to avoid forgetting the toothbrush will not make the system safer. This type of error is not avoidable; instead, this may be an opportunity to develop error-tolerant systems.
Understanding error allows us to place the error in context. This allows us to understand how people are dealing with errors that they perceive, by asking, “What types of innovations or work improvements are created when errors are detected?” This approach moves us to examine how the system fosters error as well as how workers create resilience in the work environment when errors occur. This can help us to understand how our personnel are not simply managing risk but, perhaps more importantly, how they are recognizing and dealing with risk. We may then be able to leverage this knowledge into work improvements. The greater benefit may come from the recognition and potential to develop skills related to critical thinking.
Following an incident, accident, or normal work, it is our responsibility as a community to enter into a ‘helping relationship’ with all those involved. Stories must be shared in a way that encourages our workforce to develop empathy and the inner strength associated with embracing error. This means we have to de-stigmatize error and to recognize error as a consequence, not a cause, of events.
Organizations and individuals must have the discipline needed to avoid labeling the error as “wrong,” that is simple judgment, which will only reduce reporting. Instead we must foster an environment where we can openly discuss what made the error possible. When we begin to appreciate that any “error” made by an individual is actually an error made by everyone, we will have created a place to learn, heal and recover. Errors and adaptations are not the enemy of safety; rather, they are natural outgrowths of complex environments. They are not bad – they are opportunities to learn.
IVAN PUPULIDY is the Director of the Office of Learning for the US Forest Service. He currently resides in Boise, Idaho. His career includes study and research of human factors and systems safety, and service as a Naval Aviator in the Coast Guard, a USFS plane pilot, a regional aviation safety manager, and a chief accident investigator.
- The Learning Review is the process that replaced the Serious Accident Investigation Guide for the USFS. It is focused on the development of learning products for the organizational leadership and for field personnel and recognizes that there may be very different learning needs and ways to present to different audiences. ↩