HazMat Decision Making

Decision-Making: Dots, Connections, and Certainty

“You can’t connect the dots looking forward; you can only connect them looking backwards.” Steve Jobs

Mr. Jobs is referring to dots as bits of information and the connections as inferences made by linking the related bits together, the latter of which, he suggests, occurs only in hindsight. Think for a moment about how this quote relates to HazMat response…or any emergency response, for that matter. Do you agree? Connecting these dots is key, allowing us to make educated incident scene decisions. Utilizing experience, education, and training, we develop an incident scene instinct, often referred to as “street smarts.” As we will discuss, the collection and connection of dots starts immediately (even before an incident) and continues through demobilization. There is a delicate balance that must be struck between collecting data and making real-time decisions. As we move closer to an incident, we accumulate a lot of data, which leads to connections being made. We must be careful of a couple of things, however: assumptions and too much data. Our assumptions (what we think we know) can lead us catastrophically astray by biasing our decision-making capabilities. Additionally, collecting too much data can paralyze our capability to make decisions quickly. Understanding these concepts can, with awareness and practice, make us all much better size-up practitioners.

DOTS

Think of dots as pieces of information. An LEL reading, specific gravity, and time of day are all valuable pieces of information – these are the dots. This concept can be used to describe the investigation or learning phase of any problem. It is imperative to know the pieces of information in order to make a decision, right? Envision each piece of information as a dot on a piece of paper. Here, dots are synonymous with data and information, and the purpose is to allow visualization of a problem. Remember the “connect the dots” games often found in children’s books or on restaurant placemats? The image was a meaningless spattering of numbered dots on a page. It was amorphous. Yet when dot 1 is connected to dot 2, which is connected to dot 3, and so on, an image begins to take shape. Incident data collection, for the most part, follows the same premise. As more and more pieces of information are collected, a picture begins to evolve.

dots = information, data

When dealing with an emergency incident, information is coming at us from EVERY angle, sometimes even from where we don’t want it: our own bias. This is the point where our training, education, and experience all enter into the picture, above and beyond what we are gathering from the incident itself. During the process of collecting the dots we are constantly evaluating information and making connections between various pieces. Some dots and connections are discounted as potentially irrelevant, and some are raised to a heightened level of importance. This process is dynamic and lays the foundation upon which the incident response is built, managed, and ultimately mitigated. Think of this as the “size-up of size-up.”

So, you’ve been dispatched to an emergency. You have a dot already: it’s an emergency. Now let’s say you’re told it’s a hazardous materials incident. You have another. It’s the middle of the day in August on the mid-Atlantic coast, sunny, hot, humid, and windy. There are six more: time, location, and four weather variables. Add to this that you are on a highway, in a populated area, with a 275-gallon tote bearing a red and yellow placard that is leaking a liquid, and there are five other totes on this truck. Nine more: highway, populated, 275 gallons, tote, leaking, liquid, placard, five other totes, on a truck. That is a total of seventeen pieces of information and you haven’t even arrived on the scene yet. Like me, you’ve probably already started making some assumptions about this incident.
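Purely as an illustration of how quickly these dots stack up, the pre-arrival information above could be jotted down as simple structured data; this is a minimal sketch, and the groupings and names are my own, not any reporting standard:

    # Illustrative only: pre-arrival "dots" captured as plain data.
    # Groupings and field names are hypothetical, not a standard reporting format.
    pre_arrival_dots = {
        "dispatch": ["emergency", "hazardous materials incident"],
        "time_and_place": ["midday in August", "mid-Atlantic coast"],
        "weather": ["sunny", "hot", "humid", "windy"],
        "scene": [
            "highway", "populated area", "275 gallons", "tote", "leaking",
            "liquid", "red and yellow placard", "five other totes", "on a truck",
        ],
    }

    total = sum(len(dots) for dots in pre_arrival_dots.values())
    print(f"Dots collected before arrival: {total}")  # 17, and we have not arrived yet

Nothing here is analysis; it is simply the raw dot count before a single connection has been made.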

Data is collected through a variety of means, such as size-up, research, and instrumentation, and each component brings a massive number of dots to the page. This sets the stage and begins to define the incident boundaries. Now think about the research occurring about the location, product(s), incompatibilities, PPE, threshold or IDLH values, etc. Determining the specifics about the chemicals involved gets us closer to the problem and, hopefully, the solution! Instrumentation readings and reconnaissance provide even more bits of information, adding even more structure to the incident framework. The picture is beginning to take shape.

Although this seems straightforward, there is a pitfall that I ask you to keep in mind, and that is confirmation bias. With roots in psychology, it is defined well by Raymond Nickerson in his 1998 Review of General Psychology article “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises” as the “seeking or interpreting of evidence in ways that are partial to existing beliefs, expectations, or a hypothesis in hand.” 1 Simply put, these are assumptions or pre-existing beliefs. In Get the Truth, Philip Houston et al. also discuss the importance of recognizing confirmation bias as part of the interrogation process, something that must be removed in order to see the REAL picture – in their case, the truth. 2 Bias creates a lens that can radically distort information. In their example, interrogators are subject to this bias, which can lead them astray during the interrogation process. The same applies here as it relates to experience-based decision making. In other words, our previous experiences have a tendency to shape our future decisions, and when faced with a new situation, there is a drive to force it into a previously experienced context, especially in two situations: too little data and too much data. When too little data exists, we have to rely on experience and make decisions based on past incidents. When too much data exists, we have to rapidly pick the information we feel is important in order to “see the forest for the trees.” In both of these situations our decisions may seem to fit well; however, they can sometimes be dangerous because dots and connections are missed in favor of what we think we know. Please do not misunderstand me: there is great value in experience and there is great value in data. The point is that the two must be used in a complementary manner. In other words, experience alone can be dangerous and data alone can be dangerous. Balancing the two is imperative. What we have to be cognizant of at this step is knowing how much information is enough information. It is easy to become overloaded with data, paralyzing our ability to make dynamic decisions.


COLLECTION

Information is data and it is everywhere. Not only is data everywhere, but as of 2017, it is immediately accessible nearly anywhere via computers and the smartphones most of us have on our person. Many times this information is presented in printed form, found in the reference sections of libraries and educational institutions, and in our own HazMat reference sections. Digitally, there are thousands of resources and “apps” that can be used from our mobile devices. An enormous amount of information is accessible to us, portably and instantly.

A brief note about the validity of data is important here. With the overwhelming amount of information available to us, we must be cautious of the source. It is typically believed that “.com” sources are invalid; however, there are numerous manufacturers that provide specific chemical data, including SDS information. Obtaining an SDS directly from the manufacturer, if applicable, is an excellent practice. For example, hydrogen peroxide from a pharmacy is much different from the 32+% food-grade version of the same chemical. Other web sources such as “.edu” and “.gov” are considered high-quality, but again, be careful about relying on just ONE source. Written guides are excellent, but can already be outdated at the time of publication. Mobile “apps” are also an excellent source of information, especially those that are linked to real-time databases, such as the NIOSH Guide. In general, as with instrumentation, consider redundancy and utilize at least three sources to validate important incident decisions.
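As a rough sketch of that redundancy habit (this is not any particular tool, and the source names and numbers below are placeholders), a cross-check of a single value against three independent sources might look like this:

    # Hedged sketch: cross-check one chemical value against three independent sources.
    # Source names and numbers are placeholders, not real reference data.
    readings = {
        "manufacturer_sds": 20.0,   # hypothetical exposure value, in ppm
        "printed_guide": 20.0,
        "mobile_app": 25.0,
    }

    def sources_agree(values, tolerance=0.10):
        """Return True if every value falls within `tolerance` (fractional) of the median."""
        ordered = sorted(values)
        median = ordered[len(ordered) // 2]
        return all(abs(v - median) <= tolerance * median for v in ordered)

    if sources_agree(list(readings.values())):
        print("Sources agree; reasonable to act on this value.")
    else:
        print("Sources disagree; dig deeper before making the call.")

In this made-up case the third source disagrees, which is exactly the prompt to keep digging rather than act on a single reference.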

Response organizations routinely share information between and across disciplines. Fusion centers serve to disseminate information from a myriad of sources to those who could find the information useful. Whereas one organization may not see a relevant connection between data elements, another may be able to make such a connection. Therefore, what may appear as an irrelevant bit of data to one may serve as the “missing link” to another, so collecting and sharing information is essential. Emergency incidents are no different: they provide information and have information yet to be collected – they evolve. This is the process that leads to strategic and tactical decisions aimed at mitigation, and it starts with dot collection.

Size-Ups

IMR Model (my own personal guideline; a rough sketch of it as a checklist follows the list below)

Incident: what do you have, where is it, how big is it, how big can it get
Material: quantities, hazards to people/places/things, containers
Response: risk/benefit analysis, how to mitigate, what’s needed to mitigate
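
Purely as an illustration (the wording and structure are mine, not a standard), the IMR prompts can be laid out as a reusable checklist:

    # Illustrative only: the IMR size-up prompts as a reusable checklist.
    IMR_CHECKLIST = {
        "Incident": [
            "What do you have?",
            "Where is it?",
            "How big is it?",
            "How big can it get?",
        ],
        "Material": [
            "What quantities are involved?",
            "What are the hazards to people, places, and things?",
            "What containers are involved?",
        ],
        "Response": [
            "What does the risk/benefit analysis say?",
            "How can it be mitigated?",
            "What is needed to mitigate it?",
        ],
    }

    for category, questions in IMR_CHECKLIST.items():
        print(category)
        for question in questions:
            print(f"  - {question}")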

CONNECTION

A page full of dots is of little use without a way to make sense of them all. Imagine a spreadsheet with a lot of numbers written down a column. Without some sort of frame of reference, something to help define what they are, they don’t hold much meaning. If some framing measures are incorporated into that data, such as a time of day for each row and a column heading of “LEL Readings,” it begins to make sense. The addition of these framing references has added context, allowing us to make sense of the data – the dots have been connected.
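A tiny sketch of that same idea: the raw numbers below are meaningless on their own, but attaching a time and a label turns them into something we can reason about (the values are invented for illustration):

    # Sketch: the same numbers, first without context, then framed with time and a label.
    raw_numbers = [2, 4, 7, 12, 18]  # meaningless on their own (invented values)

    framed_readings = [
        {"time": "10:05", "metric": "LEL (%)", "value": 2},
        {"time": "10:10", "metric": "LEL (%)", "value": 4},
        {"time": "10:15", "metric": "LEL (%)", "value": 7},
        {"time": "10:20", "metric": "LEL (%)", "value": 12},
        {"time": "10:25", "metric": "LEL (%)", "value": 18},
    ]

    # With context attached, the trend (the connection between the dots) becomes obvious.
    rising = all(a["value"] < b["value"] for a, b in zip(framed_readings, framed_readings[1:]))
    print("LEL readings are rising" if rising else "LEL readings are not rising")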

Another example is the previously discussed dispatch information about a leaking tote. By itself that may or may not be a big deal; it could be water. However, when we add even just a few of the seventeen dots, such as a hot day and a red and yellow placard, we are left with a different scenario. In this case, a few dots are all that are needed to give us focus or perspective for this incident. Furthermore, this leads us to recognize that there are some vitally important pieces of information that we don’t yet have. Many of you have probably already started thinking “organic peroxide, SADT, and MSST.” Recall the discussion of confirmation bias above and think about how your OWN experiences are shaping your incident perspective.

As we collect more and more dots, the connections between them often become apparent, and connecting them is a natural and subconscious step as we progress through the decision-making process. Individual pieces of data may not give us the entire picture even if there are a lot of them, but by making connections between them, we stand a greater chance of developing that “big picture” we need as we advance towards mitigation.

CONNECTED

An unintentional implication here is that we must have ALL of the information before we make decisions, but this is neither realistic nor how it actually happens. We don’t need every piece of information to act; if we did, we would never do anything other than watch the incident progress on its own, and that’s not why we’re here. During any size-up exercise, we are asked to make forecasting decisions based on incomplete information, sometimes with only a picture or two and a few tidbits of data. These exercises serve to bridge our experience, education, and training. Instinct guides this process and is significantly influenced by our education and experience.

In Malcolm Gladwell’s book Blink, he introduces the concept of “thin-slicing” as it relates to the decision-making process. 3 Thin-slicing is when a decision is made instantly, based on partial information. It’s a first impression – that gut feeling we have for a given situation. In Chapter 4 of Blink, he uses an example from Cook County Hospital in Chicago and its cardiac care dilemma. In essence, doctors were taking in TOO much information and becoming buried in the details. There was so much information that needed to be analyzed and interpreted for chest pain patients that the doctors were grossly inefficient in their diagnoses. A cardiologist named Lee Goldman, borrowing statistical methods from other branches of science, created a model for the risk factors of heart attack patients. This model (the Goldman Algorithm) used FOUR data points. Why is this significant? Experienced and educated physicians with an abundance of information were making bad decisions that led to poor (or grave) patient outcomes. Referring back to the “connect the dots” page for kids, if there are too many dots the page becomes muddled and the image loses clarity. Too much information was hampering the ability of these professionals to make life-saving decisions, until the data became manageable. Through Goldman’s study it was determined that FOUR POINTS (or pieces) of information were all that was needed to make accurate decisions about a patient’s cardiac emergency.
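To show the shape of such a rule rather than its content, here is a hedged sketch of a four-factor screen; the factor names, thresholds, and routing labels are placeholders for illustration and are NOT the actual Goldman criteria or medical guidance:

    # Hedged sketch of a four-factor decision rule, illustrating how a handful of
    # well-chosen data points can drive a routing decision.
    # The factors and the rule are placeholders: NOT the Goldman criteria, NOT medical advice.
    def route_patient(ecg_flag: bool, factor_one: bool, factor_two: bool, factor_three: bool) -> str:
        """Route a hypothetical patient based on four yes/no data points."""
        secondary = sum([factor_one, factor_two, factor_three])
        if ecg_flag and secondary >= 2:
            return "highest acuity"
        if ecg_flag or secondary >= 1:
            return "intermediate acuity"
        return "lower acuity"

    print(route_patient(True, False, True, False))  # -> "intermediate acuity"

Four inputs, one decision – the point is not the medicine, it is that a small, deliberate set of dots can outperform an avalanche of them.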

In an earlier example we talked about a leaking tote of a likely organic peroxide and the seventeen pieces of information we had prior to arrival. There are likely hundreds more, but how many do we need to mitigate this incident? As a start, what about product, quantity, ambient temperature, location, and product temperature? That’s 70% less information than what was provided earlier, and four of those five can be obtained without even getting close. Could you collect more? Absolutely, and I am not suggesting a minimalist approach to HazMat response, but rather tempering the massive amount of data into an easily usable amount. Programs such as WISER serve to handle some of the incident scene connections for us. By plugging in certain parameters, the program provides us with some possible solutions. In essence, this is automated decision support: the technology connects relevant information and discounts the irrelevant. The overall point is that we are inundated with data at an incident and can easily lose track of what is or is not important. In turn, this delays our ability to make decisions because there is too much information.
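As a generic illustration of that kind of narrowing (this is not WISER or any real database, and the candidate list is entirely made up), a tool can discard candidates that fail to match a few observed parameters:

    # Generic illustration of narrowing candidates from a few observed parameters.
    # This is NOT WISER; the "database" below is invented for the example.
    CANDIDATES = [
        {"name": "Substance A", "state": "liquid", "placard": "organic peroxide"},
        {"name": "Substance B", "state": "gas",    "placard": "flammable gas"},
        {"name": "Substance C", "state": "liquid", "placard": "corrosive"},
    ]

    observed = {"state": "liquid", "placard": "organic peroxide"}

    matches = [c["name"] for c in CANDIDATES
               if all(c.get(key) == value for key, value in observed.items())]
    print(matches)  # -> ['Substance A']

A few well-chosen parameters shrink the field dramatically; the machine is simply making some of the connections for us.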

“You can’t connect the dots looking forward; you can only connect them looking backwards.” Steve Jobs

Our world is driven by data. Everything we do is based on the collection and connection of pieces of information, in situations ranging from buying a car to mitigating a leaking tote. We collect data (dots) and begin to connect them together as we strive towards one end: certainty. This can be an unrealistic expectation, and that can cripple the process for many. Certainty is one end of a continuum (uncertainty being at the opposite end), and the point at which a decision is made can occur anywhere along that line. Obviously, the closer to the “certainty” end we are, the more accurate the decision, but this comes at an expense. As discussed earlier, we may not have all the pieces of information we need, and obtaining them all to be 100% “certain” is unrealistic and, in this case, comes at the expense of time.

Our decisions are based on experience, education, and training, all of which serve to formulate our instinct and frame our assumptions. When we rely on these, coupled with the information we have about an emergency situation, we don’t need every piece of information to make good decisions. Good data collection and good connections between data points can lead us, with the aid of experience, education, and training, to make accurate inferences about where an incident is headed, all without knowing every detail. Lee Goldman realized that only four pieces of data were needed to accurately route cardiac patients. In the world of HazMat response, this might be a lesson worth applying.

END NOTES

  1. Nickerson, Raymond S. “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises.” Review of General Psychology, 1998. http://pages.ucsd.edu/~mckenzie/nickersonConfirmationBias.pdf. Accessed 9 June 2017.
  2. Houston, Philip, et al. Get the Truth. St. Martin’s Press, 2016.
  3. Gladwell, Malcolm. Blink. Back Bay Books, 2005.

About the Author

David (Dave) Millstein

David (Dave) Millstein has been involved in the emergency services since 2000 as a volunteer in Pennsylvania and since 2004 with Frederick County (MD) Fire/Rescue. Outside of the fire department, he worked for the USFA supporting the NFIRS system and as a HazMat contractor in the Mid-Atlantic region. Originally from Massachusetts, he stayed in Pennsylvania for his career. Dave’s passion is focused on leadership and management in the area of HazMat/WMD, including planning and preparing personnel for current and future threats/hazards. He is currently pursuing an MS in Emergency Management. When he is not working, he enjoys hiking with his family, Tae Kwon Do, reading, and writing.
