Incident Decision Making and Cognitive Bias

Cognitive bias can and does get in the way of effective incident management, writes ITSM Watch columnist George Spafford of Pepperweed Consulting.
One challenge is that cognitive biases can, and will, affect personnel during incidents, especially those involving high stress.
Anchoring - This refers to taking the first piece of evidence and then relying on it too heavily at the expense of later evidence. Imagine seeing a network traffic spike just before a system crashes and then fixating on the spike as the reason for the crash instead of equally weighing other evidence collected later.
Attribution Errors - When a user calls in for assistance with an incident, there is a tendency to place too much emphasis on the person calling and to de-emphasize the situation at hand. If a caller is typecast as a problem user, the tendency is to immediately assume he did something wrong rather than listening to the symptoms the user describes and investigating accordingly.
Bias Blind Spot - People tend not to know their own biases or which bias(es) are affecting them during decision making. A database architect may be diagnosing an incident and not realize that he is succumbing to one or more biases, such as anchoring.
Confirmation Bias - This relates to people interpreting new data in a way that confirms previously held beliefs. If a network engineer believes that network congestion is the issue, then he will collect and review incident data in a way that confirms his belief and discount the data that does not.
Déformation Professionnelle - This interesting French phrase relates to one's profession affecting how one makes decisions. A developer may focus her efforts entirely on a possible application defect rather than viewing the incident at a higher level and realizing that the cause of the incident is not in the application layer.
Information Bias - This is the tendency to search for more information even when having it will not change the decision. Even when armed with sufficient information, some people will try to collect additional data beyond what is necessary. A manager may collect data for months and still not act because he believes more data will help the situation.
Hofstadter's Law - It can be a challenge to estimate how long a complex incident will take to resolve. During planning, there is a tendency to underestimate the time needed. Hofstadter's Law reflects this and states that a task always takes longer than expected, even when you take Hofstadter's Law into account.
Compensating - The above are just a few examples of the many cognitive biases that will always exist. What Incident Management teams can do is acknowledge that biases do, in fact, exist, learn to spot them when they are taking place, and develop procedures and automation to mitigate the risks associated with them. Consider the following options:
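As one illustration of procedures and automation that push back on anchoring and confirmation bias, the sketch below is a minimal, hypothetical Python guard that refuses to let a responder commit a root cause until several competing hypotheses have been logged with evidence. The class and threshold names (`IncidentTriage`, `MIN_HYPOTHESES`) are illustrative assumptions, not part of any real ITSM tool.

```python
# Hypothetical sketch: force responders to record multiple competing
# hypotheses before committing to a root cause, a simple procedural
# countermeasure to anchoring and confirmation bias.

MIN_HYPOTHESES = 3  # assumed policy threshold, not a standard value


class IncidentTriage:
    def __init__(self, incident_id):
        self.incident_id = incident_id
        self.hypotheses = []   # list of (description, supporting_evidence)
        self.root_cause = None

    def add_hypothesis(self, description, evidence):
        """Record a candidate explanation together with its evidence."""
        self.hypotheses.append((description, evidence))

    def commit_root_cause(self, description):
        """Refuse to close the incident until enough alternatives exist."""
        if len(self.hypotheses) < MIN_HYPOTHESES:
            raise ValueError(
                f"Only {len(self.hypotheses)} hypothesis(es) logged; "
                f"record at least {MIN_HYPOTHESES} before committing."
            )
        if description not in (d for d, _ in self.hypotheses):
            raise ValueError("Root cause must be one of the logged hypotheses.")
        self.root_cause = description
        return self.root_cause
```

A team could wire a rule like this into a ticketing workflow so that closing an incident with a single, early hypothesis (the classic anchoring failure) is blocked by the tool rather than left to individual discipline.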
Process improvement is a journey because there will always be new business requirements to address, new types of incidents to take into account, and a constant need to improve availability and customer service while controlling costs. Biases can negatively affect processes, and smart groups will work on their decision-making skills to minimize the negative impacts of cognitive biases.
George Spafford is a principal consultant with Pepperweed Consulting and a long-time IT professional. George's professional focus is on compliance, security, management and overall process improvement.