Understanding people and events, collecting data and evidence, and formulating perspectives on events – investigation requires a considerable amount of reasoning on the part of individuals and teams. Investigators need to be aware of the cognitive biases that might impact their own decision-making process as well as those of their peers. If an investigator forms a theory about an incident and is unwilling to change their view to account for new and contradictory evidence, that is cognitive bias at work.
Cognitive biases are patterns of deviation from rational thinking. These can apply to any scenario, whether in daily interactions with peers or even when making decisions about what to purchase in a store. For investigators, however, cognitive biases can be quite dangerous when investigating a crime as they can lead to gathering the wrong type of evidence, or worse yet, identifying the wrong person responsible for the threat.
The following common cognitive biases are examples of the kinds of flawed reasoning that might impact an investigation:
1. Outcome Bias: Judging a decision based on its outcome
Outcome bias is an error made in evaluating the quality of a decision when the outcome of that decision is already known. For example, an investigator might make use of outcome bias to compel someone to testify by indicating that other witnesses have come forward with similar information.
2. Confirmation Bias: Favoring information that confirms preconceptions
Confirmation bias is the tendency to bolster a hypothesis by seeking evidence consistent with beliefs and preconceptions while disregarding inconsistent evidence. In criminal investigations, preference for hypothesis-consistent information could contribute to false convictions by leading investigators to disregard evidence that challenges their theory of a case.
3. Automation Bias: Favoring automated decision-making
Automation bias is the tendency to favor conclusions from automated decision-making systems and to discount contradictory conclusions reached without automation, even when the latter are correct. This type of bias is increasingly relevant for investigators who rely on automated systems. For example, a cybersecurity professional may assume their system is not under threat given the lack of automated alerts, despite concerns from specific individuals who believe they are being targeted by hackers. Automation might also flag a threat that is not actually there; automation bias would compel investigators to pursue the problem despite facts that contradict it, wasting valuable resources that could be applied elsewhere.
4. Clustering Illusion: Seeing patterns in random events
Clustering illusion is the intuition that random events which occur in clusters are not really random. An investigator might uncover information with only limited correlation but, holding false assumptions about the statistical odds of that correlation, fall into fallacious reasoning. For example, an investigation into a suspect’s online footprint might uncover the person’s name associated with multiple posts expressing a similar sentiment. However, by examining additional information and conducting in-person interviews, the investigator could correct for this bias by determining that the posts were made by two different people.
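The statistical point behind the clustering illusion can be checked with a quick simulation. The sketch below (purely illustrative, not part of any investigative toolkit) flips a fair coin 200 times and measures the longest streak of identical outcomes; streaks of seven or eight in a row routinely appear in genuinely random data, which is exactly the kind of "pattern" the clustering illusion tempts us to treat as meaningful.

```python
import random

random.seed(7)  # fixed seed so the run is repeatable

# Simulate 200 independent, fair coin flips: pure random events.
flips = [random.random() < 0.5 for _ in range(200)]

# Find the longest run of consecutive identical outcomes.
longest = current = 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

print(f"Longest streak in 200 random flips: {longest}")
```

For 200 fair flips the longest streak is typically around seven or eight (roughly log2 of the number of trials), so a cluster of similar events is weak evidence of a common cause on its own.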
5. Availability Heuristic: Overestimating the value of information readily available
The availability heuristic is a mental shortcut that relies on the immediate examples that come to a person’s mind when evaluating a specific decision. Heavily publicized risks are easier to recall than a potentially more threatening but less visible risk. This skews individual judgment and can even shape public policy.
6. Stereotyping: Expecting a group or person to have certain qualities without having real information about them
Stereotyping involves an over-generalized belief about a particular category of people. Stereotypes are generalized because one assumes the stereotype is true for every individual in the category. Stereotypes can be helpful in making quick decisions, but they may be erroneous, and they are known to encourage prejudice. Stereotyping is the key driver behind racial profiling, whereby members of a specific race or ethnicity are associated with the crimes they are stereotypically believed to commit.
7. Law of the Instrument: Over-reliance on a familiar tool
Law of the instrument, or law of the hammer, is a bias that involves an over-reliance on a familiar tool. As Abraham Maslow said, “I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.” This is especially important for investigators in the digital age. As old behaviors are replaced with new behaviors that intersect with the web, investigators must update their toolkit to be able to investigate crimes. For example, the thriving online drug trade requires narcotics investigators to reexamine the tools and tactics they use to solve a case to gain a better understanding of the people and events involved.
8. Blind-spot Bias: Failing to recognize one’s own biases
Blind-spot bias, while last in our list, should always be top of mind. Investigators must continually assess possible biases that may impact their thinking unconsciously, and make conscious attempts to address them. People tend to attribute bias unevenly so that when people reach different conclusions, they often label one another as biased while viewing themselves as accurate and unbiased.
Great investigators will understand and reflect on these cognitive biases continually during an investigation. Mental noise, wishful thinking – these things happen to the best of us. So remember always: Check your bias.
Book a demo to learn about our powerful Internet investigation software. Use Internet chatter and data on the Deep and Dark Web to find out who is talking about, or threatening, your organization.