A few months back I delivered a conference talk on how to run a world-class business analytics function. As is often the case at these events, I was approached immediately afterwards by lots of people asking lots of questions. Although it can be quite exhausting to try to answer them all, it’s actually the part I enjoy the most, because it helps me keep a pulse on the everyday issues faced by people working in business analytics.
One memorable conversation was with someone who seemed genuinely dismayed about the pressure she was under to deliver ‘desirable’ analysis results. She described a situation several months earlier in which she had been pressured to find a fact, any fact, that would support an assertion her client was making. Despite her best efforts to push back, she ended up caving in and supplying such a fact, even though numerous other facts pointed the other way. The client, she told me, had made it clear that they only wanted to hear facts that helped their objective and had no interest in counter-arguments. She felt helpless, and in the end she relented just to get the work off her plate, but she was clearly disgusted with herself for letting it happen.
This kind of confirmation bias is rife in the business world, and many will argue that it is omnipresent in academia also. It might not always manifest itself as extremely as in the example I’ve just given, but it frequently hides in plain sight. I am sure many readers can relate to situations where they have exclusively looked for facts to support an argument to the exclusion of all other relevant facts.
Why should we worry about confirmation bias?
Confirmation bias helps nobody, especially not the business or organization that suffers from it. A lack of openness to alternative perspectives increases the chance of disagreement among decision makers with entrenched points of view, which in turn increases the chance of erroneous decisions or no decisions at all (decision paralysis).
In an analytics context, I often see confirmation bias manifest itself in three ways:
- At the beginning, it can manifest itself in the way that analytics is requested. Someone in the organization has an agenda or aim, and wants to use analytics to support it. So their questions will be loaded with confirmation bias: Do you have evidence to support that …? Find me data which shows…
- In the middle, a common behavior is to ask for analysis to be re-done on subgroups or over different timeframes until it eventually yields a palatable conclusion. What does this say if we just restrict it to the last 3 months? What about the US only?
- At the end, the analysis provided by the analyst is recast so that it appears to support the desired conclusion, often over the analyst’s explicit warnings.
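The middle-stage behavior above is the analytics cousin of data dredging: slice the same data enough ways and some subgroup will look striking purely by chance. A minimal simulation can illustrate this; the data below is entirely synthetic and invented for this sketch, not drawn from any real sales figures.

```python
import random
import statistics

random.seed(42)

# Synthetic example: 2,000 sales records with NO real effect at all;
# every record gets a random region, month, and a sales figure drawn
# from the same distribution.
regions = ["US", "EU", "APAC", "LATAM"]
months = list(range(1, 13))
records = [
    {"region": random.choice(regions),
     "month": random.choice(months),
     "sales": random.gauss(100, 15)}
    for _ in range(2000)
]

overall = statistics.mean(r["sales"] for r in records)

# Re-slice the same data by every (region, time window) combination,
# exactly as a biased requester might ask ("What about the US only?",
# "Just the last 3 months?") until something looks interesting.
slices = {}
for region in regions:
    for start in months:
        subset = [r["sales"] for r in records
                  if r["region"] == region and r["month"] >= start]
        if len(subset) >= 30:  # skip tiny, meaningless slices
            slices[(region, start)] = statistics.mean(subset)

# The most extreme slice deviates from the overall mean by chance alone:
# a "palatable conclusion" waiting to be cherry-picked.
best = max(slices, key=lambda k: abs(slices[k] - overall))
print(f"Overall mean: {overall:.1f}")
print(f"Most extreme slice {best}: {slices[best]:.1f}")
```

Run it and the most extreme subgroup will sit noticeably above or below the overall mean even though, by construction, nothing real is going on, which is exactly why repeated re-slicing requests deserve scrutiny.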
Intent on finding data that supports the desired conclusion, the individual often fails to step back and realize that, by ignoring counter-arguments or presenting analysis in an unbalanced way, they may not be making the right decision for the business.
Controlling for confirmation bias
I am not a believer in debiasing. You can’t eliminate a bias. You can, however, control it. And the best way to do that is to introduce structured, consistent processes which reduce the chance that bias can play a role.
The best point in time to sniff out confirmation bias is at the point of the initial request for analytics. This is the point at which the requestor can be briefed about the neutral values and unbiased methods of the analytics group, and where the request can be phrased in a way that supports a balanced, evidence-based approach.
When you receive a request for analysis, and assuming it is not just a simple raw data pull, you can consider some of the following ideas to control confirmation bias:
- Write a service charter to share with clients at the outset. As well as containing commitments on turnaround times and the like, it can be used to make the evidence-based values of the team clear and to get the client’s agreement to work in a way that is consistent with those values: for example, agreeing on an unbiased approach to the problem and committing not to edit the conclusions after the fact without consultation.
- Ensure the analytics request is phrased in the form of a neutral question and not an objective. Bad: We are looking for data that proves that sales have been declining because clients are now further away since our office move. Good: Based on our data, what possible reasons can be suggested for the recent decline in sales?
- Debrief the client on the results of the analysis and make a record of the results and the debrief. Debriefing with the client helps avoid misinterpretation of the results and can preempt further requests to dig deeper. Keeping a record means that there is recourse to intervene at a later point if the results are misinterpreted or misused.
One way to ensure that all this happens consistently is to set up a standard process for receiving and handling analytics requests. For example, you could create an analytics request form or problem statement that must be completed and agreed between the client and the analysts before any work begins.
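As a sketch of what such a request form might capture, here is one possible shape in code. The field names and the loaded-phrasing check are my own illustrative assumptions, not the author’s actual template; a real form would be tuned to the organization’s vocabulary.

```python
from dataclasses import dataclass

# Phrases that suggest the request is an objective rather than a neutral
# question. This list is illustrative, not exhaustive.
LOADED_PHRASES = ("prove that", "evidence to support", "data which shows",
                  "data that shows", "confirm that")

@dataclass
class AnalyticsRequest:
    """Hypothetical intake form for an analytics request."""
    client: str
    question: str                 # should be a neutral question, not an objective
    background: str = ""
    timeframe: str = ""
    agreed_charter: bool = False  # client has accepted the team's service charter

    def problems(self) -> list[str]:
        """Flag signs of confirmation bias before any work begins."""
        issues = []
        q = self.question.lower()
        if not self.question.rstrip().endswith("?"):
            issues.append("Request should be phrased as a question.")
        if any(p in q for p in LOADED_PHRASES):
            issues.append("Question appears loaded toward a desired conclusion.")
        if not self.agreed_charter:
            issues.append("Client has not yet agreed to the service charter.")
        return issues

bad = AnalyticsRequest(
    client="Sales",
    question="Find me data which shows sales declined because of the office move",
)
good = AnalyticsRequest(
    client="Sales",
    question="Based on our data, what possible reasons can be suggested "
             "for the recent decline in sales?",
    agreed_charter=True,
)
print(bad.problems())   # several flags raised
print(good.problems())  # no flags
```

The point is not the specific checks but the discipline: putting the request in writing and screening it against agreed values before the analysis starts.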
If the role of a business analytics function is to support accurate decision making in organizations, then it is essential that it has a way to counter confirmation bias and can produce balanced analytic perspectives for its clients. By installing the right processes and encouraging clients to follow them, you can make a lot of progress towards this aim.