The word ‘storytelling’ has become more prevalent and popular in analytics circles recently. As I read various articles and advice on ‘storytelling’, I see a dangerous trend that steers individuals away from a research-based approach and runs the risk of major organizations making flawed decisions based on a glossy but often inaccurate version of the facts.
I am not saying that how you present your results is not important. I regularly consider the most compelling ways of presenting results, with some formats and visualizations clearly more intuitive and understandable than others.
But when data professionals have been told that they need to show certain results to fit with a company objective, or are working to a pre-prepared narrative that they need to support with data, we have a problem. When data professionals spend too much time on making the results ‘look pretty’, at the expense of testing for significance or having their methodology reviewed, we have a problem.
Many consider it to be a primary responsibility of the data professional to find a way to make their results understood by the business leaders. I disagree. The primary responsibility of the data professional is to pursue a sound approach based on their expertise and to generate reliable, trustworthy results. Only then should the communication of those results be considered, and sometimes the business leaders have to meet us halfway.
It’s time to outline why ‘storytelling’ can be dangerous and to reiterate why a research-based approach should always win through.
The three key fails of ‘storytelling’
There are numerous definitions of the word ‘story’ in existence nowadays. Take a look at one I chose at the top of this article, the primary definition listed in the Collins English Dictionary. I chose this because it contains a word which I believe often leaks over into data storytelling, and often at the expense of sound process and methodology. It is this word which is the basis for some of the poor practices that are driven by the idea of ‘storytelling’ in analytics. That word is entertain.
Here is my view on the three key fails of storytelling in an analytics context:
1. Associating a piece of analysis with an objective within the organization, implying that the key reason for undertaking the analytics is to develop a narrative that will entertain that objective. Data analytics should only ever be associated with a question, never an objective.
2. Determining a ‘desired narrative’ before any analysis is conducted, automatically introducing bias into decision making and putting pressure on analytics professionals to entertain the narrative. Narratives should only be built when results are completed, validated and where any potential weaknesses are highlighted.
3. Spending more time thinking about how to present results in an entertaining way than on rigorously questioning their validity, resulting in erroneous conclusions making their way to decision makers. Data professionals should agree on the valid, defendable conclusions of the analytics before any discussion is entered into regarding presentation and visualization.
Multiple serious errors have been made in major organizations because of these three key fails. ‘Storytelling’ can be a dangerous business.
Maintain a research-based approach at all times
If you are a scientifically trained analytics professional, it’s more than likely you have been drilled in a research-based approach. It’s increasingly important that you stick to this in the face of mounting pressure to ‘tell stories’.
Here are the elements of a research-based approach which I believe should be preserved in any decision making environment:
1. Context: Outline the reasons why the analysis is being undertaken. Be clear on the business or organizational question you are trying to answer. Outline prior related work. Ensure that researchers have the freedom to draw whatever conclusions the data leads them to.
2. Methodology: Record the methodology used. If there has been a choice of methodology, explain why the specific option was chosen. Be transparent about gaps or weaknesses in the methodology and the impact these may have on the accuracy and reliability of the results.
3. Results: Perform the analysis in a repeatable way. Ensure that appropriate statistical standards are adhered to. Record all instances of when results do and do not meet those standards.
4. Discussion: Ensure a thorough critique and peer review where possible. If not possible, declare so. Compare with any other results from prior work. Be fully transparent about where conclusions cannot be directly drawn from the results. Clearly highlight in particular where causality cannot be assumed.
5. Conclusion: Where conclusions are solid and pass scrutiny, consider the most compelling way to communicate them to stakeholders. Where there are uncertainties, present ‘possible avenues for further research’, and refrain from overselling results with language or graphics.
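To make the ‘Results’ step concrete, here is a minimal sketch of what ‘repeatable analysis with appropriate statistical standards’ can look like in code: a two-sample permutation test whose fixed random seed makes every run reproducible, and whose p-value can be recorded alongside the result as a defendable measure of significance. This is an illustrative example of mine, not a prescribed method; the function name and defaults are my own.

```python
import random
from statistics import mean

def permutation_test(a, b, n_resamples=10_000, seed=42):
    """Two-sample permutation test for a difference in means.

    The fixed seed makes the analysis repeatable: anyone re-running
    it gets the same p-value, which can be recorded with the result.
    (Illustrative sketch; the name and defaults are assumptions.)
    """
    rng = random.Random(seed)              # fixed seed -> repeatable run
    observed = abs(mean(a) - mean(b))      # observed difference in means
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)                # reassign labels at random
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        if abs(mean(perm_a) - mean(perm_b)) >= observed:
            count += 1
    # p-value with the standard +1 correction to avoid reporting zero
    return (count + 1) / (n_resamples + 1)
```

Running the same call twice returns the identical p-value, which is exactly the kind of record ‘do the results meet the standard?’ requires; a narrative built on top of a number like this is defendable in a way a hand-picked chart is not.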
To those without scientific training, the research-based approach may sound like hard work, but I encourage any analytics professional who takes integrity and data-driven decision making seriously to adopt it and to stick by it. Lay out your work carefully and get the best answer you can before being tempted into a discussion of how you present that answer. You’ll be enhancing your reputation in the long run, and you’ll ultimately see that better decisions are made.