Storytelling with Data: Keeping It Nonfiction

The best analysts, marketers, and business people today present analyses and answer questions by crafting a narrative that explains the data. Telling a story with data, rather than simply presenting raw findings, makes it easier for that data to be consumed and understood. The success of this approach has much to do with the way we process information, but the approach itself has significant vulnerabilities when it comes to the biases of those crafting the stories.

In 2015, a pair of researchers published an editorial comment in the journal Nature that highlighted something very important about the interpretation of data. The year before, they had signed up 29 research teams and given each the same question and the same data set with which to answer it. So, what happened? Working in isolation, the teams applied different analysis techniques and reached different conclusions. What’s more, those conclusions were relatively strong: team leaders were initially confident in their results, even though some of the teams contradicted one another. After the teams reviewed each other’s reports and discussed them, two important things were noted:

• There was no emerging consensus on what models and approaches were the best way to analyze the data.
• When exposed to other reports, team leaders generally moved toward a consensus on a more moderate conclusion.
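
To make that first point concrete, here is a toy sketch of how one shared data set can support different conclusions. Nothing below comes from the study itself; the data and the three “team” analyses are invented for illustration.

```python
# Toy illustration (invented data, not the Nature study's): three
# defensible analysis choices applied to one shared data set.
import random
import statistics

random.seed(42)

# One shared data set: outcomes for a control and a treatment group,
# with a handful of extreme low values mixed into the treatment group,
# the kind of messiness real data usually has.
control = [random.gauss(100, 15) for _ in range(200)]
treatment = [random.gauss(103, 15) for _ in range(200)]
treatment += [random.gauss(40, 5) for _ in range(6)]  # outliers

def trimmed(values, pct=0.05):
    """Drop the top and bottom pct of values before analyzing."""
    values = sorted(values)
    k = int(len(values) * pct)
    return values[k:len(values) - k]

# "Team A" compares raw means; the outliers drag its estimate down.
effect_a = statistics.mean(treatment) - statistics.mean(control)

# "Team B" trims the tails first, then compares means.
effect_b = statistics.mean(trimmed(treatment)) - statistics.mean(trimmed(control))

# "Team C" compares medians, which barely notice the outliers.
effect_c = statistics.median(treatment) - statistics.median(control)

print(f"Team A (raw means):     {effect_a:+.2f}")
print(f"Team B (trimmed means): {effect_b:+.2f}")
print(f"Team C (medians):       {effect_c:+.2f}")
```

No team here is wrong; each choice is defensible, yet each supports a different storyline about how big the effect is, which is exactly the kind of divergence the 29 teams produced.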

I recommend reading the editorial; the results of this experiment are interesting, and they highlight a significant problem with the way we like to productize and consume data analysis these days: storytelling with data is particularly vulnerable to cognitive bias.

As the authors note, when small teams interpret data, “strong storylines win out over messy results”.

Confirmation bias comes into play here: we tend to interpret data in ways that confirm what we already believed it would tell us. This happens whenever we want or expect a certain answer to a question and the data doesn’t overwhelmingly contradict our preferred conclusion.

We also tend to have a problem with belief revision. When looking at data, we often settle on a conclusion, or on the story we want to tell, early on. As we review the data further or incorporate new data into the story, we tend to stick with that early impression rather than revising the story to account for all of the information we now have.

The authors’ recommendation is that, in appropriate cases, it is worthwhile to crowdsource analysis.

This cuts down on the biases inherent in small teams and individual researchers; however, another cognitive bias comes into play when teams compare results: the bandwagon effect. Working collaboratively, people tend to follow the group in their thinking, changing their beliefs and interpretations to match what seems to be the group norm. For this reason, the structure of the analysis should likely mimic the structure of the authors’ experiment: individuals or small groups start with their own analyses before comparing notes, giving those consuming the analysis a kind of “before” snapshot on record that shows where the uncertainties lie.

The example above concerns academic research, but the same concerns apply to business analysis. This is why I recommend employing multiple teams to answer significant business questions: both internal and external teams should be looking at your data, because each will bring different biases. Internal teams will often have more context with which to approach the data, but they will be more invested in the results and far more likely to be influenced by internal politics. External teams are less likely to be emotionally attached to the data and won’t have the same kinds of expectations that internal team members will.

Having at least one internal team and at least one external team produce answers to the same questions gives you the opportunity to minimize the effects of their biases.

Performing analysis in this way can be a tough sell. For many, the results are, as the authors put it, “puzzling: the investment of more resources can lead to less-clear outcomes”. It’s possible that your internal and external teams will tell you different stories. That can be useful: it will push you to think more critically about the information you receive and to make decisions more carefully. If you need a more coherent narrative, bring the teams together as the last step of the analysis. The work they produce together, after having worked on the data apart, will probably be more carefully articulated and nuanced, and it will also be closer to the truth.

The truth is important. When it comes to the critical components of your business, a great data story is always nonfiction.

If you’re looking for an external team to help beat the bias in your analysis, I know a great one.
