Unbearable reports often happen for the following reasons:
- Reports Without Clear Goals: What is the purpose of it all? Or are the readers supposed to draw their own conclusions?
- No Storyline: Without a viable storyline, a series of charts often looks like a data dump, or worse, data puke. Be selective and have a story to tell.
- Too Much Information on a Page: Get to the point fast. Don’t make them dizzy or distracted. Don’t rely on your audience to figure things out in real time at the presentation.
- Lying With Numbers: Unfortunately, this happens all of the time. Charts without consistent scales, labels or legends; trend lines when there is no trend; 3-D effects that blur judgment, etc. (I recommend a classic book, “The Visual Display of Quantitative Information” by Edward R. Tufte.)
- Too Little or Too Much Narrative: Charts without any summary of findings are not beneficial, but writing a dissertation on a PowerPoint page is even worse. Stating the obvious can be really annoying.
I’m sure there are more reasons, and I’m sure readers have seen plenty of meaningless charts throughout their careers. The bottom line is that none of these things happen because of the toolset, though it is easy to wrap it all up as a “toolset problem.” Just as toolsets are not to blame for imperfect data, they cannot be blamed for a bad storyline, either. Simply: Don’t blame the piano when you don’t know how to play it.
So, how should an analyst go about it? Here is one example of storyline development, as there are many ways to skin this cat.
- Thoroughly Understand What Is at Stake. Why does the consumer of information lose sleep at night? What can we do to help her? Is it about low conversion rate, lack of response, decreasing customer value or skyrocketing cost? How can any consultant or analyst come up with good advice if he doesn’t know what’s at stake?
- Based on What Matters, come up with a handful of success metrics (not too many), such as opens, clicks, responses, conversions, renewals, customers and dollars.
- Break Up the Metrics “by” levels of information that matter. Examples are: brand, division, country, region, store/branch, channel, product category, product line, year, month, date, daypart, etc. The list goes on, but what matters most to this particular audience?
- Create Ratios: Based on key metrics, develop percentages (e.g., conversion rate = # customers / # exposed to campaign × 100), averages (e.g., average dollars per transaction and/or per customer), ratios (e.g., transactions per customer and items per transaction), etc. Again, the list goes on, but what factors would tell us the most compelling story?
- Develop Index Values: Define a baseline for comparisons (e.g., total customers active in the past 24 months) and calculate index values against it (e.g., the conversion rate of the comparison group divided by that of the baseline, times 100). Highlight index values that are too high (e.g., over 120) or too low (e.g., less than 80) in different colors. Even untrained eyes can see patterns that way.
- Get the Story Out of the Reports. Let the numbers speak for themselves; don’t force it, as predetermining the storyline never ends well (that’s not science).
- Once the Storyline Emerges, develop graphical representation of it. Do not overdo it, unless it is imperative to emphasize multiple storylines. This will be the most fun part, if you have a story to tell.
- Create an Executive Summary. This is the most difficult part of the reporting work, requiring some experience and hard discipline. But, a good story can be covered in a few minutes, and that may be all the time that you have. The rules are:
- No more than one or two slides
- Not more than five bullet points per slide
- Not more than 10 words per bullet point
- Finally, the Next Steps. What are we going to do about it? What should be the immediate action? Does it steer us away from the long-term goals? This is where the difference between a consultant and a contractor emerges. Simply, contractors take orders, but consultants tell clients what to do. If an analyst spots things in mounds of data, the credibility comes from the data anyway. But if he saw things that no one else did before? Ah, that is the moment for a geeky analyst to shine, proudly providing recommendations based on findings.
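The ratio and index-value steps above can be sketched in a few lines of code. This is a minimal illustration only: the segment names and all the counts below are hypothetical, while the conversion-rate formula and the 120/80 highlighting thresholds come straight from the steps above.

```python
# Hypothetical segment-level summary: (# exposed to campaign, # converted).
segments = {
    "baseline": (50_000, 2_000),  # e.g., all customers active in past 24 months
    "email":    (12_000, 660),
    "direct":   (8_000, 240),
}

def conversion_rate(exposed, converted):
    """Conversion rate as a percentage: (# converted / # exposed) x 100."""
    return converted / exposed * 100

base_rate = conversion_rate(*segments["baseline"])

for name, (exposed, converted) in segments.items():
    rate = conversion_rate(exposed, converted)
    index = rate / base_rate * 100  # index value against the baseline
    # Flag values that are too high (over 120) or too low (under 80).
    flag = "HIGH" if index > 120 else "LOW" if index < 80 else ""
    print(f"{name:8s} rate={rate:5.2f}%  index={index:6.1f} {flag}")
```

In a real report the flags would become color highlights in the table, so that even untrained eyes can spot which segments run well above or below the baseline.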
Obviously, the last part of the article is for current and future analysts. But I think the consumers of analytical services must be aware of these steps, as well. For one, marketers will be able to put the blame in the right places when things go wrong.
Not that this is about starting a blame game, but knowing what can go wrong is the first step toward the right kind of investment in this data and analytics game, and marketers will be able to manage the analysts and vendors more effectively, too. Because, like I said in the beginning, nothing really is a toolset problem.