The Cost of Perfection

Data has become the most strategic asset in modern businesses. It is now a “raw material” that any business requires to create and keep a competitive posture in its category. In order to convert this plentiful resource into business value, the data has to be refined, made easily accessible and deployed into the hands of marketers.

From the perspective of working with dozens of marketing and IT organizations, it’s all too common to see this process grinding slowly along — even as opportunity costs rise for the organization.

Those opportunity costs are real: the organization lacks the visibility into its customer base and the wider market needed to make new strategic investments that competitors can’t even consider. Put another way, the opportunity cost is the foregone competitive advantage.

A Case Study of a Strategic Growth Opportunity Cost
A large “brick-and-click” retail brand asked us to look at its business, as it lacked a clear plan on how to achieve ever-increasing growth goals pushed down from the C-suite.

When the conversation started, it was a very tactical discussion. Lacking a strategic dimension or business context, the definition of success amounted to little more than doing something new and fresh.

The C-suite wanted outsized growth. The market the company served was growing only modestly, so growing with “the pie” alone would not meet expectations.

Taking share was going to require a more advanced strategy.

With a large retail store footprint in North America and a robust online business, the company struggled to find credible new ways to scale sales and share further.

Shifting the dialog to the evidence of a specific business issue, and the impact that addressing it could be expected to have, helped focus the organization on the concrete opportunities that existed.

The Symptoms
Some fundamental business and marketing “health” metrics were either inaccessible or non-existent.

Two prime examples were: the growth rate of net-new customers, and the value of existing customers.

Beyond those, the value of a customer over time, and the cost of acquiring customers from various sources relative to their upfront contribution to the top and bottom lines, were equally elusive.
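
To make those metrics concrete, here is a minimal sketch, purely for illustration rather than anything drawn from the client’s systems, of how they can be computed from an ordinary transaction file. The table layout and column names (customer_id, order_date, revenue, margin, acquisition_channel, spend) are hypothetical assumptions.

```python
# Hypothetical sketch: basic customer "health" metrics from a transaction file.
# Assumes a transactions table with customer_id, order_date, revenue, margin,
# acquisition_channel, and a spend table with acquisition_channel, spend.
import pandas as pd

def basic_health_metrics(transactions: pd.DataFrame, spend: pd.DataFrame) -> dict:
    tx = transactions.copy()
    tx["order_date"] = pd.to_datetime(tx["order_date"])
    tx["order_month"] = tx["order_date"].dt.to_period("M")
    tx["cohort_month"] = tx.groupby("customer_id")["order_date"].transform("min").dt.to_period("M")

    # Net-new customers per month, and their month-over-month growth rate
    new_customers = tx.drop_duplicates("customer_id").groupby("cohort_month").size()
    new_customer_growth = new_customers.pct_change()

    # Value of existing customers: repeat revenue per customer beyond the first order month
    repeat = tx[tx["order_month"] > tx["cohort_month"]]
    avg_repeat_value = repeat.groupby("customer_id")["revenue"].sum().mean()

    # Cost to acquire by channel vs. upfront (first-order) contribution
    first_orders = tx.sort_values("order_date").drop_duplicates("customer_id")
    by_channel = first_orders.groupby("acquisition_channel").agg(
        customers=("customer_id", "nunique"),
        first_order_margin=("margin", "sum"),
    ).join(spend.set_index("acquisition_channel"))
    by_channel["cac"] = by_channel["spend"] / by_channel["customers"]
    by_channel["upfront_contribution"] = by_channel["first_order_margin"] / by_channel["customers"]

    return {
        "new_customer_growth": new_customer_growth,
        "avg_repeat_customer_value": avg_repeat_value,
        "cac_vs_contribution": by_channel[["cac", "upfront_contribution"]],
    }
```

None of this demands a pristine warehouse; a reasonably complete export of orders and media spend is enough to put directional numbers on the table.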

When we asked to look at source data to get the answers for the organization, we learned that an 18-month project had been under way to clean and organize “the data,” and that measurement of these KPIs would have to wait.

We agreed that rushing into activity without some important benchmarks to define what must change would have elements of a “ready, fire, aim” approach. But the data was “inaccessible” or otherwise “unavailable.” IT owned the data.

Another case of the new “Data ‘Mine’-ing” (not to be confused with productive “Data Mining”). In the interest of “controls, clean data and process,” data is held hostage and doesn’t create business value. As marketing waits, the value of the data (recency, for one dimension) may decay — and competitive advantage diminishes.

These sounded like pragmatic purposes. After all, one would have to suppose that if the data really needed to be cleaned, then it must have been quite “dirty” in the first place.

What value could there possibly be in working with such “dirty” data? On the surface, it makes sense to me — plus, who likes “dirty,” in general?

Emotionally, I share the same concern. This dislike of “dirtiness” has made robotic vacuums a mass-market product; iRobot alone has sold more than 14 million of them, and counting!

Yet the same data that had been “organized and cleaned” for the prior year was remarkable for what it did not do … create value.

This example seems paradoxical for many. Yet there are highly rational reasons for this behavior.

Why Does This Happen? The Wrong Conceptual Model
The organization had a conceptual model that did not serve the business. Those reasonable-sounding priorities were misplaced.

Why? Because whether data is “clean” or “not clean” depends heavily on the specific purpose for that data. The same applies to almost any other description of data (and many other things) made without meaningful context.

One example is sentiment analysis. There, limited or even no cleaning at all can work with Bayesian methods on unstructured data, such as reviews on Yelp.

Granted, not every review will make an equal contribution to the final determination of the consumer sentiment about that brand — but does it have to? Of course not.
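
As a purely hypothetical sketch of the point (not the tooling from any particular engagement), a naive Bayes classifier can estimate the overall share of positive sentiment from raw, uncleaned review text. The tiny training set and labels below are invented for illustration only.

```python
# Hypothetical sketch: naive Bayes sentiment on raw, uncleaned review text.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_reviews = [
    "Loved the service, will come back!!",
    "great food & great staff :)",
    "Terrible wait, cold fries.",
    "never again... worst experience",
]
train_labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

# No stemming, spell-checking or manual cleanup; plain tokenization is enough here.
vectorizer = CountVectorizer(lowercase=True)
model = MultinomialNB().fit(vectorizer.fit_transform(train_reviews), train_labels)

new_reviews = ["food was great but parking was awful", "loved it"]
positive_probability = model.predict_proba(vectorizer.transform(new_reviews))[:, 1]
print(f"Estimated share of positive reviews: {positive_probability.mean():.0%}")
```

Any single review may be scored imperfectly; across thousands of reviews, the aggregate read on the brand is what matters.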

Your House is Dirty.
How did that statement make you feel? I wrote it, and for me it is really uncomfortable. I’m not a total neat freak, but it gives me an emotional, visceral reaction.

Why? Because I like “clean.”

But if someone wanted to come in with a white glove, I guess that person would be able to get a smudge on it somewhere. Now maybe I could clean up the place to pass even the most stringent white-glove test …

But what if we came in with a microscope? Regrettably, we’d find microscopic organisms in every home — mine included.

Yuck.

Even so, we know that our homes aren’t any less livable or enjoyable.

Let’s say we irradiated it, like the perfect “cleaning” you may be envisioning for your data: ultraviolet radiation applied to every surface. Now it’s as clean as we can get it …

Is a home that is anything short of irradiated still better than being homeless?

The corollary … is good (even if ‘not perfect’) data any better than being data-less?

The reality — “clean” in data and elsewhere really is in the eye of the beholder.

Ever been in a college dorm or a fraternity house on a Sunday morning where no one’s complaining about how it looks? However scary it might be to you and me … it’s clean enough for them.

Yes, the same can be said for your data. It depends on how you wish to use it, and what outcome you’re looking for. Even the fraternity house looks perfect and smells like fresh lemons the day the parents (and their checkbooks) come to visit.

Accuracy vs. Precision
Instead of clean vs. dirty data, marketers do well to consider how accurate the data is for a specific purpose vs. how much precision it can produce.

Big Data being “big,” we simply don’t need to hit the bulls-eye every single time; that’s critical, because hitting it every time isn’t likely anyway.

If the collection methods are logical and reasonable, even if only 90 percent right … that’s 10 fails out of 100 tries … we can still have precision for a given purpose.

This example from Jim Novo’s “Drilling Down” illustrates why accuracy (AKA, “clean” data) isn’t the only thing that matters — precision is what matters.

When our data is inaccurate, but precise, we can use it to predict what will happen next.

On the target on the left, we keep aiming for the perfect bulls-eye. We keep missing, however, and it’s hard to say how much we’ll miss by, or where the next shot will land.

On the target on the right, the attempts are precise. That is, they do not hit the bulls-eye — but they land consistently near it. We can realistically expect to know where the next shot will land.
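
To make the distinction concrete, here is a small, hypothetical simulation of the two boards with made-up “throws” and the bulls-eye at (0, 0).

```python
# Hypothetical sketch of accuracy vs. precision with synthetic dart throws.
import numpy as np

rng = np.random.default_rng(0)

# Left board: unbiased but imprecise -- centered on the bulls-eye, widely scattered.
imprecise = rng.normal(loc=0.0, scale=5.0, size=(100, 2))

# Right board: biased but precise -- consistently off-center, tightly clustered.
precise = rng.normal(loc=[3.0, 2.0], scale=0.5, size=(100, 2))

for name, throws in [("imprecise", imprecise), ("precise", precise)]:
    bias = np.linalg.norm(throws.mean(axis=0))   # accuracy: distance of the center of mass from the bulls-eye
    spread = throws.std(axis=0).mean()           # precision: how tightly the throws cluster
    print(f"{name:9s}  bias={bias:5.2f}  spread={spread:5.2f}")

# The precise board has a stable, measurable offset, so the next throw is
# predictable, and the offset can even be corrected once we know it.
corrected = precise - precise.mean(axis=0)
print("spread after correcting the offset:", round(float(corrected.std(axis=0).mean()), 2))
```

The point is not that bias is harmless; it is that a stable bias can be measured and adjusted for, while pure scatter cannot.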

Perfect Is the Enemy of Good
Similarly with data, we don’t need some theoretical “perfection” to be practical. When we have a large data set (and in the digital age, data sets are usually sufficiently large) with some level of random error in it, we still have precision in the aggregate, and we can predict that the customer will buy more bath soap than perfume.

Better yet, one of the beautiful things about statistics (and computers) is the ability to assess, measure and account for error and outliers, and still produce predictable outcomes.
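
Here is a toy sketch of that idea with entirely synthetic order values: plenty of random error and a few wild outliers, yet a robust statistic plus a bootstrapped error bar still produces a stable, quantified answer.

```python
# Hypothetical sketch: measuring and accounting for error and outliers.
import numpy as np

rng = np.random.default_rng(42)

true_avg_order_value = 60.0
orders = rng.normal(loc=true_avg_order_value, scale=15.0, size=5_000)  # random error
orders = np.append(orders, [9_999.0, 12_500.0, -500.0])                # data-entry outliers

naive_mean = orders.mean()
robust_median = np.median(orders)

# Quantify the uncertainty instead of pretending it away (bootstrap standard error).
boot_medians = np.array([np.median(rng.choice(orders, size=orders.size)) for _ in range(500)])

print(f"naive mean:    {naive_mean:7.2f}  (pulled around by the outliers)")
print(f"robust median: {robust_median:7.2f}  +/- {boot_medians.std():.2f}")
```

The estimate lands close to the underlying value, and, just as important, we know roughly how far off it could be.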

In marketing, especially at scale, we’re looking to optimize performance. Rarely do we get it truly and totally perfect, and we don’t need to — not just because we’re not building medical devices to implant in a person or bridges that millions will walk across, but because a few percentage points of improvement in profit can redefine the leader in a category.

The Bottom Line
In the example we began with, we used a fairly weak proxy for the “ideal” data we couldn’t get our hands on for our analysis. With all of its limitations, we were able to discover an opportunity to grow customer value and take share from competitors with an eight-figure return … and had we used only the data the organization already had on Day One, 18 months prior, that rate of return could have been double.

Marketers need to have a bias to action, and start using the data they have today. It is far too easy to succumb to a narrative that leads us down the path of inactivity and reactivity.

Clean and perfect may sound or feel good — but the corner office and a big promotion require action and results.

Don’t delay in the hopes of theoretical perfection that really never happens — take a shot and see what is actually feasible.

If someone, however well-intended, “scares you into inaction” over visions of some perfection, cleanliness or readiness of your raw data, perhaps progressive marketers have to start asking what — or who — is “just not ready.”