All’s Fair in Love, War, and Business — Addressing a Competitor’s Bold Moves

It’s no coincidence that Jack Dorsey, CEO of Twitter, announced that his platform would ban political ads less than an hour before Facebook’s much-anticipated third-quarter earnings call. While this is a business decision, and Dorsey will forgo revenue as a result, his move had a far-reaching marketing impact, especially given the timing.

Mark Zuckerberg doesn’t typically do well in the hot seat; however, he stood behind his policy, saying, “I don’t think it’s right for private companies to censor politicians and the news.”

I can only speculate, but it’s likely that Zuckerberg and his legal, marketing, PR, and investor relations teams held an eleventh-hour strategy session to prep and align on Facebook’s response ahead of the earnings call and for the days that followed.

Companies on the Defensive

It wasn’t the first time, nor will it be the last, that a competitor made a business decision that forced a company to either follow suit or defend its alternate position.

In early October, Charles Schwab made a surprising announcement that it would eliminate commission fees on online stock, ETF, and options trades. Hours later, TD Ameritrade announced it would also reduce fees to zero. E-Trade did the same the following day.

All is fair in love, war, and business. When a competitor makes a bold move, business leaders must make tough decisions that have major ramifications — financial, moral, ethical, and otherwise. Addressing the competition’s news requires a strategic marketing response.

Marketing and Communications Readiness to Counter the Competition

Whether your company plans to follow a competitor’s lead or chart its own path, marketing and communications readiness will ensure you communicate effectively with customers, prospects, and the general public about the matter.

Close Alignment With Leadership

A response to the competition’s news is more successful if marketing has a seat at the table with leaders as they make any related decision. The marketing team can be a sounding board on the reputational impact of the business decision and can help with the planning and strategy for the response to ensure the company’s position relative to the competition is clear.

Real-Time, Multichannel Response

Following Charles Schwab’s announcement, the companies that reduced their fees in step with Schwab needed to move quickly to retain their customers and ride the news wave. Again, alignment with leaders is crucial, because marketing teams can only move quickly if they have access to stakeholders and decision-makers.

Digital channels allow for the quickest turnaround for marketing efforts and a variety of levers must be pulled simultaneously to have the greatest impact. This requires a collaborative approach across marketing, sales, client relationship management, and other teams.

Strong PR Foundation

Well ahead of these circumstances, it is important to have a solid PR foundation, including approved subject matter experts who have been vetted, prepped, and coached.

Additionally, PR teams should be continuously building media relationships before there’s even news to share. Then, when it is time to participate in a relevant dialogue, the reporter knows your company and will be more likely to return calls or emails.

Reclaim the Competitive Advantage

There’s no way to anticipate every move your competitor will make. However, if you’re strategic and prepared, you can use your competitor’s news to your marketing advantage.

Perspectives Matter in Analytics

When we observe a certain phenomenon, we should never do so from just one angle. We’ve all heard the fable about blind men and an elephant, where each touched just one part of the animal and exclaimed, “Hey, this creature must be like a snake!” and “No, it feels like a thick column!” or “I’m sure it is like a big wall!” We certainly don’t want to fall into that trap.

In the world of marketing, however, so many people jump to conclusions with limited information from one perspective. Further, some even fool themselves into thinking that they have reached scientific conclusions because they employed data mining techniques. Unfortunately, just quoting numbers does not automatically make anyone more analytical, as numbers live within contexts. And with all these easy-to-use visualization tools, it is just as easy to misrepresent the figures.

When we try to predict the future – even the near future – things get even more complicated. It is hard enough to master the mathematical part of predictive analytics, but it gets harder when the data sources are seriously limited or, worse, skewed. When the data sources are contaminated with external factors other than consumer behavior, we may end up predicting outcomes based on the marketer’s actions, not on consumer behavior.

That is why procuring and employing multiple sources of data are so important in predictive analytics. Even when the mission is just to observe what is happening in the world, having multiple perspectives is essential. Simply put, who would mind the bird’s-eye view when reporting a high-speed car chase on TV news? It certainly enhances the picture. On the other hand, you would not feel the urgency on the ground without the camera installed on a police car.

I frequently drive from New Jersey to New York City during rush hour. (I have my reasons.) I have been tracking the number of minutes in driving time between every major turn. Not that it helps much in reducing overall commuting time, as there isn’t much I can do when sitting helplessly on a bridge. But I can predict the arrival time with reasonable accuracy. Now armed with smartphone apps that collect such data from everyone with the same applications (crowdsourcing at its best), we can predict the ETA to any destination with a margin of error narrower than a minute. That is great when I’m sitting in the car already. But do such analytics help me make decisions about whether I should have been in the car in the first place that morning? While it is great to have a navigator that tells me every turn that I should make, do all those data tell me if going to the city on the first day of school in September is the right decision? Hardly. I need a different perspective for that type of decision.
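For illustration, here is a minimal sketch of that segment-timing idea in Python. The segment names and minute values are entirely hypothetical, and real navigation apps obviously rely on far richer live data.

```python
from statistics import median

# Hypothetical per-segment drive times (minutes) logged over several commutes
segment_minutes = {
    "house_to_turnpike": [6, 7, 6, 8],
    "turnpike_to_bridge": [18, 22, 25, 19],
    "bridge_to_midtown": [18, 15, 21, 24],
}

def estimate_trip(history: dict) -> float:
    """Sum a typical (median) time for each segment to estimate the whole trip."""
    return sum(median(times) for times in history.values())

print(f"Estimated door-to-door time: {estimate_trip(segment_minutes):.0f} minutes")
```

This answers “When will I arrive?” quite well, but notice that no amount of segment data answers “Should I have made this trip today?”, which is the different perspective this type of decision requires.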

Every type of data and analytics has its place, and none is almighty. Marketers literally track every breath you take and every move you make when it comes to online activities. So-called “analytical solution providers” are making fortunes collecting data and analyzing them. Clickstream data are the major reason why data got so big; and, thanks to them, we started using the term “Big Data.” It is very difficult to navigate through this complex world, so marketers spend a great amount of time and resources to figure out where they stand. Weekly reports that come out of such data are easily hundreds of pages (figuratively), and before marketers get to understand all those figures, a new set of reports lands in their laps (again, metaphorically). It is like having to look at the dashboard of a car without a break while driving it at full speed. Such a cycle continues, and the analysts get into a perpetual motion of pumping out reports.

I am not discounting the value of such reporting at all. When a rocket ship is being launched, literally hundreds of people look at their screens simultaneously just to see how the process is going. However, if the rocket ship is in trouble, there isn’t much one can do by looking at the numbers other than to say, “Uh-oh, based on these figures, we have a serious engine problem right now.” And such reporting certainly does not tell anyone whether one should have launched the vehicle at that particular moment in time with that pre-set destination. Such analytics are completely different from analyzing every turn when moving at full speed.

Marketers get lost because they comb the given sets of numbers for answers, while the metrics and reports are designed for some other purpose. At times, we need to change the perspective completely. For instance, looking at every click will not provide accurate sales projections on a personal or product level. Once in a while such a prediction may be correct, but it can easily be thrown off by a slight jolt in the system. It gets worse when there is no direct correlation between clicks and conversions, as such things are heavily dependent upon business models and site design (i.e., actions of marketers, not buyers).

As I have emphasized numerous times in this series, analytical questions must be formed based on business questions, not the other way around. But too often, marketers seek answers to their questions within the limited data and reports they get to see. It is not impossible to gauge the speed of your vehicle based on the shape of your dog’s fur as he sticks his head out the window, but I wouldn’t recommend that method when the goal is to estimate time of arrival with a margin of error of less than a minute.

Not all analytics are the same, and different types of analytical objectives call for different types of data, big and small. To understand your surroundings, yes, you need some serious business intelligence with carefully designed dashboards, real-time or otherwise. To predict the future outcome, or to fill in the blanks (as there are lots of unknown factors, even in the age of Big Data), we must change the perspective and harness different sets of data. To determine the overall destination, we need yet another type of analytics at the macro level.

In the world of predictive analytics, predicting price elasticity, market trends or specific consumer behaviors all call for different types of data, techniques and specialists. Just within the realm of predicting consumer behavior, there are different levels of difficulty. At the risk of sounding too simplistic, I would say predicting “who” is relatively easier than predicting “what product.” Predicting “when” is harder than those two things combined: You may be able to predict “who” would be in the market for a “luxury vacation” with some confidence, but predicting “when” that person would actually purchase cruise ship tickets requires a different type of data, which is really hard to obtain with any consistency. The hardest one is predicting “why” people behave one way or the other. Let’s just say marketers should take anyone who claims to do that with a grain of salt. We may need to get into a deep discussion regarding “causality” and “correlation” at that point.

Even that relatively simple “who” part of prediction calls for some debate, with all kinds of data being pumped out every second. Some marketers employ data and toolsets based on availability and price alone, but let us step back for a second and look at it from a different perspective.

Hypothetically speaking, let’s assume we as marketers get to choose one superpower to predict who is more likely to buy your product at a mall, so that you can address your prospects properly (i.e., by delivering properly personalized messages). Your choices are:

  • You get to install a camera on everyone’s shoulder at the entrance of the mall
  • You get to have everyone’s past transaction history on an SKU level (who, when, for how much and for what product)

The choice behind Door No. 1 offers what we generally call clickstream data, which falls into the realm of Big Data. It will record literally every move that everyone makes, with a time stamp. The second choice is good old transaction data on a product level, and you may call it small data; though in this day and age, there is nothing so small about it. It is just relatively smaller in comparison to No. 1. Now, if your goal is to design the mall to optimize traffic patterns for sales, you surely need to pick No. 1. If your goal is to predict who is more likely to buy your product, I would definitely go with No. 2. Yes, some lady may be looking at shoes very frequently, but will she really make a purchase in that category? What does her personal transaction history say?

In reality, we may have to work just with No. 1, but if I had a choice in this hypothetical situation, I would opt for transaction data any time. In my co-op data business days, I looked through about 50 model documents per day for more than six years, and I have seen the predictive power of transaction data firsthand. If you can achieve accurate answers with smaller sets of data, why would you take any detour?
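To make the Door No. 2 argument concrete, here is a minimal sketch of a crude recency/frequency/monetary score computed from transaction history. Every row, weight, and threshold here is hypothetical; a real propensity model would be built far more rigorously.

```python
from datetime import date

# Hypothetical SKU-level rows: (customer_id, purchase_date, category, amount)
transactions = [
    ("c1", date(2014, 9, 2), "shoes", 120.0),
    ("c1", date(2014, 6, 15), "shoes", 85.0),
    ("c2", date(2013, 1, 20), "shoes", 40.0),
    ("c2", date(2014, 10, 1), "handbags", 200.0),
]

def category_propensity(txns, customer, category, today=date(2014, 11, 1)):
    """Crude recency/frequency/monetary score for one shopper in one category."""
    rows = [t for t in txns if t[0] == customer and t[2] == category]
    if not rows:
        return 0.0  # no category history; cameras may be all you have here
    recency_days = min((today - t[1]).days for t in rows)
    frequency = len(rows)
    monetary = sum(t[3] for t in rows)
    # Fresher, more frequent, higher-spending category buyers score higher
    return (365.0 / (recency_days + 1)) * frequency * (monetary / 100.0)

for cust in ("c1", "c2"):
    print(cust, round(category_propensity(transactions, cust, "shoes"), 2))
```

Even this toy score separates the recent, repeat category buyer from the one-time shopper of years past, which is exactly the signal the cameras on shoulders cannot provide.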

Of course, in real life, I would like to have both, because more varieties of data – not just these choices, but also demographic, geo-demographic, sentiment and attitudinal data – will help you zoom in on the target with greater accuracy, consistency and efficiency. In this example, if the potential customer is new to the mall, or has been dormant for a long time, you may have to work with just the cameras-on-shoulders data. But such a judgment should be made during the course of analytics, and should not be predetermined by marketers or IT folks before the analysis begins.

Not all datasets are created equal, and we need all kinds of data. Each set of data comes with tons of holes in it, and we need to fill such gaps with data from other sources, from different angles. Too often, marketers get too deep into the rabbit hole simply because they have been digging it for a long time. But once in a while, we all need to stick our heads out of the hole and have a different perspective.

Digging a hole in the wrong direction will not make anyone richer, and you will never see the end of it while you’re in it.

Today’s B-to-B Marketing: It’s a Lot Like Shark Tank

As a marketer, I understand the challenge of reaching business decision makers like me in a fresh and meaningful way, but I will tell you that as a focus group of one, I despise the direction marketers seem to be headed:

  • My LinkedIn inbox is overflowing with invitations to connect from people I don’t know, and I now choose NOT to connect with them because I know they’re simply going to try to sell me something based on their job description/profile.
  • Downloading a whitepaper of interest requires completing a form that includes my phone number, which means dealing with unwanted calls from a bored sales rep.
  • My regular inbox is stuffed with offers from strangers who want to set up meetings, desperate attempts to sell me data from unknown sources, and demands that I click links to view a video about revolutionary new technology that will “change the way I do business.”
  • If I express any interest at all in a product (attend a webinar, visit a tradeshow booth, download a spec sheet), I am relentlessly mobbed by emails and phone calls.

I get that sales folks have a job to do, so what’s the answer?

It’s called Lead Nurturing.

An organized and systematic way of building a relationship that will, over time, help turn a cold prospect into a warm prospect… and from a warm prospect into a hot prospect… and ultimately to a sale.

But excellence in lead nurturing seems to be a lost art form, as I haven’t encountered many companies that are doing it—let alone doing it well.

Best practices suggest that the marketer ask just a few questions at the outset of the relationship to determine the prospect’s pain point (the reason for their download or visit to your website or tradeshow booth) and the role the individual plays in the purchase process (influencer, part of a decision-making team, final decision maker).

Based on the answers to these and perhaps one or two other pertinent questions that help define your lead nurturing strategy (for example, industry or job title/function), leads should be scored and placed into an appropriate lead nurturing system that helps the marketer deliver ongoing content most relevant to that prospect.
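As a rough illustration of what “scored and placed” might look like, here is a minimal sketch in Python. The point values, answer categories, and hot/warm/cold thresholds are invented for this example, not industry standards.

```python
# Hypothetical point weights for the qualification answers discussed above
ROLE_POINTS = {"final decision maker": 30, "decision-making team": 20, "influencer": 10}
PAIN_POINTS = {"stated pain point": 25, "general research": 5}

def score_lead(role: str, pain: str, target_industry: bool) -> int:
    """Add up points for purchase role, stated pain, and industry fit."""
    score = ROLE_POINTS.get(role, 0) + PAIN_POINTS.get(pain, 0)
    if target_industry:
        score += 15
    return score

score = score_lead("influencer", "stated pain point", target_industry=True)
track = "hot" if score >= 50 else "warm" if score >= 30 else "cold"
print(score, track)  # prints: 50 hot; route the lead into the matching nurture track
```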

Best practices do NOT include asking questions about intent-to-purchase timeframes (God forbid you answer “in the near future,” as that will guarantee an instant follow-up call) or budget size (really? Do you think I’ll reveal that I have earmarked $100K on a form?).

Lead nurturing programs should include:

  • Additional assets that can be distributed via email: Content can include a competitive review, an article that’s relevant to the prospect’s vertical industry, research findings, videos that demonstrate how a product works, etc. These should NOT be sales literature, but rather should help the company position itself as an expert in its field. This, in turn, helps build credibility and trust (key components in a B-to-B purchase).
  • Invitations to webinars where a particular topic is explored. Webinars should include speakers from OUTSIDE the sponsoring organization to give the topic value and ensure the attendee isn’t just signing up for a sales pitch.
  • Invitations to breakfast or luncheon roundtable discussions: Bring in a speaker of interest and discuss a topic that is most relevant to your audience (especially if it’s industry specific).

Over the course of time, you’ll be able to ask additional questions and gain additional insights into your prospect pool, helping you become more familiar with prospects and the problems they’re trying to solve.

After all, don’t we all want to do business with people we know and like? The reality is, it is highly unlikely that I’m ready to buy after one simple download, so stop treating me like a piece of meat that has fallen into a tank full of hungry sharks.

5 Numeric Speed Bumps to Higher Conversion

Ah, the holiday season. Your prospects are moving fast these days in an always-on world, with all the trimmings of distractions and stress. Fast thinking normally trumps slow thinking, yet sometimes you need to slow down thinking long enough to convert your prospect into a paying customer. Your most challenging task during these last days before the holidays may be slowing down your prospective customers just enough that they don’t skip over your sales message.

Fast thinking is always on. Fast thinking is instinctive and automatic. Whatever pops into the mind of your prospect often happens with no voluntary control. And sometimes fast thinking works in your favor with a quick, impulsive decision to buy.

But, not always.

So, as you set out to grab attention during these frenzied times, remember that when the mind is in fast thinking mode, short, simple sentences, with short words, are more effective. Content that’s breezy in style usually prevails over hard-to-read copy. And this helps to explain why it’s best to write copy that is readable at about a ninth- or tenth-grade level.

But how do you get the fast thinker to slow down when you want them to make a decision?

Here’s where you can create speed bumps in your message, so the mind doesn’t slide down its established memory grooves too quickly and pass you by.

One way to get attention is by introducing numbers. Numbers—especially dollars and cents—are effective speed bumps.

For people to respond to numeric data effectively, they need to be able to do three things:

  1. Comprehend the number.
  2. Interpret it in proper context.
  3. Act on it.

When our daughters were small children, one of the ways I discovered to get them out of an emotional tantrum was to ask a question requiring a numerical answer. Questions like “How old are you?” or “How old will you be on your next birthday?” worked like a charm to move our kids from a right-brain emotional state to a left-brain logical state and slow down their impulsive thinking.

So, when using numbers in marketing copy, you can slow down the readers’ thinking with these five speed bumps:

  1. Ask a question that requires a numeric answer.
  2. Reveal pricing in small chunks, such as a cost per day.
  3. Display discounts in dollars, not percentages. Not everyone quickly grasps that 30 percent off a $100 item equals $30. Better to say “save $30.00” (see the sketch after this list).
  4. Illustrate improvement or satisfaction increases using specific numbers. Better: give numbers visual life in charts or graphs.
  5. Guarantee your product or service for a specific number of days (more time, such as 60 or 90, is stronger than 30 days).
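Here is a minimal sketch of speed bump No. 3 in Python; the price and discount figures are illustrative only.

```python
def dollars_off_copy(price: float, pct_off: float) -> str:
    """Convert a percentage discount into 'Save $X' copy the reader grasps instantly."""
    savings = price * pct_off / 100.0
    return f"Save ${savings:,.2f}"

print(dollars_off_copy(100.00, 30))  # prints 'Save $30.00': no mental math required
```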

All said, you may be able to get a prospect to make a purchase decision in your favor from snap thinking and decision-making (and if you can close them quickly, then why not?). But most people don’t act that impulsively. And impulsive decisions are a slippery slope to buyer’s remorse. Slow them down, if you can, in these final days before the holidays with a few strategically placed speed bumps.

Smart Data – Not Big Data

As a concerned data professional, I am already plotting an exit strategy from this Big Data hype. Because like any bubble, it will surely burst. That inevitable doomsday could be a couple of years away, but I can feel it coming. At the risk of sounding too much like Yoda the Jedi Grand Master, all hypes lead to over-investments, all over-investments lead to disappointments, and all disappointments lead to blames. Yes, in a few years, lots of blames will go around, and lots of heads will roll.

So, why would I stay on the troubled side? Well, because, for now, this Big Data thing is creating lots of opportunities, too. I am writing this on my way back from Seoul, Korea, where I presented this Big Data idea nine times in just two short weeks, trotting from large venues to small gatherings. Just a few years back, I used to have a hard time explaining what I do for a living. Now, I just have to say “Hey, I do this Big Data thing,” and the doors start to open. In my experience, this is the best “Open Sesame” moment for all data specialists. But it will last only if we play it right.

Nonetheless, I also know that I will somehow continue to make a living setting data strategies, fixing bad data, designing databases and leading analytical activities, even after the hype cools down. Just with a different title, under a different banner. I’ve seen buzzwords come and go, and this data business has been carried on by the people who cut through each hype (and the gargantuan amount of BS that comes along with it) and create real revenue-generating opportunities. At the end of the day (I apologize for using this cliché), it is all about the bottom line, whether it comes from a revenue increase or cost reduction. It is never about the buzzwords that may have created the business opportunities in the first place; it has always been more about the substance that turned those opportunities into money-making machines. And substance needs no fancy title or buzzwords attached to it.

Have you heard Google or Amazon calling themselves “Big Data” companies? They are the ones with sick amounts of data, but they also know that it is not about the sheer amount of data; it is all about the user experience. “Wannabes” who are not able to understand the core values often hang onto buzzwords and hypes. As if Big Data, Cloud Computing or the coding language du jour will come and save the day. But they are just words.

Even the name “Big Data” is all wrong, as it implies that bigger is always better. The 3 Vs of Big Data—volume, velocity and variety—are also misleading. That could be a meaningful distinction for existing data players, but for decision-makers, it gives the notion that size and speed are the ultimate quest. For the users, though, small is better. They don’t have time to analyze big sets of data. They need small answers in fun-size packages. Plus, why are big and fast new? Since the invention of modern computers, has there been any year when the processing speed did not get faster and storage capacity did not get bigger?

Lest we forget, it is the software industry that came up with this Big Data thing. It was created as a marketing tagline. We should have read it as, “Yes, we can now process really large amounts of data, too,” not as, “Big Data will make all your dreams come true.” If you are in the business of selling toolsets, of course, that is how you present your product. If guitar companies keep emphasizing how hard it is to be a decent guitar player, would that help their businesses? It is a lot more effective to say, “Hey, this is the same guitar that your guitar hero plays!” But you don’t become Jeff Beck just because you bought a white Fender Stratocaster with a rosewood neck. The real hard work begins “after” you purchase a decent guitar. However, this obvious connection is often lost in the data business. Toolsets never provide solutions on their own. They may make your life easier, but you’d still have to formulate the question in a logical fashion, and still have to make decisions based on provided data. And harnessing meanings out of mounds of data requires training of your mind, much like the way musicians practice incessantly.

So, before business people even consider venturing into this Big Data hype, they should ask themselves “Why data?” What are the burning questions that you are trying to answer with the data? If you can’t answer this simple question, then don’t jump into it. Forget about it. Don’t get into it just because everyone else seems to be getting into it. Yeah, it’s a big party, but why are you going there? Besides, if you formulate the question properly, often you will find that you don’t need Big Data all the time. In fact, Big Data can be a terrible detour if your question can be answered by “small” data. But that happens all the time, because people approach their business questions through the processes set by the toolsets. Big Data should be about the business, not about the IT or data.

Smart Data, Not Big Data
So, how do we get over this hype? All too often, perception rules, and a replacement word becomes necessary to summarize the essence of the concept for the general public. In my opinion, “Big Data” should have been “Smart Data.” Piles of unorganized dumb data aren’t worth a damn thing. Imagine a warehouse full of boxes with no labels, collecting dust since 1943. Would you be impressed with the sheer size of the warehouse? Great, the ark that Indiana Jones procured (or did he?) may be stored in there somewhere. But if no one knows where it is—or, even if it can be located, what to do with it—who cares?

Then, how do data get smarter? Smart data are bite-sized answers to questions. A thousand variables could have been considered to provide the weather forecast that calls for a “70 percent chance of scattered showers in the afternoon,” but that one line that we hear is the smart piece of data. Not the list of all the variables that went into the formula that created that answer. Emphasizing the raw data would be like giving paints and brushes to a person who wants a picture on the wall. As in, “Hey, here are all the ingredients, so why don’t you paint the picture and hang it on the wall?” Unfortunately, that is how the Big Data movement looks now. And too often, even the ingredients aren’t all that great.

I visit many companies only to find that the databases in question are just messy piles of unorganized and unstructured data. And please do not assume that such disarray is good for my business. I’d rather spend my time harnessing meanings out of data and creating value, not taking care of someone else’s mess all the time. Really smart data are small, concise, clean and organized. Big Data should only be seen in “Behind the Scenes” types of documentaries for maniacs, not for everyday decision-makers.

I have been saying for some time that Big Data must get smaller (refer to “Big Data Must Get Smaller”), and I will repeat it until it becomes a movement of its own. The Big Data movement must be about:

  1. Cutting down the noise
  2. Providing the answers

There is too much noise in the data, and cutting it out is the first step toward making the data smaller and smarter. The trouble is that the definition of “noise” is not static. The rock music that I grew up with was certainly noise to my parents’ generation. In turn, some music that my kids listen to is pure noise to me. Likewise, “product color,” which is essential for a database designed for an inventory management system, may or may not be noise if the goal is to sell more apparel items. In such cases, more important variables could be style, brand, price range, target gender, etc., while color could be just peripheral information at best, or even noise (as in, “Uh, she isn’t going to buy just red shoes all the time, is she?”). How do we then determine the difference? First, set clear goals (as in, “Why are we playing with the data to begin with?”), define the goals using logical expressions, and let mathematics take care of it. Now you can drop the noise with conviction (even if it may look important to human minds).
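As a minimal sketch of letting mathematics drop the noise, consider the toy example below in Python: once the goal (“predict apparel purchase”) is defined, a variable is kept only if it moves with that goal. The variables, values, and cutoff are all hypothetical, and a real project would use proper feature-selection methods on real data.

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical shopper rows; "bought" (1/0) encodes the defined goal
rows = [
    {"style_match": 0.9, "price_fit": 0.8, "color_red": 1.0, "bought": 1},
    {"style_match": 0.2, "price_fit": 0.3, "color_red": 1.0, "bought": 0},
    {"style_match": 0.8, "price_fit": 0.7, "color_red": 0.0, "bought": 1},
    {"style_match": 0.1, "price_fit": 0.4, "color_red": 0.0, "bought": 0},
]

def keep_variable(name: str, cutoff: float = 0.3) -> bool:
    """Keep a variable only if it correlates with the goal; otherwise call it noise."""
    xs = [r[name] for r in rows]
    ys = [r["bought"] for r in rows]
    return abs(correlation(xs, ys)) >= cutoff

for var in ("style_match", "price_fit", "color_red"):
    print(var, "keep" if keep_variable(var) else "noise")  # color_red comes out as noise
```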

If we continue down that mathematical path, we reach the second part, which is “providing answers to the question.” And the smart answers come in the form of yes/no answers, probability figures or some type of score. As in the weather forecast example, the question would be “chance of rain on a certain day” and the answer would be “70 percent.” Statistical modeling is not easy or simple, but it is the essential part of making the data smarter, as models are the most effective way to summarize complex and abundant data into compact forms (refer to “Why Model?”).
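Here is a minimal sketch of that summarizing step: a toy logistic regression in Python using scikit-learn. The weather variables and labels are synthetic and far too few for a real forecast; the point is only the shape of the workflow, where many inputs go in and one small probability comes out.

```python
from sklearn.linear_model import LogisticRegression

# Synthetic history: [humidity %, pressure drop, cloud cover %]; 1 = it rained
X = [[85, 4, 90], [40, 0, 10], [70, 3, 80], [30, 1, 20], [90, 5, 95], [50, 0, 30]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# Hundreds of variables could feed the model; the user sees one smart answer
chance_of_rain = model.predict_proba([[80, 3, 85]])[0][1]
print(f"{chance_of_rain:.0%} chance of rain")
```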

Most people do not have degrees in mathematics or statistics, but they all know what to do with a piece of information such as “70 percent chance of rain” on the day of a company outing. Some may complain that it is not a definite yes/no answer, but all would agree that providing information in this form is more humane than dumping all the raw data onto users. Sales folks are not necessarily mathematicians, but they would certainly appreciate scores attached to each lead, as in “more or less likely to close.” No, that is not a definite answer, but now salespeople can start calling the leads in the order of relative importance to them.

So, all the Big Data players and data scientists must try to “humanize” the data, instead of bragging about the size of the data, making things more complex, and providing irrelevant pieces of raw data to users. Make things simpler, not more complex. Some may think that complexity is their job security, but I strongly disagree. That is a sure way to bring this Big Data movement to the ground. We are already living in a complex world, and we certainly do not need more complications around us (more on “How to be a good data scientist” in a future article).

It’s About the Users, Too
On the flip side, the decision-makers must change their attitude about the data, as well.

1. Define the goals first: The main theme of this series has been that the Big Data movement is about the business, not IT or data. But I’ve seen too many business folks who willingly take a hands-off approach to data. They just fund the database; do not define clear business goals for developers; and hope to God that someday, somehow, some genius will show up and clear up the mess for them. Guess what? That cavalry is never coming if you are not even praying properly. If you do not know what problems you want to solve with data, don’t even get started; you will get nowhere, really slowly, bleeding lots of money and time along the way.

2. Take the data seriously: You don’t have to be a scientist to have a scientific mind. It is not ideal if someone blindly subscribes to anything computers spew out (there is lots of inaccurate information in databases; refer to “Not All Databases Are Created Equal”). But too many people do not take data seriously and continue to follow their gut feelings. Even if the customer profile coming out of a serious analysis does not match your preconceived notions, do not blindly reject it; instead, treat it as a newly found gold mine. Gut feelings are even more overrated than Big Data.

3. Be logical: Illogical questions do not lead anywhere. There is no toolset that reads minds—at least not yet. Even if we get such amazing computers—as seen on “Star Trek” or in other science fiction movies—you would still have to ask questions in a logical fashion for them to be effective. I am not asking decision-makers to learn how to code (or be like Mr. Spock or his loyal follower, Dr. Sheldon Cooper), but to have some basic understanding of logical expressions and to learn how analysts communicate with computers. This is not a data geek vs. non-geek world anymore; we all have to be a little geekier. Knowing Boolean expressions may not be as cool as being able to throw a curveball, but it is necessary to survive in the age of information overload.
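To show how little geekiness this actually requires, here is a minimal sketch of a business request spelled out as a Boolean expression in Python. The field names and thresholds are hypothetical; the point is that a fuzzy request like “good lapsed customers” must become unambiguous logic before any toolset can act on it.

```python
def is_target(customer: dict) -> bool:
    """'Lapsed but valuable' expressed as an unambiguous Boolean expression."""
    return (
        customer["months_since_last_purchase"] >= 12  # lapsed
        and customer["lifetime_spend"] > 500          # valuable
        and not customer["email_opt_out"]             # contactable
    )

print(is_target({"months_since_last_purchase": 18,
                 "lifetime_spend": 900,
                 "email_opt_out": False}))  # True
```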

4. Shoot for small successes: Start with a small proof of concept before fully investing in large data initiatives. Even with a small project, one gets to touch all the necessary steps to finish the job. Understanding the flow of information is as important as each specific step, as most breakdowns occur between steps, due to a lack of proper connections. There was the Gemini program before the Apollo missions. Learn how to dock spaceships in space before plotting a course to the moon. Often, over-investments are committed when the discussion is led by IT. Outsource even major components in the beginning, as the initial goal should be mastering the flow of things.

5. Be buyer-centric: No customer is bound by the channel of the marketer’s choice, and yet many businesses act exactly that way. No one is an online person just because she has not refused your email promotions yet (refer to “The Future of Online is Offline”). No buyer is just one-dimensional. So get out of brand-, division-, product- or channel-centric mindsets. Even well-designed, buyer-centric marketing databases become ineffective if users are trapped in their channel- or division-centric attitudes, as in “These email promotions must flow!” or “I own this product line!” The more data we collect, the more chances marketers gain to impress their customers and prospects. Do not waste those opportunities by imposing your own myopic views on them. The Big Data movement is not there to fortify marketers’ bad habits. Thanks to the size of the data and the speed of machines, we are now capable of disappointing a lot of people really fast.

What Did This Hype Change?
So, what did this Big Data hype change? First off, it changed people’s attitudes about the data. Some are no longer afraid of large amounts of information being thrown at them, and some actually started using them in their decision-making processes. Many realized that we are surrounded by numbers everywhere, not just in marketing, but also in politics, media, national security, health care and the criminal justice system.

Conversely, some people became more afraid—often with good reason. But even more often, people react out of pure fear that their personal information is being actively exploited without their consent. While data geeks are rejoicing in the age of open source and cloud computing, many more are looking at this hype with deep suspicion, and they boldly reject storing any personal data in those obscure “clouds.” Some people don’t even sign up for E-ZPass and voluntarily stay in the long lane to pay tolls the old, untraceable way.

Nevertheless, not all is lost in this hype. The data got really big, and types of data that were previously unavailable, such as mobile and social data, became available to many marketers. Focus groups are now the size of a company’s or a subject’s Twitter following. The collection of POS (point of sale) data has been steadily increasing, and some data players became virtuosi in using such fresh and abundant data to impress their customers (though some crossed that “creepy” line inadvertently). Different types of data are being used together now, and such merging activities will compound the predictive power even further. Analysts are dealing with less missing data, though no dataset will ever be totally complete. Developers in open source environments are now able to move really fast with new toolsets that run on any device. Simply, things that the forefathers of direct marketing used to take six months to complete can be done in a few hours, and in the near future, maybe within a few seconds.

And that may be a good thing and a bad thing. If we do this right, without creating too many angry consumers and without burning holes in our budgets, we are in a position to achieve a great many things in terms of predicting the future and making everyone’s lives a little more convenient. If we screw it up badly, we will end up creating lots of angry customers by abusing sensitive data and, at the same time, wasting a whole lot of investors’ money. Then this Big Data thing will go down in history as a great money-eating hype.

We should never do things just because we can; data is a powerful tool that can hurt real people. Do not even get into it if you don’t have a clear goal in terms of what to do with the data; it is not some piece of furniture that you buy just because your neighbor bought it. Living with data is a lifestyle change, and it requires a long-term commitment; it is not some fad that you try once and give up. It is a continuous loop where people’s responses to marketer’s data-based activities create even more data to be analyzed. And that is the only way it keeps getting better.

There Is No Big Data
And all that has nothing to do with “Big.” If done right, small data can do plenty. And in fact, most companies’ transaction data for the past few years would easily fit in an iPhone. It is about what to do with the data, and that goal must be set from a business point of view. This is not just a new playground for data geeks, who may care more for new hip technologies that sound cool in their little circle.

I recently went to Brazil to speak at a data conference called QIBRAS, and I was pleasantly surprised that the main theme of it was the quality of the data, not the size of the data. Well, at least somewhere in the world, people are approaching this whole thing without the “Big” hype. And if you look around, you will not find any successful data players calling this thing “Big Data.” They just deal with small and large data as part of their businesses. There is no buzzword, fanfare or a big banner there. Because when something is just part of your everyday business, you don’t even care what you call it. You just do. And to those masters of data, there is no Big Data. If Google all of a sudden starts calling itself a Big Data company, it would be so uncool, as that word would seriously limit it. Think about that.

Death of the Salesman

There’s no question that the Willy Lomans of this world have been dying a slow, agonizing death—only instead of losing the fight to travel exhaustion, the opponent is the Internet.

According to a recent CEB article in the Harvard Business Review, 57 percent of purchase decisions are made before a customer ever talks to a supplier, and Gartner Research predicts that by 2020, customers will manage 85 percent of their relationship with an enterprise without interacting with a human. That shouldn’t surprise anyone since we spend much of our days tapping on keyboards or flicking our fingers across tiny screens.

In Willy’s day, the lead generation process would have consisted of making a phone call, setting up an appointment, hopping a plane to the prospect’s office, and dragging a sample case through the airport. In the 1980s, that sample case turned into an overhead projector, then a slide projector and a laptop, and finally a mini projector linked to a mobile device or thumb drive. In 2014, salespeople are lucky if they can connect with a prospect on a video conferencing call.

Clearly the days of gathering in a conference room for the sales pitch are long gone. We’ve always known that salespeople talk too much, and buyers, who’ve never had the patience to listen, now have the tools to avoid them altogether: websites, whitepapers, case studies, videos, LinkedIn groups, webcasts—virtually anything and everything to avoid talking to sales.

As a result, the sales function has now been placed squarely in the hands of the content strategists and creators. And yes, that means that the sales function is now in the hands of marketing.

Now a different problem exists. Most marketing folks don’t know how to help the buyer along their journey, because that’s not how they’ve been trained. They have no idea how different types of buyers think, how they search for information, or how they make decisions, so they don’t know how to create or position content in a meaningful and relevant way—and that’s long been the complaint of sales. In their opinion, all marketing does is churn out “fluff” that is irrelevant to a serious buyer.

Now marketers must step up and really understand how to optimize marketing tools in order to help that buyer reach the right brand decision at the end of their journey. That’s really why content has become the marketing buzzword.

And just as we despised the salesman who talked too much, potential buyers despise content that is full of sales-speak. While a product brochure has a purpose, it is not strategic content. Similarly, a webinar in which most of the supporting slides are simply advertising for the product turns off participants, who quickly express their displeasure via online chat tools to the host and by logging out of the event.

Great content should seek to:

  • Be authentic: What you say needs to sound genuine and ring true—no one believes you are the only solution to a problem. On the contrary, the discovery process is all about evaluating your options (the pros and the cons). Avoiding a question because your answer may reveal the flaws of your product or service only shines a spotlight on the issue. Honesty is always the best policy.
  • Be relevant: Share insightful information that leverages your expertise and experience; help the buyer connect the dots. “How to” articles are popular, as are comparison charts—if you’re not going to do it, the prospect will be doing it for themselves anyway, so why not help by pointing out comparison points (that benefit your product) they might not have previously considered?
  • Be timely: To get a leg up in the marketplace, you need to be prepared to add value when the timing is ripe. It’s highly unlikely that your marketplace hasn’t changed in the last 50 years. Help show buyers how your product/service is relevant in today’s marketplace—how it deals with challenges you know they’re facing or are going to face tomorrow.

Smart marketers have a lead nurturing strategy in place—an organized and logical method of sharing relevant content along the buy cycle. And that content is well written and segmented by type of decision maker. The CFO has a different set of evaluation criteria from the CEO and the CTO. Business owners look at purchase decisions through a completely different lens than a corporate manager.

Depending on the industry, business buyers have different problems they’re trying to solve, so generic content has less relevance than content that addresses specific issues in an industry segment. Those in healthcare, for example, perceive a problem from a different perspective than those in transportation.

The new name of the selling game is “Educate the Buyer—but in a helpful and relevant way.” And while Willy Loman may continue to sit at his desk making cold calls or sending out prospecting emails, the reality is that nobody has the patience or interest to listen to his sales pitch anymore. So marketers need to step up and accept responsibility for lead generation, lead nurturing and, in many instances, closing the sale.

How (Not) to Run an Agency RFP

Over the last several years, I’ve noticed an alarming trend in the RFP process—and I’ll boil it down to three words: Lack of respect.

Agencies are always delighted when invited to participate in a Request for Proposal (RFP) process. While many may choose not to engage due to client conflict or the belief that their likelihood of being awarded the contract is nominal—or because the budget outlined in the brief doesn’t come close to paying for the amount of work that will be required to achieve the client’s objectives—those that do participate have an expectation that the process will be fair and somewhat transparent.

Any agency worth its salt invests significant time, energy and out-of-pocket expense in a new business pitch. Whether it’s the early stages of completing the “competency” response (where the focus is on written information that provides an overview of the agency, relevant case studies, industry experience, team bios, etc.) or a later stage of preparing for a face-to-face pitch, net-net, it takes a lot of hard work to prepare a smart, tightly integrated response that will put your firm in the best possible light with the target decision makers. After all, we’re all supposed to be marketing experts, and if we can’t market ourselves properly to a target audience of our peers, what kind of marketers are we?

That aside, we were recently included in three separate searches for a new agency, and they shared a common trait—the big, black hole.

We received the RFP, spent countless hours researching the brand to fully understand its point of differentiation, talked to past and current customers, participated in the Q&A process, coordinated with partners to fill in some capabilities gaps, and attempted to understand the financial metrics to ensure we could provide intelligent and thoughtful solutions that would actually yield a positive ROI. After weeks of work, we carefully assembled our response, printed multiple copies, bound the decks and paid a courier to deliver it on the designated date to the client’s location.

The next milestone in the RFP was that agencies that made it to the next round would be notified by XX/XX/XX.

Despite emails and phone calls to the RFP contact, we never heard a peep … even weeks and weeks after the deadline had passed.

In one instance, we finally got a junior staffer on the phone who told us the search had been cancelled and they had renewed their contract with the incumbent—apparently they shopped around and convinced themselves there was no one better, but didn’t have the courage to let each participant know of their decision. But why? Afraid we’re going to try and talk them out of their decision?

In another instance, we finally got an email from a procurement officer advising us that the RFP had been cancelled—period—no other explanation. After a little sleuthing, we figured out the company hired a new marketing director in the middle of the search, and they probably wanted to regroup before proceeding. Fair enough—but don’t leave us all hung out to dry!

In a third instance, we finally tracked down an insider who told us the marcom team was going through a reorganization, and no one knew what was happening. Gosh. So glad I invested in THAT opportunity!

I’ve also noticed that clients running RFPs are often ill-equipped to conduct the search properly. When we go through the Q&A process, they can’t seem to answer key questions that will drive strategically smart solutions. Or even basic things like:

  • Why are you looking for a new agency?
  • What are the biggest marketing challenges you’re facing today and, if you know, in the future?
  • What marketing efforts are you executing currently that are working and not working and why?
  • Who is your target audience—SPECIFICALLY?
  • What are your business metrics?
    • What is a new customer worth?
    • What is your churn rate?
    • How many products/services does a typical customer own?

The more you can share during the RFP process, the more likely you are to get intelligent, insightful ideas that can make a real difference to your business. And yes, that takes signing mutual NDAs, investing real time and energy into the review process, and working with agency teams to discover who feels like a good “fit” and brings fresh ideas that seem viable to your business.

It’s NOT a fishing expedition for free creative. (Would you go to a doctor and ask for a diagnosis without paying?) It’s NOT an exercise to freak out your incumbent so they’ll work harder/reduce their fees/change the way they do business. If that’s what you want, tell them that’s what you need, and if they don’t deliver, advise them you’re going to search for a replacement and that they needn’t participate as you have no intention of keeping the business with them.

After all, we’d all prefer not to work long nights and weekends if we don’t have a hope of winning. That’s just plain respectful.

Not All Databases Are Created Equal

Not all databases are created equal. No kidding. That is like saying that not all cars are the same, or not all buildings are the same. But somehow, “judging” databases isn’t so easy. First off, there is no tangible “tire” that you can kick when evaluating databases or data sources. Actually, kicking the tire is quite useless, even when you are inspecting an automobile. Can you really gauge the car’s handling, balance, fuel efficiency, comfort, speed, capacity or reliability based on how it feels when you kick “one” of the tires? I can guarantee that your toes will hurt if you kick it hard enough, and even then you won’t be able to tell the tire pressure within 20 psi. If you really want to evaluate an automobile, you will have to sign some papers and take it out for a spin (well, more than one spin, but you know what I mean). Then, how do we take a database out for a spin? That’s when the tool sets come into play.

However, even when the database in question is attached to analytical, visualization, CRM or drill-down tools, it is not so easy to evaluate it completely, as such practice reveals only a few aspects of a database, hardly all of them. That is because such tools are like window treatments of a building, through which you may look into the database. Imagine a building inspector inspecting a building without ever entering it. Would you respect the opinion of the inspector who just parks his car outside the building, looks into the building through one or two windows, and says, “Hey, we’re good to go”? No way, no sir. No one should judge a book by its cover.

In the age of Big Data (you should know by now that I am not too fond of that term), everything digitized is considered data. And data reside in databases. And databases are supposed to be designed to serve specific purposes, just as buildings and cars are. Although many modern databases are just mindless piles of accumulated data, provided that the database design is decent and functional, we can still imagine many different types of databases depending on their purposes and contents.

Now, most of the Big Data discussions these days are about the platform, environment, or tool sets. I’m sure you heard or read enough about those, so let me boldly skip all that and their related techie words, such as Hadoop, MongoDB, Pig, Python, MapReduce, Java, SQL, PHP, C++, SAS or anything related to that elusive “cloud.” Instead, allow me to show you the way to evaluate databases—or data sources—from a business point of view.

For businesspeople and decision-makers, it is not about NoSQL vs. RDB; it is just about the usefulness of the data. And the usefulness comes from the overall content and database management practices, not just platforms, tool sets and buzzwords. Yes, tool sets are important, but concert-goers do not care much about the types and brands of musical instruments that are being used; they just care if the music is entertaining or not. Would you be impressed with a mediocre guitarist just because he uses the same brand of guitar that his guitar hero uses? Nope. Likewise, the usefulness of a database is not about the tool sets.

In my past column, titled “Big Data Must Get Smaller,” I explained that there are three major types of data, with which marketers can holistically describe their target audience: (1) Descriptive Data, (2) Transaction/Behavioral Data, and (3) Attitudinal Data. In short, if you have access to all three dimensions of the data spectrum, you will have a more complete portrait of customers and prospects. Because I already went through that subject in-depth, let me just say that such types of data are not the basis of database evaluation here, though the contents should be on top of the checklist to meet business objectives.

In addition, throughout this series, I have been repeatedly emphasizing that the database and analytics management philosophy must originate from business goals. Basically, the business objective must dictate the course for analytics, and databases must be designed and optimized to support such analytical activities. Decision-makers—and all involved parties, for that matter—suffer a great deal when that hierarchy is reversed. And unfortunately, that is the case in many organizations today. Therefore, let me emphasize that the evaluation criteria that I am about to introduce here are all about usefulness for decision-making processes and supporting analytical activities, including predictive analytics.

Let’s start digging into key evaluation criteria for databases. This list would be quite useful when examining internal and external data sources. Even databases managed by professional compilers can be examined through these criteria. The checklist could also be applicable to investors who are about to acquire a company with data assets (as in, “Kick the tire before you buy it.”).

1. Depth
Let’s start with the most obvious one. What kind of information is stored and maintained in the database? What are the dominant data variables, and what is so unique about them? Variety of information matters, for sure, and uniqueness is often related to the specific business purposes for which databases are designed and created, along the lines of business data, international data, specific types of behavioral data like mobile data, categorical purchase data, lifestyle data, survey data, movement data, etc. Then again, a mindless compilation of random data may not be useful for any business, regardless of its size.

Generally, data dictionaries (the lack of one is a sure sign of trouble) reveal the depth of the database, but we need to dig deeper, as transaction and behavioral data are much more potent predictors, and harder to manage, in comparison to demographic and firmographic data, which are very much commoditized already. Likewise, lifestyle variables derived from surveys that may have been conducted long ago are far less valuable than actual purchase history data, as what people say they do and what they actually do are two completely different things. (For more details on the types of data, refer to the second half of “Big Data Must Get Smaller.”)

Innovative ideas should not be overlooked, as data packaging is often very important in the age of information overflow. If someone or some company transformed many data points into user-friendly formats using modeling or other statistical techniques (imagine pre-developed categorical models targeting a variety of human behaviors, or pre-packaged segmentation or clustering tools), such effort deserves extra points, for sure. As I emphasized numerous times in this series, data must be refined to provide answers to decision-makers. That is why the sheer size of the database isn’t so impressive, and the depth of the database is not just about the length of the variable list and the number of bytes that go along with it. So, data collectors, impress us—because we’ve seen a lot.

2. Width
No matter how deep the information goes, if the coverage is not wide enough, the database becomes useless. Imagine well-organized, buyer-level POS (point-of-sale) data coming from actual stores in “real-time” (though I am sick of that word, too, as it is equally overused). The data go down to SKU-level details and payment methods. Now imagine that the data in question are collected in only two stores—one in Michigan, and the other in Delaware. This, by the way, is not a completely made-up story; I have faced similar cases in the past. Needless to say, we had to make many assumptions that we didn’t want to make in order to make the data useful, somehow. And I must say that it was far from ideal.

Even in this age when data are collected everywhere by every device, no dataset is ever complete (refer to “Missing Data Can Be Meaningful”). The limitations are everywhere. They could be about brand, business footprint, consumer privacy, data ownership, collection methods, technical limitations, distribution of collection devices; the list goes on. Yes, Apple Pay is making a big splash in the news these days. But would you believe that data collected only through Apple iPhones can really show the overall consumer trend in the country? Maybe in the future, but not yet. If you could pick only one credit card type to analyze, such as American Express, would you think the result of the study would be free from bias? No siree. We can safely assume that such an analysis would skew toward the more affluent population. I am not saying that such analyses are useless; in fact, they can be quite useful if we understand the limitations of the data collection and the nature of the bias. But the point is that coverage matters.

Further, even within multisource databases in the market, the coverage should be examined variable by variable, simply because some data points are really difficult to obtain, even for professional data compilers. For example, any information that crosses between the business and consumer worlds is sparsely populated in many cases, and the “occupation” variable remains mostly blank or unknown on the consumer side. Similarly, any data related to young children are difficult, or even forbidden, to collect, so a seemingly simple variable, such as “number of children,” is left unknown for many households. Automobile data used to be abundant on a household level, but a series of laws has made such data off-limits to many users. Again, don’t be impressed by the mere existence of some variables on the data menu; look into them to see “how much” is actually available.
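Since “how much” is an empirical question, a simple fill-rate tally answers it quickly. Here is a minimal sketch in Python; the household fields and the blank-value convention are hypothetical, and a real audit would need source-specific rules for what counts as “unknown.”

```python
def fill_rates(records, variables):
    """Share of records with a usable (non-blank) value for each variable."""
    blanks = ("", None, "unknown")  # hypothetical blank-value convention
    n = len(records)
    return {v: round(sum(1 for r in records if r.get(v) not in blanks) / n, 3)
            for v in variables}

# Toy household records illustrating typically sparse variables
households = [
    {"age": 45,   "occupation": "",        "num_children": None},
    {"age": 37,   "occupation": "teacher", "num_children": 2},
    {"age": None, "occupation": "",        "num_children": None},
]
print(fill_rates(households, ["age", "occupation", "num_children"]))
# {'age': 0.667, 'occupation': 0.333, 'num_children': 0.333}
```

Run the same tally on every candidate data source, and the data menu’s marketing gloss gives way to the real picture.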

3. Accuracy
In any scientific analysis, a “false positive” is a dangerous enemy. In fact, it is worse than not having the information at all. Many folks just assume that any data coming out of a computer are accurate (as in, “Hey, the computer says so!”). But data are not completely free from human errors.

Sheer accuracy of information is hard to measure, especially when the data sources are unique and rare. And errors can happen at any stage, from data collection to imputation. If other known sources exist, comparing data from multiple sources is one way to ensure accuracy. Watching out for fluctuations in the distributions of important variables from update to update is another good practice.

Nonetheless, the overall quality of the data is not just up to the person or department that manages the database. Yes, in this business, the last person who touches the data is held responsible for all the mistakes made to it up to that point. But when garbage goes in, garbage comes out. So, when there are errors, everyone who touched the database at any point must share in the burden of guilt.

Recently, I was part of a project that involved data collected from retail stores. We ran all kinds of reports and tallies to check the data, and edited many data values out when we encountered obvious errors. The funniest one that I saw was the first name “Asian” and the last name “Tourist.” As an openly Asian-American person, I was semi-glad that they didn’t put in “Oriental Tourist” (though I still can’t figure out who decided that word is for objects, but not people). We also found names like “No info” and “Not given.” Heck, I saw in the news that a refugee from Afghanistan (he was a translator for U.S. troops) was assigned a new first name when he was granted an entry visa: “Fnu,” short for “First Name Unknown,” which is now the first name in his passport. Welcome to America, Fnu. Compared to that, “Andolini” becoming “Corleone” on Ellis Island is almost cute.

Data entry errors are everywhere. When I used to deal with data files from banks, I found that many last names were “Ira.” Well, it turned out that it wasn’t really the customers’ last names, but they all happened to have opened “IRA” accounts. Similarly, movie phone numbers like 777-555-1234 are very common. And fictitious names, such as “Mickey Mouse,” or profanities that are not fit to print are abundant, as well. At least fake email addresses can be tested and eliminated easily, and erroneous addresses can be corrected by time-tested routines, too. So, yes, maintaining a clean database is not so easy when people freely enter whatever they feel like. But it is not an impossible task, either.

We can also train employees regarding data entry principles, to a certain degree (as in, “Do not enter your own email address,” “Do not use bad words,” etc.). But what about user-generated data? Search-and-kill is the only way to deal with it, and the job never ends. The meta-table of fictitious names would just grow longer and longer. Maybe we should add “Thor” and “SpongeBob” to that Mickey Mouse list, while we’re at it. Yet, dealing with this type of “text” data is the easy part. If the database manager in charge is not lazy, and if there is a bit of budget allowed for data hygiene routines, one can avoid sending emails to “Dear Asian Tourist.”
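As a rough illustration of such a “search and kill” routine, here is a minimal sketch in Python. The blocklist, field names and phone heuristic are all hypothetical; a production version would rely on a much longer, continuously curated meta-table and proper email verification.

```python
import re

# Hypothetical blocklist of fictitious or placeholder names; a production
# meta-table would be far longer and continuously curated.
FAKE_NAMES = {"mickey mouse", "thor", "spongebob",
              "no info", "not given", "asian tourist"}

# Purely structural email check; actual deliverability testing is separate.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[a-z]{2,}$", re.IGNORECASE)

def hygiene_flags(record):
    """Return a list of data hygiene flags for one customer record."""
    flags = []
    name = f"{record.get('first_name', '')} {record.get('last_name', '')}"
    if name.strip().lower() in FAKE_NAMES:
        flags.append("fictitious_name")
    email = record.get("email", "")
    if email and not EMAIL_RE.match(email):
        flags.append("malformed_email")
    digits = re.sub(r"\D", "", record.get("phone", ""))
    if digits[3:6] == "555":  # the classic fictional movie exchange
        flags.append("suspect_phone")
    return flags

print(hygiene_flags({"first_name": "Asian", "last_name": "Tourist",
                     "email": "none", "phone": "777-555-1234"}))
# ['fictitious_name', 'malformed_email', 'suspect_phone']
```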

Numeric errors are much harder to catch, as numbers do not look wrong to human eyes. That is when comparison to other known sources becomes important. If such examination is not possible on a granular level, then median values and distribution curves should be checked against historical transaction data or known public data sources, such as U.S. Census data in the case of demographic information.

When it comes to your company’s own data, follow your instincts and get rid of data that look too good or too bad to be true. We can all afford to lose a few records from our databases, and there is nothing wrong with deleting “outliers” with extreme values. An erroneous name like “No Information” may be attached to a seven-figure lifetime spending sum, and you know that can’t be right.
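For those who prefer a rule to an instinct, a loose interquartile-range fence is one conventional way to catch values that are “too good or too bad to be true.” This is only a sketch with made-up numbers; the multiplier k would need to be tuned per variable.

```python
import statistics

def extreme_outlier_bounds(values, k=3.0):
    """Loose IQR fences; a large k keeps ordinary variation and
    flags only the absurd extremes."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

# Lifetime spending sums with one obviously bogus seven-figure value
spend = [120, 85, 240, 95, 310, 150, 2_500_000]
low, high = extreme_outlier_bounds(spend)
clean = [v for v in spend if low <= v <= high]  # drops the 2,500,000 record
```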

The main takeaways are: (1) Never trust the data just because someone bothered to store them in computers, and (2) Constantly look for bad data in reports and listings, at times using old-fashioned eyeballing methods. Computers do not know what is “bad” until we specifically tell them what bad data are. So, don’t give up, and keep at it. And if it’s about someone else’s data, insist on data tallies and data hygiene stats.

4. Recency
Outdated data are really bad for prediction or analysis, and that is a different kind of badness. Many call it a “Data Atrophy” issue, as no matter how fresh and accurate a data point may be today, it will surely deteriorate over time. Yes, data have a finite shelf-life, too. Let’s say that you obtained a piece of information called “Golf Interest” on an individual level. That information could be coming from a survey conducted a long time ago, or some golf equipment purchase data from a while ago. In any case, someone who is attached to that flag may have stopped shopping for new golf equipment, as he doesn’t play much anymore. Without a proper database update and a constant feed of fresh data, irrelevant data will continue to drive our decisions.

The crazy thing is that the harder it is to obtain certain types of data—such as transaction or behavioral data—the faster they deteriorate. By nature, transaction and behavioral data are time-sensitive. That is why it is important to install time parameters in databases for behavioral data. If someone purchased a new golf driver, when did he do that? Surely, having bought a golf driver in 2009 (“Hey, time for a new driver!”) is different from having purchased one last May.
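Installing a time parameter can be as simple as storing the transaction date and deriving an age bucket at update time. A minimal sketch follows; the thresholds are hypothetical and would differ by product category (an automobile shopper stays in the market longer than a printer shopper).

```python
from datetime import date

def recency_bucket(last_purchase, as_of=None):
    """Label a behavioral flag by its age; thresholds are illustrative."""
    days = ((as_of or date.today()) - last_purchase).days
    if days <= 90:
        return "hot"
    if days <= 365:
        return "warm"
    if days <= 3 * 365:
        return "lukewarm"
    return "cold"

# A driver bought in 2009 vs. one bought last May
print(recency_bucket(date(2009, 6, 1), as_of=date(2014, 10, 1)))   # 'cold'
print(recency_bucket(date(2014, 5, 20), as_of=date(2014, 10, 1)))  # 'warm'
```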

So-called “Hot Line Names” literally cease to be hot after two to three months, or in some cases much sooner. The evaporation period may be different for different product types, as one may stay in the market longer for an automobile than for a new printer. Part of a data scientist’s job is to defer the expiration date of data, finding leads or prospects who are still “warm,” or even “lukewarm,” with the available valid data. But no matter how much statistical work goes into making the data “look” fresh, eventually the models will cease to be effective.

For decision-makers who do not make real-time decisions, a real-time database update could be an expensive solution. But databases must still be updated regularly (daily, weekly, monthly or even quarterly). Otherwise, someone will eventually make a wrong decision based on outdated data.

5. Consistency
No matter how much effort goes into keeping the database fresh, not all data variables will be updated or filled in consistently. That is the reality. The interesting thing is that, especially in advanced analytics, we can still produce decent predictions if the data are consistent. It may sound crazy, but even not-so-accurate data can be used in predictive analytics, if they are “consistently” wrong. Modeling is about developing an algorithm that differentiates targets and non-targets, and if the descriptive variables are “consistently” off (or outdated, like census data from five years ago) on both sides, the model can still perform.

Conversely, if there is a huge influx of a new type of data, or any drastic change in data collection or in a business model that supports such data collection, all bets are off. We may end up predicting such changes in business models or in methodologies, not the differences in consumer behavior. And that is one of the worst kinds of errors in the predictive business.

Last month, I talked about dealing with missing data (refer to “Missing Data Can Be Meaningful“), and I mentioned that data can be inferred via various statistical techniques. And such data imputation is OK, as long as it returns consistent values. I have seen so many so-called professionals messing up popular models, like “Household Income,” from update to update. If the inferred values jump dramatically due to changes in the source data, there is no amount of effort that can save the targeting models that employed such variables, short of re-developing them.

That is why a time-series comparison of important variables in databases is so important. Any changes of more than 5 percent in distribution of variables when compared to the previous update should be investigated immediately. If you are dealing with external data vendors, insist on having a distribution report of key variables for every update. Consistency of data is more important in predictive analytics than sheer accuracy of data.
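A time-series comparison like this is easy to automate. Below is a minimal sketch, using hypothetical “Household Income” bands and the 5 percent rule of thumb mentioned above; in practice, the threshold and the variables to watch would be set per database.

```python
def distribution_shifts(prev_counts, curr_counts, threshold=0.05):
    """Return categories whose share of records moved by more than
    the threshold between two consecutive updates."""
    prev_total = sum(prev_counts.values())
    curr_total = sum(curr_counts.values())
    alerts = {}
    for cat in sorted(set(prev_counts) | set(curr_counts)):
        prev_share = prev_counts.get(cat, 0) / prev_total
        curr_share = curr_counts.get(cat, 0) / curr_total
        if abs(curr_share - prev_share) > threshold:
            alerts[cat] = (round(prev_share, 3), round(curr_share, 3))
    return alerts

# Hypothetical 'Household Income' bands from two consecutive updates
prev = {"<50K": 400, "50-100K": 350, "100K+": 250}
curr = {"<50K": 300, "50-100K": 360, "100K+": 340}
print(distribution_shifts(prev, curr))
# {'100K+': (0.25, 0.34), '<50K': (0.4, 0.3)}
```

Any variable that shows up in the alert list warrants an immediate conversation with whoever touched the source data last.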

6. Connectivity
As I mentioned earlier, there are many types of data. And the predictive power of data multiplies as different types of data get to be used together. For instance, demographic data, which is quite commoditized, still plays an important role in predictive modeling, even when dominant predictors are behavioral data. It is partly because no one dataset is complete, and because different types of data play different roles in algorithms.

The trouble is that many modern datasets do not share any common matching keys. On the demographic side, we can easily imagine using PII (Personally Identifiable Information), such as name, address, phone number or email address for matching. Now, if we want to add some transaction data to the mix, we would need some match “key” (or a magic decoder ring) by which we can link it to the base records. Unfortunately, many modern databases completely lack PII, right from the data collection stage. The result is that such a data source would remain in a silo. It is not like all is lost in such a situation, as they can still be used for trend analysis. But to employ multisource data for one-to-one targeting, we really need to establish the connection among various data worlds.
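To make the idea of a match “key” concrete, here is a minimal sketch that builds a crude deterministic key from normalized PII. It is only an illustration: real match-and-append work relies on address standardization and fuzzy “soft-match” logic, not a naive hash.

```python
import hashlib
import re

def match_key(name, address, zip_code):
    """Build a crude match key from normalized name, address and ZIP.
    Illustration only; production matching uses address standardization
    and fuzzy logic rather than exact hashes."""
    norm = lambda s: re.sub(r"[^a-z0-9]", "", s.lower())
    raw = f"{norm(name)}|{norm(address)}|{zip_code[:5]}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

# Two sources describing the same household now join on the same key
k1 = match_key("John Q. Public", "123 Main St.", "19901")
k2 = match_key("JOHN Q PUBLIC",  "123 MAIN ST",  "19901-1234")
assert k1 == k2
```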

Even if the connection cannot be made to household, individual or email levels, I would not give up entirely, as we can still target based on IP addresses, which may lead us to some geographic denominations, such as ZIP codes. I’d take ZIP-level targeting anytime over no targeting at all, even though there are many analytical and summarization steps required for that (more on that subject in future articles).

Not having PII or any hard matchkey is not a complete deal-breaker, but the maneuvering space for analysts and marketers decreases significantly without it. That is why the existence of PII, or even ZIP codes, is the first thing that I check when looking into a new data source. I would like to free them from isolation.

7. Delivery Mechanisms
Users judge databases based on the visualization or reporting tool sets attached to them. As I mentioned earlier, that is like judging an entire building based just on the window treatments. But for many users, that is the reality. After all, how would a casual user without a programming or statistical background even “see” the data? Through tool sets, of course.

But that is only one end of it. There are many types of platforms and devices, and the data must flow through them all. The important point is that data are useless if they are not in the hands of decision-makers, through the devices of their choice, at the right time. Such flow can be actualized via API feeds, FTP, or good, old-fashioned batch installments, and no database should sit too far away from the decision-makers. In my earlier column, I emphasized that data players must be good at (1) Collection, (2) Refinement, and (3) Delivery (refer to “Big Data is Like Mining Gold for a Watch—Gold Can’t Tell Time”). Properly delivering answers to inquirers closes one iteration of the information flow. And the data must continue to flow to the users.

8. User-Friendliness
Even when state-of-the-art (I apologize for using this cliché) visualization, reporting or drill-down tool sets are attached to the database, if the data variables are too complicated or unintuitive, users will get frustrated and eventually move away from it. If that happens after pouring a sick amount of money into a data initiative, that would be a shame. But it happens all the time. In fact, I am not going to name names here, but I once saw a ridiculously hard-to-understand data dictionary from a major data broker in the U.S.; it looked as though the data layout had been designed for robots, by robots. Please. Data scientists must try to humanize the data.

This whole Big Data movement has a momentum now, and in the interest of not killing it, data players must make every aspect of this data business easy for the users, not harder. Simpler data fields, intuitive variable names, meaningful value sets, pre-packaged variables in forms of answers, and completeness of a data dictionary are not too much to ask after the hard work of developing and maintaining the database.

This is why I insist that data scientists and professionals must be businesspeople first. The developers should never forget that end-users are not trained data experts. And guess what? Even professional analysts would appreciate intuitive variable sets and complete data dictionaries. So, pretty please, with sugar on top, make things easy and simple.

9. Cost
I saved this important item for last for a good reason. Yes, the dollar sign is a very important factor in all business decisions, but it should not be the sole deciding factor when it comes to databases. That means CFOs should not dictate the decisions regarding data or databases without considering the input from CMOs, CTOs, CIOs or CDOs who should be, in turn, concerned about all the other criteria listed in this article.

Playing with the data costs money. And, at times, a lot of money. When you add up all the costs for hardware, software, platforms, tool sets, maintenance and, most importantly, the man-hours for database development and maintenance, the sum becomes very large very fast, even in the age of the open-source environment and cloud computing. That is why many companies outsource the database work to share the financial burden of having to create infrastructures. But even in that case, the quality of the database should be evaluated based on all criteria, not just the price tag. In other words, don’t just pick the lowest bidder and hope to God that it will be alright.

When you purchase external data, you can apply these evaluation criteria, as well. A test-match job with a data vendor will reveal many of the details listed here; metrics such as match rate and variable fill-rate, along with a complete data dictionary, should be carefully examined. In short, what good is a lower unit price per 1,000 records if the match rate is horrendous and even the matched records are filled with missing or sub-par inferred values? Also consider that, once you commit to an external vendor and start building models and analytical frameworks around its data, it becomes very difficult to switch vendors later on.

When shopping for external data, consider the following when it comes to pricing options:

  • Number of variables to be acquired: Don’t just go for the full option. Pick the ones you need (involve your analysts), unless you get a fantastic deal for an all-inclusive option. Generally, most vendors provide multiple packaging options.
  • Number of records: Processed vs. matched. Some vendors charge based on “processed” records, not just matched records. Depending on the match rate, it can make a big difference in total cost (see the worked example after this list).
  • Installment/update frequency: Real-time, weekly, monthly, quarterly, etc. Think carefully about how often you would need to refresh “demographic” data, which doesn’t change as rapidly as transaction data, and how big the incremental universe would be for each update. Obviously, a real-time API feed can be costly.
  • Delivery method: API vs. batch delivery, for example. The price, as well as the data menu, changes quite a bit based on the delivery options.
  • Availability of a full-licensing option: When the internal database becomes really big, a full installment becomes a good option. But you would need the internal capability for a match-and-append process that involves “soft-matching” on similar names and addresses (imagine good old name-and-address merge routines). It becomes a bit of a commitment, as the match-and-append becomes part of the internal database update process.
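To see why the “processed vs. matched” distinction matters, consider a quick back-of-the-envelope comparison, with entirely hypothetical prices and match rate:

```python
records = 1_000_000   # input file size
match_rate = 0.40     # hypothetical match rate

# Vendor A: $2.00 per 1,000 *processed* records
vendor_a = records * 2.00 / 1000                 # $2,000

# Vendor B: $4.00 per 1,000 *matched* records
vendor_b = records * match_rate * 4.00 / 1000    # $1,600

# Despite double the unit price, Vendor B costs less at this match rate.
```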

Business First
Evaluating a database is a project in itself, and these nine evaluation criteria provide a good guideline. Depending on the business, of course, more conditions could be added to the list. And that leads to the final point, which I did not even include in the list: The database (or all data, for that matter) must be useful in meeting the business goals.

I have been saying that “Big Data Must Get Smaller,” and this whole Big Data movement should be about (1) Cutting down on the noise, and (2) Providing answers to decision-makers. If the data sources in question do not serve the business goals, cut them out of the plan, or cut loose the vendor if they are from external sources. It would be an easy decision if you “know” that the database in question is filled with dirty, sporadic and outdated data that cost lots of money to maintain.

But if that database is needed for your business to grow, clean it, update it, expand it and restructure it to harness better answers from it. Just like the way you’d maintain your cherished automobile to get more mileage out of it. Not all databases are created equal for sure, and some are definitely more equal than others. You just have to open your eyes to see the differences.

5 Copy Approaches to Influence Gut Reaction

Call it a gut reaction, but oftentimes our prospects and customers make decisions and respond based on intuition, a hunch or professional judgment. In direct response, we want quick action. We know that if the prospect drifts away from our message, we’ll lose them, usually forever. So while the logic and quantification of your sales story may be overwhelmingly in your favor, it can be intuition that turns the prospect away: something that feels too good to be true, something that leaves room for skepticism, or an unintended nuance in the copy that you overlooked can lose the sale for you.

Even if all the arguments you’ve made in your content are authentically and credibly in your favor, a person’s gut decision often prevails.

And here’s what is frustrating: Studies suggest that a person’s gut reaction is often wrong, because it’s subject to bias. Your prospect might overestimate his or her ability to assemble a product, for example. Or he or she may think it takes too much time to read your information, learning materials or book. Perhaps when your prospect has made a mistake related to what you’re selling, he or she doesn’t understand why, or is hesitant to ask for help or feedback. And he or she forgets. That is, customers forget the last time they made a poor decision based on their gut instead of listening to logic.

How do you overcome gut emotion and reaction? You have to help your prospective customers or donors through the decision-making process. Do it with these ideas:

  1. Lead your prospect to a sense of revelation. That happens when your conscious mind finally learns something that your subconscious mind already knew. Ask yourself: When are you most creative (engaged in what you might consider right-brain thinking)? For most people, it’s when we are exercising, walking, jogging, listening to music, in the shower, or in an unfamiliar environment. Some of my best ideas have struck me while on vacation, when my mind is suspended from the consciousness of day-to-day responsibilities. Lead your prospect to an awakening.
  2. Give ’em chills. A reaction inside the mind often is accompanied by a physical sensation. It could be chills or goosebumps. For some people, it may be an unusual feeling in the stomach or throat. You can create these physical sensations when copy is accompanied by strong visuals that paint a picture. Music is another way to stimulate a physical reaction. While you can’t pipe in music to printed material, you can use music in video or on your website.
  3. Past experience recall. Your brain’s hippocampus stores long-term memory. Long-term memories are with you for your entire life, unless something comes along to pave new grooves and create a new memory. You aren’t likely to replace past long-term memories, but you do have the opportunity to create another memory that neutralizes a bad one, or enhances a good one. Creating a new memory is harder than drawing on a past memory. When you can, allow your content to take your prospect to a positive place, or hit a negative memory head-on with something strong enough to overcome the negativity.
  4. Challenge the perceptual rules made up in the mind. For some people, changing an ingrained rule is impossible, even if it’s wrong. Even when a person can’t articulate why the rule exists, you may be able to use an overwhelming amount of empirical data or statistics from credible third-party sources to turn around a rules-based individual. But don’t count on it.
  5. Recognize patterns and cross-index. Help your prospect see something familiar to engage his or her intuitive skills. The more material about your product or service that you can provide to cross-index in the mind, the higher the likelihood that your prospect’s intuition will kick in on a positive note for you.

You won’t always be able to prevail over intuition or gut reaction, but when you anticipate that probability in your copy, you can turn around a potentially lost sale.

Extended Coverage: USPS – Will It Disappear?

When your editor makes a decision to defend you in the comments section below a feature article, then the article must have hit a nerve!

I talked to several mailers, and the association leaders who represent them, for a feature this month in the magazine … as I should: Mailers have a lot to say about the goings-on at the Postal Service (and the not-goings-on in Congress) that are leading some mail marketers to re-evaluate the medium. I’d say it is a timely premise—particularly with the recent exigent postage hike on top of the inflation-indexed hike.

Far more was offered than I could include in the feature. However, “Marketing Sustainably” has a bit of room, so—with my editor’s permission—allow me to share a few more observations.

Let me be clear: Every mailer I talked to wants the Postal Service to succeed. What may be unclear is how it will succeed, and the prescriptions vary …

Always the Postologist, Charley Howard of Harte-Hanks had these points to share on a future path:

“If the Postal Service is allowed to manage its own healthcare, get the pre-retirement funding relief from Congress that it is due, and get Congress to back off on leaning in on operations, I believe that we would have a USPS that is both viable and competitive. We should close post offices that only see 1.5 people a day, limit some mail delivery to five days (keep the parcels moving) and have the USPS become more sensitive to pricing. These outcomes require enabling legislation—and that’s a big ‘if’ and certainly not likely in an election year, never mind by 2020 or 2025.”

“I believe the leadership of the USPS, Postmaster General Patrick Donahoe in particular, has made the right decisions to try and save the post office,” says Paul Ercolino of U.S. Monitor. “Cost cutting, Network Rationalization and five-day delivery are all controversial decisions, but they are essential if the Post Office is to survive in the coming years.”

Hamilton Davison of the American Catalog Mailers Association spoke about innovation—but still sees challenges because of the process of oversight:

“Innovation on the revenue side, or improvements to [the Postal Service’s] cost structure, will only occur if it is given the freedom to experiment free from regulatory or political concerns. While it is right and proper that the enormous market power of the Postal Service not be unchecked, it should be given greater freedom in advancing markets or improving its cost structure without undue concern about these regulatory and political pressures. Management today is handcuffed in too many areas. Barriers to experimentation on a modest scale must be removed so the USPS can demonstrate pathways for greater innovation that can then be rolled out system-wide under the review of a regulator. Getting the regulator involved in early stage exploration of potential innovation is much more cumbersome.”

And Joel Quadrucci of Quad-Graphics spoke to mail’s role in a multichannel, digital-savvy world:

“We live in a multichannel media world, and print is—and will continue to be—a critical marketing and communications channel,” he said. “Print is especially powerful when connected with other channels. Direct mail is a critical channel because of its ability to drive action to numerous other media channels. Direct mail and digital marketing channels will move forward hand in hand, with direct mail creating a compelling call to action and digital marketing channels giving consumers a way to act.”

“The entire world of logistics is evolving along with retail,” Quadrucci continued. “More and more consumers are opting for the convenience of shopping online. We already see it with Amazon building distribution centers all over the country with the goal of facilitating same-day delivery of its products. The USPS could play a pivotal role in this evolving world of logistics; it has many strengths. But in order to be competitive with alternative delivery systems, it must address its current challenges head-on.”

Clearly marketers must stay engaged with the Postal Service—and with Congress—as we tackle these challenges together. The Postal Service clearly has my support, too. Now if I could only sate Denny Hatch.