It’s no news that brands track our purchases and then send us coupons, promotions, special offers and “news” that fit our shopping patterns. That’s cool. Bring it on: in most cases, we win with worthwhile discounts, loyalty rewards, and other perks that pay off in one way or another.
We expect this kind of personalized communication for simple products bought at Walmart, Target, Amazon and so on. In most cases, we all know it’s happening, and it’s okay because the data is not threatening. Who cares if Walmart knows I buy Newman’s spaghetti sauce, or that I have a fetish for glitter green nail polish? Right?
But, with all of the new technology available to track, monitor and influence consumers’ purchasing behavior in real-time, the game is changing.
We are now being listened to on our social sites so Facebook and others can serve us ads for products we just browsed or left in our shopping carts, upping their profits if the social network can get us to go back and buy.
And we are being watched by big data users when we go to the store physically — not just online.
And for most of us, myself included, this doesn’t feel so good.
Consider this: When out of town, shopping at a store where I don’t usually shop, I bought mousetraps because I had unwittingly let one of these unpleasant creatures into my house. That night, while opening the Solitaire app on my iPhone to help me fall asleep, an ad for that very brand and type of mousetrap appeared on my phone. Odd, but I figured someone was probably tracking my purchases via my credit card and then linking them to my phone. Okay. Not what I signed up for, but I understood it, at least for this one purchase.
Then consider this: My husband went to the store and used his credit card to buy a little-known brand of gluten-free bread — two uncommon variables, right there. Within the hour, an ad for that very brand and product showed up on MY phone, not his, but MY phone. Suddenly “watching” my purchases and those of my family was not okay with me anymore, and it conjured up a lot of “what ifs.”
- What if my husband had just bought me a 2-carat sapphire ring and wanted to keep it a surprise?
- What if my husband had just bought medication for an illness he had not told me about yet?
- And what if I were advertising my house on VRBO for holiday rentals and somehow the phone number on my listing was associated with that mousetrap purchase, so all potential renters saw a mousetrap ad in their sidebar? That could unconsciously conjure up a lot of yucky feelings that get unintentionally associated with my listing.
The list goes on … and so do:
The Questions All of Us Marketers Must Ask Ourselves
At what point does data tracking, customer profiling and targeted, automated marketing cross the line from “personalized customer service and care” to “creepy, stalkerish behavior” that makes consumers feel exposed, vulnerable and just downright uncomfortable?
How you answer this question and adapt your automated marketing messages and campaigns is critical. You might argue that our devices are anonymized and that brands really don’t know who goes with which IP address or device code. But is that really accurate? Can specifics about individuals still be pinpointed? Consider the following example from an article posted on DeZyre.com.
An office supply store sent a customer a promotional letter and set up the personalization process to reference a personal detail or transaction on the envelope. In this case, that personalized envelope “teaser” was “Daughter killed in car crash.” That was not information he had opted to share with the office supply store, and clearly not information the store needed in order to offer him more laser pointers or copy paper at a discount. It was information gleaned from other sources about his personal life, potentially legal or government records, which he certainly did not volunteer to a store for customer service purposes.
Be Honest — With Yourselves
Again, those who manage and serve big data maintain that their promotional messages are sent to devices that are anonymized, so no secrets are revealed and consumers are not exposed. But at the end of the day, is the data really anonymous? Any database of customer transactions that also contains devices, IP addresses and names can be traced back to an individual. Just ask the FBI, CIA, Mueller or any other investigative unit.
And is it really anonymized when social listening takes place? Track your conversations online and see what ads pop up shortly thereafter.
Beyond asking yourselves where to draw the line, ask consumers how they feel about ads that “creep” up outside of the personalized coupons you send via an opt-in program. I did just that on my Facebook page, and here’s what came back from consumers:
- “Scary and happening more frequently. Not okay.”
- “It bothers me to no end. Once I started noticing it, I have become increasingly aware of it and it scares the $%^( out of me.”
- “If I don’t sign up for it, it bothers me.”
- “I always find it creepy when I’ve been looking/shopping for something and all of a sudden I get an ad for it.”
- “Time to live off the grid and pay cash.”
- “This is very scary.”
- “No way!”
If consumers are scared of what you know about them, it’s time to rethink that proverbial line. Don’t cross it just because you can, or because you’ve invested in technology that automatically delivers those ads and feel you have to use it fully to get your promised ROI. Think about how you can use this amazing data and technology for real-time marketing across devices and channels in ways that actually please customers rather than scare them, such as inviting them to opt in, as we have for so many other channels.
It’s not just a courtesy to involve customers in the decision to watch them in order to serve them truly relevant, timely ads; it’s critical to our future as an industry. Why? Because if we don’t, we will likely see more of those opt-outs, and even legal regulations that will force us to stop communicating despite the honest and good intentions we might have.
Consequences for Marketers
Think about it. Consumers have pushed back against being harassed on the phone by opting into the “do not call” list. Consumers have shut down unwanted emails by advocating against spam and ensuring they have a choice to opt out. As a result, brands that spam are blacklisted and shut down by email servers.
Just these two examples of consumer backlash have changed the way we communicate with consumers, and laws have been passed that we can’t get around. If we continue to serve “anonymized” ads to personal devices on apps that are personal, like my Solitaire game, are we setting ourselves up for more regulation — in addition to increased opt-outs from “permission” marketing by more angry, frustrated consumers who leave our brand to patronize one that doesn’t follow their every move?
As marketers, we have a big responsibility not just to do our jobs and fuel sales and lifetime value, but also to preserve what matters most to consumers and our customers: anonymity, privacy and security.
Curious about your thoughts. Agree? Disagree? Please post your thoughts, suggestions and ideas for how we can continue to use the power of personalization, big data and automated marketing for the greater good. (The greater good for us and our happy, lifelong customers.)