It’s not just policymakers who are trying to figure out how to act on consumer sentiment toward data privacy. We all — business and consumer alike — overwhelmingly want it.
We are all seeking a U.S. federal privacy law to “repair” what may be broken in Europe (hey, the toaster needs fixing), and to correct any perceived privacy shortcomings in California’s new law (scheduled to take effect in January). Will such a federal law pass this year?
One of the ongoing challenges for policy in this area is what’s been called the privacy paradox. The paradox? Consumers’ stated attitudes toward privacy and their actual demands and behaviors are rarely in sync. Sometimes, they are polar opposites, simultaneously!
- Should law be enacted based on how we feel, or on what we actually do?
- How do we define privacy harms and focus regulation only on what is harmful — and where should regulation go light, very light, or even foster wholly beneficial uses?
- Should private sector controls and public sector controls be differentiated?
- Do existing laws and ethical codes of conduct apply, and how might they be modified for the digital age?
On top of this, consumer expectations around data and technology are not fixed. Comfort levels with how information is used — at least in the advertising sector — change over time. In fact, some marketers can’t keep pace with consumer demands to be identified, recognized and rewarded across channels. Generations, too, differ in their attitudes and behaviors.
What’s creepy today may in fact be tomorrow’s consumer-demanded convenience.
Case in point: It used to be that people complained about remarketing — the ad following them around the Net as they browsed. (All the same, remarketing works — that’s why it was so pervasive.) Today, in a role reversal, consumers sound off when the product they already purchased is the same product they still see in the display ad. The consumer has little patience when brand data is locked in silos — when the transaction database fails to inform the programmatic media buy.
The marketing and advertising business has been trying to solve for the privacy paradox since the Direct Marketing Association assembled its first code of ethics in the 1960s and introduced the Mail Preference Service (MPS) in 1971. (The Mail Preference Service is now known as dmaChoice, and DMA is now part of the Data Marketing & Analytics division of the Association of National Advertisers.) During the 1970s, consumers could use MPS both to add their names to direct-mail marketing lists and to remove them. At that time, far more consumers sought to add their names. Later, MPS devoted itself strictly to offering consumers an industry-wide opt-out for national direct mail, with add-ons for sweepstakes mailings and halting mail to the deceased.
During the ’70s, DMA also required its member mailers (and later telemarketers and emailers) to maintain their own in-house suppression lists. These ethical practices were codified, to some extent, when the U.S. government established the Do-Not-Call Registry and enacted the CAN-SPAM Act to complement industry efforts.
Fair Information Practice Principles — A Framework That Still Works Wonders
So here we are in the digital age, where digital display and mobile advertising are among addressable media’s growing family. Again, the marketing community rose to the challenge — launching the Digital Advertising Alliance’s YourAdChoices program (disclaimer: a client) and offering consumers an opt-out from data collection used for interest-based advertising across Web browsing (desktop and mobile) and mobile applications.
Over and over again, the pattern is the same: Give consumers notice, give consumers control, prevent unauthorized uses of marketing data, protect sensitive areas — recognize advertising’s undeniable social and economic power, enable brands to connect to consumers through relevance and trust — and act to prevent real harms, rather than micromanage minor annoyances. Allow marketing innovations that create diversity in content, competition and democratization of information. Let the private sector invest in data where no harms exist.
‘I own my data!’
Data ownership is a dicey concept. Isn’t there sweat equity when a business builds a physical or virtual storefront and you choose to interact with it? Is there not some expectation of data being contributed in fair exchange for the digital content we freely consume — and the apps we download and enjoy? And once you elect to become a customer, isn’t it better for the brand to know you better, to serve you better? Shouldn’t loyalty over time be rewarded? That’s an intelligent data exchange, and the economy grows with it.
The demand for access to everything free, without ads, without data exchange, and without payment to creators is a demand for intellectual property theft. Sooner rather than later, the availability and diversity of that content would be gone. And so would democracy: if everything sat behind an ad-free paywall, only the elites would have access.
‘But I pay for my Internet service. I pay for my phone service!’
Sure you do — and that pays for the cell towers, the tech and Web infrastructure, and union labor, with some profit for the provider. But unless you’re also paying for subscriptions and content, it’s advertising that foots the bill for the music you listen to, the news you read, the apps you use, and so on. All the better when the ads are relevant.
At the end of the day, the consumer is always right — and privacy is personally defined.
I’m all for limits on what governments can do with data when it comes to surveillance, and how it goes about maintaining our safety and security (a paradox of its own).
On the private-sector side, policymakers might best act to set a privacy floor (do no harm) — and, where economic benefits accrue in serving consumers without harm, allow consumers freely accessible tools to set their own privacy walls, using browser settings, industry opt-outs, brand preference centers and other widely available no-cost filters. It’s a wise society that can encourage responsible data flows while blocking irresponsible ones altogether. Get it right, and we all participate in a thriving 21st-century Information Economy. Get it wrong, and Europe and China will set the global rules. With some luck and deliberation, we’ll get this right.