I refuse to jump on the privacy “scandal” bandwagon.
It has been rough listening this week as certain lawmakers fail to recognize the absolute benefits that accrue to consumers through the responsible collection and use of data for commerce, advertising and innovation. Yes, data handling requires stewardship, but that doesn’t mean data, in and of itself, constitutes anything close to a harm that needs to be regulated, as if there were no rules already in place.
Whether I, as a voter, was shown Russian-administered content online, in an alleged bid to stir up controversy and division among the American electorate, is an understandable concern. It should be investigated, because we need to isolate and diminish any and all “malicious” actors in our digital economy and democracy. Fraud, too, is a harm that must be isolated, identified and eradicated. You’ll have no debate from me here.
But let’s not conflate malicious actors with bona fide relevant advertising and content being presented to, and engaged by, consumers — a wholly beneficial outcome that public policy must acknowledge. Data, commerce and advertising are not dirty words — they are engines for job growth, innovation and — yes, even government revenue through taxes. Ads finance content freely available and useful to consumers and other businesses — and pay for journalism, too, and a diversity of content on the Net.
But all of this responsible data collection and use is useless if there is no consumer trust.
I argue that the U.S. approach to data regulation is the right one: legal restriction where sensitive personal information is collected, transferred and applied, through sector-specific laws concerning health data, certain government IDs, personal finance, children’s data and credit, with Fair Information Practice Principles applied to other data categories that do not have such a propensity for harm. Consumers are extremely well-served by this regime. It is far more harmful to have a breach of credit data, for example, than to have business entities share consumers’ advertising profiles in non-sensitive categories, the latter data use being wholly beneficial. U.S. data protections thus are wisely targeted and measured.
Ads pay the freight, and the economy grows as a direct result. Better for an ad to be relevant to its audience. I believe this approach is superior (at least for Americans) to other, highly prescriptive regimes (read: Europe) that make little distinction between data categories and require “opt-in” permissions for any and all types of data sharing, including that related to advertising and marketing. If consumers are inundated with opt-in requests for less sensitive types of data use, even for categories of use where they directly benefit, then inertia and fatigue will prevail and useful data flows won’t happen. I’m hoping businesses handling European citizen data will find ways to prove me wrong!
Is the motivation of such draconian restrictions in Europe really privacy protection as a fundamental human right? Or is it somehow a distrust of commerce, competition, diversity, trade and innovation, and of other important social aims financed by advertising, all sacrificed on the altar of permission? Yes, consumer control should be a default with ad data, but affirmative consent should be reserved for data categories where the propensity for harm is real. This point of view, in Europe at least, is rhetorical, because the law is the law. And as of May 25, the General Data Protection Regulation is enforceable against all global entities touching EU citizen data. God Save the Queen and her data subjects, and perhaps all of us.
At a recent privacy conference, one of the primary architects of the European Union’s ePrivacy Regulation (a follow-up to GDPR) said, “What we are aiming at is to abolish surveillance-driven advertising.”
“Surveillance-driven” advertising and “surveillance capitalism” are European and privacy-academic speak that seeks to link, or conflate, interest-based advertising with government surveillance of citizens. I mean, really? Ad tech may include companies not known to the consumer. And because brands enlist outside ad tech companies to help them make advertising inventory more relevant (and useful) to site and app users, and disclose such data sharing through enhanced notice mechanisms and privacy policies, somehow this still constitutes “surveillance,” with all of its negative connotations? Let’s not forget that the intended result of these investments in consumer engagement is a more relevant ad, not a dossier!
We can see where some (important) minds are headed, and why.
Back to the U.S.
Even for something as innocuous as an ad served to a device, we have plenty of guidance on privacy rules of the road: the global Fair Information Practice Principles, Data & Marketing Association Data Standards 2.0, Digital Advertising Alliance Principles and the YourAdChoices program (a client), IAB technical standards … and an active Federal Trade Commission (and other agencies, in certain data categories) overseeing the ecosystem and enforcing privacy expectations. The regime is both self-regulatory and legal.
The idea that the United States is a laissez-faire data free-for-all is simply not a correct assessment. In the digital world, we have 20-plus years of self-regulation, built on nearly 60 years of self-regulation offline. All of this is premised on building and bolstering consumer trust. We have federal and state law in important data sectors. Through all of these decades, we’ve had an FTC minding the advertising/marketing store, growing the market, and enforcing the privacy and security of data through meaningful enforcement actions and consent decrees that serve as teaching tools for other businesses.
Instead of grandstanding and undermining years of hard-earned consumer trust, more policymakers, and perhaps industry leaders, too, should recognize how responsible data flows serve the consumer. Thus, they should back existing industry regimes that promote stewardship and governance, and hold us to them. Serving consumers and earning their trust, after all, is a goal shared by all responsible stakeholders.