What’s the Price on ‘My Data’? Let the Marketplace Set the Rate


A bipartisan bill in Congress would assign the U.S. Securities and Exchange Commission with the task of determining what consumer data is worth; at least when it comes to Big Digital giants. So what’s my data worth?

On the face of it, having the government mirror the private sector and recognize that consumer data is a valuable asset is actually quite wise. Data is worth something, and accounting rules, risk management, capitalism, and a reverence for asset protection all point to a need to understand data’s worth and secure it accordingly. But should the government come up with the arithmetic? Really? And why limit this to Big Digital … data drives every sector of the economy!

If this is about commerce and productivity, and facilitating next-generation accounting and capitalism, then I’d be all gung-ho. If it’s about setting the stage for just being punitive, then perhaps we can and must do better.

Take privacy. I’m already getting click fatigue — with permission notices on every site I want to visit, as well as the apps I use, it’s no wonder people are questioning whether laws like GDPR and CCPA, as well-intended as they may be, really afford any meaningful privacy protection at all. Privacy is personally defined — though universal principles must still apply. Again, I think we can and must do better.

Recognizing data’s value — as the fuel for today’s economy — means recognizing data’s limitless beneficial uses (and encouraging such uses and further innovation), while putting a no-go ring around unreasonable uses (like throwing elections).

Business Efforts to Calculate Data’s Worth

“My data” is a misnomer. On the data valuation front, we in the direct marketing world — purveyors of personally identifiable information (PII) — have been putting a price on data for years … and understand data’s value intrinsically. Big reveal: It’s not about me. (Sorry, Taylor Swift.)

Worldata, for example, has been tracking list prices for decades, and dutifully reporting on this. In the world of direct response, there’s “sweat equity” in both response and compiled lists. For response lists, some enterprise built a list of customers (or donors). The value of that list is derived from the shared attribute those customers have — and not, as some privacy advocates would have it, from the sum of one individual after another appearing on that list. With compiled lists, observable data is harnessed and staged for marketing use, providing a more complete view of prospects and customers. Again, the value is derived from the attributes that data subjects share.
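To make that “sweat equity” pricing concrete, here is a minimal, purely illustrative sketch of how a one-time list rental is often quoted in direct response: a base price per thousand names (CPM), plus premiums for selects such as recency or geography. The rates and select names below are hypothetical placeholders, not Worldata figures or anyone’s rate card.

```python
# Illustrative only: hypothetical base rate and select premiums,
# not actual Worldata or market figures.
def list_rental_price(names, base_cpm, selects):
    """Price a one-time list rental: base CPM plus per-thousand select premiums."""
    thousands = names / 1000
    select_premium = sum(selects.values())   # each select adds a $/thousand charge
    return round(thousands * (base_cpm + select_premium), 2)

# A 50,000-name response list at a hypothetical $120/M base,
# with recency and geography selects at $10/M each.
price = list_rental_price(50_000, base_cpm=120.0,
                          selects={"12-month buyers": 10.0, "state select": 10.0})
print(price)  # 7000.0
```

Note that the price is quoted on the list as a whole, keyed to the shared attributes selected, which is the point above: the marketplace values the audience, not a running meter on any one individual.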

Even in the digital data driving today’s media placement for advertising (more accurately, audience placement) — the algorithms deployed in search, social, and display — the value of these formulae is derived from affinities in proprietary calculations, much of it anonymized from a traditional PII perspective. Yes, there is a lot of data in play — a U.S. trade worth nearly $21.2 billion — but it’s not hoarding; it’s being put to productive use — in effect, 1:1 marketing at mass scale.

With any innovation, there are bound to be mistakes by good companies, and some bad players, too. But it’s amazing to see how the marketplace, in time, weeds the wheat from the chaff. The industry comes up with brand safety, privacy, security, chain-of-trust, and other initiatives to help facilitate more transparency and control. And testing shows which data sources are timely and reliable — and which ones have data quality in question.

Predict This: Data Unleashed for Responsible Use Unleashes Consumer Benefits

Recently, I heard a current federal official say that data may be fuel — but it’s not like oil. Oil is finite. Data, on the other hand, is a limitless resource — like fusion. And it can be replicated. In fact, he went on to say, the more it is shared for responsible use, the more consumers, citizens, commerce, and the economy benefit. This is correct. The commercialization of the Internet, indeed, gave us today’s global Digital Economy — giving billions access to information from which they can derive limitless benefits.

That’s why potential breaches of data do need to be risk-assessed, prevented, and understood for their likelihood of harm — with data governance and employee training thoroughly implemented. That’s also why government should investigate significant breaches to detect lax practices, and to instruct enterprises on how to better protect themselves from bad actors. Here, I can see a viable SEC role, where all publicly held companies (and privately held ones, too) are held to account — not just one type of company.

Where privacy is concerned … don’t just divide Big Digital revenue by the number of users with social accounts — and then start pronouncing what data about me online must be worth. That calculation starts off with a false assumption, fails to recognize information’s exponential value in the economy, and denies the incredible social benefits afforded by the digitization of information.

The Digital Advertising Alliance (a client) conducted a study in 2016 and found that consumers assign a value of nearly $1,200 a year to the “free,” ad-financed content they access and rely upon via digital and mobile. However, if they were asked to pay that amount out of pocket, most would not be willing (or able) to pay such a premium.
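To see why the “divide revenue by users” arithmetic misleads, consider the back-of-the-envelope comparison below. The revenue and user counts are round, illustrative placeholders (figures of roughly this size appear later in this collection), not an official valuation of anyone’s data; the $1,200 figure is the DAA study result cited above.

```python
# Back-of-the-envelope only: round placeholder figures, not an official valuation.
platform_ad_revenue = 40_000_000_000   # illustrative: ~$40B in annual ad revenue
users = 2_000_000_000                  # illustrative: ~2B accounts

naive_price_per_user = platform_ad_revenue / users
print(f"Naive 'my data' price: ${naive_price_per_user:,.0f} per user, per year")   # $20

daa_consumer_value = 1_200             # DAA 2016 study: value consumers place on ad-financed content
print(f"Value consumers assign to the ad-financed content: ${daa_consumer_value:,} per year")
```

The point is not the precise numbers; it is that a simple revenue split and the value consumers themselves place on what that revenue finances differ by more than an order of magnitude.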

This research shows why we need to protect and facilitate ad-financed content. But it’s part of a larger discussion. It’s about why the commercialization of the Internet has been a 25-year success (happy birthday, October 24) and we must keep that moving forward. As consumers, we all have prospered! Let’s start our discussion on data valuation here.

 

7 Privacy UX Tips From a Privacy and Marketing Expert

There are all kinds of marketing awards, but how about one for privacy UX? How do you make your customers comfortable with your privacy user experience?

Do we need to have an award for a better Privacy UX?

With the Association of National Advertisers’ acquisition of the Data & Marketing Association last year came new ownership, too, of the International ECHO Awards. As a lover of data-driven marketing (and an ECHO Governor), it’s very exciting to see brands recognize the strategic role of data in driving more relevant consumer (and business) engagement, and the myriad ad and data partners that brands rely on to make this engagement happen.

It’s not just agencies — but ad tech and martech companies, data providers, analytics firms and even management consulting firms that are in the data-driven mix. These are the facilitators of today’s consumer intelligence that forms the basis for smarter and more efficient brand communication. Some folks even eschew the term “advertising” as we move into a world where branded and even non-branded content underlie the data-inspired storytelling that is a hallmark of today’s forward-thinking campaigns.

By the way, the call for entries for this year’s ECHO Awards (to be presented in March 2020, as ANA moves what was the DMA conference from this fall to next spring) is underway; the entry portal is now open. Let me know if you’d like an invite to the launch party in New York (Wednesday, May 22, in the afternoon).

An Important Part of Brand-Consumer Dialogue — Privacy Notices

One category that won’t be part of this year’s ECHOs is related to privacy-specific communication from brands.

You’ve seen it. I’ve seen it. Again and again — all over our smartphones and laptops … communications asking for our consent for cookies, for newsletters, for device recognition, for terms and conditions — all in an effort to enable the data collection that serves the brand-consumer value exchange and subsequent dialogue.

Some of this is mandated by Europe’s General Data Protection Regulation, with halo impact in other nations and markets. Other notices anticipate requirements from California’s forthcoming privacy and advertising law. Still others simply adopt heightened transparency (and choice) as part of self-regulatory and best-practice regimes, where no laws may yet exist.

All of this devoted to one objective: getting a consumer (or business individual) to say “yes” to data collection about them, their devices and digital behaviors, in an effort to serve them better.

This week, during the International Association of Privacy Professionals’ Global Privacy Summit 2019 in Washington, DC, one expert — Darren Guarnaccia, Chief Product Officer, Crownpeak — offered some research insights from some 17 million preference experiences that Crownpeak has helped to facilitate on behalf of its brands. These experiences are focused on Europe in light of GDPR, but the findings offer good counsel to any brand that is thinking through its privacy UX.

Some Privacy Communications Concepts to Test

Here are just a few of the tips Guarnaccia reported:

  • Privacy Notices Are Not Just a Matter of Compliance: Yes, they may be legally required in some jurisdictions — but, more vitally, they should be treated with the same discipline and care as any other branded communication, because the ultimate goal is to earn trust, going beyond compliance and permission. As a result, the whens, wheres and hows of such notices are vital to test and perfect.
  • Avoiding Legal Penalty Is Table Stakes — We Ought to Design Such Notices for Higher Purpose: To extend the previous point on consumer trust, there’s a higher price to pay if a privacy notice simply meets a legal expectation, and nothing more. Many consumers have gone “stealth” — using ad blockers and going incognito on browsers. We must remind, convince or persuade consumers of the value a brand seeks to offer in exchange for permissions and consents for data collection, analysis and application. Are we extending such notice in plain language at the right time?
  • ‘Brand’ the Privacy Communication: This may seem obvious — but it’s often overlooked. Does the privacy notice look like it’s coming from the brand — or from somewhere else (such as a browser or ad tech partner)? In gaining consent, it’s always superior for the notice to be owned, cared for, and looked after by the brand itself — even if a third party (such as an ad tech provider) is facilitating the notice. Does the creative of the notice match the colors, fonts and point sizes of the brand content behind it? By extending brand requirements to such communication, a brand is taking “ownership” of the data collection, consent and trust-building directly — as it should, in the eyes of the user.
  • Earn Before You Ask: Oftentimes, the consumer is presented with a cookie or related privacy notice upon entering a brand’s digital property — first page, upon entry. Test giving consumers a more anonymized experience for the first few page visits, and then present a notice — “Are You Enjoying What You’re Seeing?” — where data collection permission is then sought. This lets the consumer first experience the value of the information the site has on offer.
  • Give Consumers Both an ‘Accept’ and a ‘Decline’ Choice or Button: Many sites offer only an “accept” button, leaving the consumer with a “take it or leave it” impression and no sense of real control. Test offering both accept and decline buttons — just seeing the word “decline” reminds consumers they are in control — and the actual decision to “decline” becomes more apparent for those consumers who indeed wish to be stealth.
  • Test Progressive Consent: Not every website (or app) needs immediate access to user data for every purpose of consumer engagement. For data minimization purposes, perhaps ask visitors permission to collect only basic information (say, for contact, site optimization or customer recognition purposes) first. Then, only when necessary for utility, ask permission for location data or other data categories, alongside the rationale for such collection and consent, as those needs arise. (A minimal sketch of this staged approach follows this list.) Asking for everything, upfront, all at once, can be a real turnoff — especially if a user is “new” to a brand. Consumers love — and frankly, need to know — the context for the permissions they give (or deny).
  • Test Privacy Notices by Market: Did you know users in the United Kingdom, for example, are 1.4 times more likely to give consent than those in France and Germany? How notices are worded and rationales explained — how transparency is conveyed — can have a big impact between markets, so it’s best to test notices by individual market (and language) to optimize consent rates. In short, national cultures and language nuance matter, too, in privacy communication.
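To ground the “earn before you ask” and progressive consent ideas above, here is a minimal sketch of the decision logic a site might run to decide which consent purposes to surface at a given moment, instead of asking for everything on page one. It is illustrative only; the purpose names, thresholds and functions are hypothetical and are not Crownpeak’s implementation or any consent platform’s API.

```python
# Hypothetical sketch of progressive consent: request only the purposes the
# current feature needs, and only after the visitor has seen some value.

BASIC_PURPOSES = {"strictly_necessary", "site_optimization"}

FEATURE_PURPOSES = {
    "store_locator": {"location"},
    "newsletter_signup": {"email_marketing"},
    "personalized_offers": {"interest_based_ads"},
}

def purposes_to_request(pages_viewed, granted, feature=None):
    """Return the consent purposes to surface right now, with Accept and Decline options."""
    if pages_viewed < 3 and feature is None:
        return set()                      # "earn before you ask": no banner yet
    needed = set(BASIC_PURPOSES)
    if feature:
        needed |= FEATURE_PURPOSES.get(feature, set())
    return needed - granted               # never re-ask for purposes already granted

# Example: a visitor on page 4 opens the store locator, having granted nothing yet.
print(sorted(purposes_to_request(pages_viewed=4, granted=set(), feature="store_locator")))
# ['location', 'site_optimization', 'strictly_necessary']
```

A real deployment would also record declines so the visitor isn’t nagged, and log the consent state for audit; the sketch only shows the staging idea.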

Conclusion

In summary, there’s more payback than just permission. When privacy communications are optimized, Crownpeak reports, consent rates in Europe can reach 60 to 70 percent, and acceptance of cookie walls 80 to 90 percent. There’s also lifetime value, and indeed consumer trust, in the balance. Crownpeak offered far more tips (and real-market examples) in its session — about search engine optimization, personalization, analytics disclosures and other related topics. We have an entirely new area for many marketers to test, working with their counsel and technology colleagues.

Who knows? Maybe the best such privacy-focused campaigns could still win a 2020 ECHO — based on compelling strategy, creative and results toward an earn-their-trust purpose. Is there a courageous brand ready to show us how? After all, this is one area where we all benefit from ways to raise consumer trust in advertising by sharing successful case studies. We shall see.

As Amped-Up Ad, Data Privacy Laws Near, Self-Regulated Programs Matter More


As we prepare ourselves for federal (and state) legislation around privacy and advertising, it’s worth taking account of our own industry’s self-regulated programs — both those here at home and worldwide.

Why? Because even in an age of regulation, self-regulation — and adherence to self-regulatory principles and ethics codes of business conduct — matter. One might argue that legal compliance in industry is good enough, but business reputations, brand equity and consumer trust are built on sterner stuff.

Having a code of conduct is exemplary in itself, but I’d like to address a vital component of such codes: enforcement.


Credibility in Codes Requires Peer Review & Accountability

Behind the scenes, every day, there are dozens of professionals in our field who serve — as volunteers and as paid professionals — to monitor the ethical practice of advertisers, who devise and update the codes we adhere to, who educate companies that proactively reach out to them, who work with companies and brands that go astray to reach resolution, and who enforce the codes and refer non-compliant companies to government agencies, when necessary.

They may take complaints directly from consumers, competitors and industry observers. They may employ technologies and their own eyes and ears to monitor the marketplace. They may meet regularly, as volunteer juries, to deliberate on any need for corrective action. And, usually, they operate on a “contact us, before we contact you” basis: brands and businesses can proactively ask ethics programs questions about the “right” way (by the consumer) to execute a marketing practice, so that questions are answered up front rather than prompting a formal inquiry after a mistake has been made.

Importantly, credibility depends, too, on reporting publicly on outcomes — occasionally to “name and shame,” but most often to work cooperatively with businesses and to serve as an industry education vehicle by reporting on corrections and the resolution process. Generally, the “punitive” step comes only when a non-cooperative company is referred to a government agency for further action. Government agencies, for their part, tend to wholeheartedly welcome any effective effort to keep the marketplace aligned with the consumer. It helps when brand and consumer interests are in sync.

Accountability Programs Deserve Our Industry’s Expertise & Ongoing Financial Support

All told, these important players in our field serve us well, even as we face what might be referred to as co-regulation (government regulation on top of self-regulation). While any potential business mishap — for example, in the handling of consumer data or the questionable content of an ad — has its own set of facts and ramifications, a demonstration of good-faith efforts to adhere to ethical business practices might be seen as a mitigating factor, even as a brand finds itself needing to take a corrective action.

Agility, flexibility and responsiveness … these are all attributes of successful self-regulation — as well as successful accountability. Effective self-regulation serves to keep pace with innovations in our field, and “point the way” for other companies, as issues arise. (The rigidity of laws rarely can accommodate such innovations.)

While industry professionals may serve as volunteers on juries and review panels — it can be fascinating to serve on such panels — there is almost always an infrastructure of programs and staffs underpinning self-regulation success. Trade associations may finance some of these efforts with membership dollars — but usually businesses can lend their own resources directly, too. It’s great to have a seat at the table.

Marketing Ethics & Self-Regulation Programs — A Partial Listing

In all likelihood, there are many more codes of conduct — particularly in vertical fields (pharma, travel, non-profit, retail, etc.) — but here is a brief listing of advertising-related codes and programs that may be helpful to catalog, bookmark, research and support, some of which I’ve had the honor to be associated with:

Please feel free to use the Comments section to suggest others. And thank you to every volunteer and staff person who serves or has served in an industry accountability capacity. It makes a world of difference, with marketplace trust of advertising and advertisers being the ultimate goal.

Marketers Doing the Data Privacy Balancing Act Ask What ‘I Want My Privacy’ Means



It’s not just policymakers who are trying to figure out how to act on consumer sentiments toward data privacy. We all, overwhelmingly, want it — business and consumer.

We are all seeking a U.S. federal privacy law to “repair” what may be broken in Europe (hey, the toaster needs fixing), and to correct any perceived privacy shortcomings in California’s new law (scheduled to take effect in January). Will such a federal law pass this year?

One of the ongoing challenges for policy in this area is what’s been called the privacy paradox. The paradox? Privacy in the form of consumer attitudes, and privacy in the form of consumer demands and behaviors, rarely are in sync. Sometimes, they are polar opposites, simultaneously!

  • Should law be enacted on how we feel, or respectful of what we actually do?
  • How do we define privacy harms and focus regulation only on what is harmful, while going light (very light) on, or even fostering, wholly beneficial uses?
  • Should private sector controls and public sector controls be differentiated?
  • Do existing laws and ethical codes of conduct apply, and how might they be modified for the digital age?

On top of this, consumer expectations around data and technology are not fixed. Their comfort levels with how information is used, at least in the advertising sector, change over time. In fact, some marketers can’t keep pace with consumer demands to be identified, recognized and rewarded across channels. Generations, too, differ in attitudes and behaviors.

What’s creepy today may in fact be tomorrow’s consumer-demanded convenience.

Case in point: It used to be that people complained about remarketing, the ad following them around the Net as they browsed. (All the same, remarketing works; that’s why it was so pervasive.) Today, in a role reversal, consumers sound off when the product they purchased is the same product they still see in the display ad. The consumer has little patience when brand data is locked in silos: in this scenario, the transaction database doesn’t inform the programmatic media buy.

The marketing and advertising business has been trying to solve the privacy paradox since the Direct Marketing Association assembled its first code of ethics in the 1960s and introduced the Mail Preference Service in 1971. (The Mail Preference Service is now known as dmaChoice, and DMA is now part of the Data Marketing & Analytics division of the Association of National Advertisers.) During the 1970s, consumers could use MPS both to add their names to, and to remove their names from, direct mail marketing lists. At that time, far more consumers sought to add their names. Later, MPS strictly devoted itself to offering consumers an industry-wide opt-out for national direct mail, with add-ons for sweepstakes and halting mail to the deceased.

During the ’70s, DMA also required its member mailers (and later telemarketers and emailers) to maintain their own in-house suppression lists. These ethics behaviors were codified, to some extent, when the U.S. government enabled the Do-Not-Call registry and enacted the CAN-SPAM Act to complement these efforts.
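For readers who have never seen suppression in practice, here is a minimal, hypothetical sketch of the pre-campaign screening step those codes require: drop anyone on the company’s own do-not-contact file or an industry-wide opt-out list before the campaign goes out. Field names and matching are simplified for illustration (real mail files are matched on hygiened name-and-address data, not a single email field).

```python
# Simplified, hypothetical illustration of pre-campaign suppression screening.

def normalize(contact):
    return contact.strip().lower()

def screen_campaign(mail_file, in_house_suppression, industry_opt_outs):
    """Drop anyone on the in-house or industry-wide opt-out lists before sending."""
    blocked = {normalize(c) for c in in_house_suppression} | {normalize(c) for c in industry_opt_outs}
    return [c for c in mail_file if normalize(c) not in blocked]

mailable = screen_campaign(
    mail_file=["pat@example.com", "sam@example.com", "lee@example.com"],
    in_house_suppression={"sam@example.com"},        # our own do-not-contact file
    industry_opt_outs={"LEE@example.com"},           # e.g., an industry preference-service opt-out
)
print(mailable)  # ['pat@example.com']
```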

Fair Information Practice Principles: A Framework That Still Works Wonders

So here we are in the digital age, where digital display and mobile advertising are among addressable media’s growing family. Again, the marketing community rose to the challenge, enacting the Digital Advertising Alliance’s YourAdChoices program (disclaimer: a client) and offering consumers an opt-out for data collection used for interest-based advertising across Web browsing (desktop and mobile) and mobile applications.

Over and over again, the pattern is the same: Give consumers notice, give consumers control, prevent unauthorized uses of marketing data, protect sensitive areas, recognize advertising’s undeniable social and economic power, enable brands to connect with consumers through relevance and trust, and act to prevent real harms rather than micromanage minor annoyances. Allow marketing innovations that create diversity in content, competition and democratization of information. Let the private sector invest in data where no harms exist.

‘I own my data!’

Data ownership is a dicey concept. Isn’t there sweat equity when a business builds a physical or virtual storefront and you choose to interact with it? Is there not some expectation of data being contributed in fair exchange for the digital content we freely consume and the apps we download and enjoy? And once you elect to become a customer, isn’t it better for the brand to know you better, to serve you better? Shouldn’t loyalty over time be rewarded? That’s an intelligent data exchange, and the economy grows with it.

The demand for access to everything free (no ads, no data exchange, no payment to creators) is a demand for intellectual property theft. Sooner rather than later, the availability and diversity of that content would be gone. And so would democracy: if you put everything behind an ad-free paywall, then only the elites would have access.

‘But I pay for my Internet service. I pay for my phone service!’

Sure you do, and that pays for the cell towers, the tech and Web infrastructure, and union labor, with some profit for the provider. But unless you’re also paying for subscriptions and content, it’s advertising that is footing the bill for the music you listen to, the news you read, the apps you use, and so on. All the better when ads are relevant.

At the end of the day, the consumer is always right and privacy is personally defined.

I’m all for limits on what governments can do with data when it comes to surveillance, and how it goes about maintaining our safety and security (a paradox of its own).

On the private sector side, policymakers might best act to set a privacy floor (do no harm) and, where economic benefits accrue (serving consumers without harm), allow consumers freely accessible tools to set their own privacy walls, using browser settings, industry opt-outs, brand preference centers and other widely available no-cost filters. It’s a wise society that can encourage responsible data flows while blocking irresponsible data flows altogether. Get it right, and we all participate in a thriving 21st Century Information Economy. Get it wrong, and Europe and China will set the global rules. With some luck and deliberation, we’ll get this right.

Getting ‘Facebook Sober’? What Marketers Should Know About Consumers’ Attitudes and Social Data


I thought I was pretty clever when someone told me they hadn’t been on Facebook in over a year and I said, “Wow, you’re one-year Facebook sober.” They laughed. The next day, another person said they’d been off for two years — same comment by me, same reaction. But later, I found the term “Facebook sober” on Urban Dictionary — so much for my right to claim ownership of the term.

It’s unlikely that a new 12-step program is going to keep a significant percentage of the more than 2 billion users off the social media platform any time soon, even though they know Facebook is exploiting their personal data for profit. Studies show that consumers value the benefit Facebook provides them at about $1,000 per year, based on how much they would need to be paid to stay off the platform for that long; yet most will not pay anything to keep a company from tracking their data.

A study published in PLOS ONE in December 2018 quantified the monetary value that users assigned to participating on Facebook, using an auction experiment design.

Though the populations sampled and the auction design differ across the experiments, we consistently find the average Facebook user would require more than $1,000 to deactivate their account for one year. While the measurable impact Facebook and other free online services have on the economy may be small,* our results show that the benefits these services provide for their users are large.
* (Of course, this statement neglects the $40 billion Facebook realizes in annual advertising revenue.) 

While people claim to be concerned about privacy, they’re not willing to pay for it. A SurveyMonkey poll done for the news site Axios earlier this month shows that three-fourths of people are willing to pay less than $1 per month in exchange for a company not tracking their data while using its product — and 54% of them are not willing to pay anything.
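Put side by side, the two findings quantify the paradox: access is said to be worth roughly a thousand dollars a year, yet the stated willingness to pay to avoid tracking tops out near a dollar a month. The monthly conversion below is mine; the input figures simply restate the studies cited above.

```python
# Restating the two figures above to show the size of the gap.
deactivation_value_per_year = 1_000   # PLOS ONE: minimum payment to deactivate Facebook for a year
willingness_to_pay_per_month = 1      # Axios/SurveyMonkey: most would pay under $1/month to avoid tracking

stated_value_per_month = deactivation_value_per_year / 12
print(f"Stated value of access: about ${stated_value_per_month:.0f} per month")      # about $83 per month
print(f"Stated willingness to pay for no tracking: under ${willingness_to_pay_per_month} per month")
```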

Researchers at Stanford and NYU sought to determine the effects that Facebook deactivation would have on people’s knowledge, attitudes, moods, and behaviors. “This Is Your Brain Off Facebook,” published by the New York Times on Jan. 13, reports on this study.  A portion of the study participants were paid $102 to stay off Facebook for one month. The researchers stated:

Using a suite of outcomes from both surveys and direct measurement, we show that Facebook deactivation (i) reduced online activity, including other social media, while increasing offline activities such as watching TV alone and socializing with family and friends; (ii) reduced both factual news knowledge and political polarization; (iii) increased subjective well-being; and (iv) caused a large persistent reduction in Facebook use after the experiment.

Despite these findings, the Times reported that “some participants said that they had not appreciated the benefits of the platform until they had shut it down”:

“What I missed was my connections to people, of course, but also streaming events on Facebook Live, politics especially, when you know you’re watching with people interested in the same thing,” said Connie Graves, 56, a professional home health aide in Texas, and a study subject. “And I realized I also like having one place where I could get all the information I wanted, boom-boom-boom, right there.”

As I noted in my post last month, “Gen Z College Students Weigh-in on Personal Data Collection,” some GenZers don’t mind giving up their personal data in exchange for the convenience of targeted ads and discounts; others are uneasy, but all are resigned to the inevitability of it. One student summed up our mass acquiescence, saying:

“I do not feel it is ethical for companies to distribute our activities to others. Despite my feelings on the situation, it will continue — so I must accept the reality of the situation.”

The reality of the situation is that people are not willing to go cold turkey on Facebook.

Data Privacy Policymaking: Words of Warning From Europe


Two weeks back, two hearings in Congress were held about a possible forthcoming new federal data privacy law for the United States. Some of the testimony included fascinating insight.

It’s been nearly nine months since the European Union’s (EU) General Data Protection Regulation (GDPR) took effect, with its tentacle-like effects worldwide — and it is helpful to look at what has transpired, and to avoid repeating GDPR’s mistakes. That’s what one of the witnesses, Roslyn Layton, visiting scholar, American Enterprise Institute, had to say to the House Committee on Energy and Commerce, Subcommittee on Consumer Protection and Commerce, in her statement titled “How the US Can Leapfrog the EU.”

GDPR’s Early Impacts Are Foreboding

From Dr. Layton’s testimony, I found these excerpts (footnotes removed) to be particularly insightful — and somewhat frightful, though some of it predictable. She examined GDPR’s early deleterious effects, which we, in the United States and elsewhere, would be wise to avoid repeating:

GDPR Is Not About Privacy; It’s About Data Flows

“A popular misconception about the GDPR is that it protects privacy; it does not. In fact, the word ‘privacy’ does not even appear in the final text of the GDPR, except in a footnote. Rather, the GDPR is about data protection or, more correctly, data governance. Data privacy is about the use of data by people who are allowed to have it. Data protection, on the other hand, refers to technical systems that keep data out of the hands of people who should not have it. By its very name, the GDPR regulates the processing of personal data, not privacy.”

GDPR Has Only Concentrated Big Digital Since Taking Effect

“To analyze a policy like the GDPR, we must set aside the political pronouncements and evaluate its real-world effects. Since the implementation of the GDPR, Google, Facebook and Amazon have increased their market share in the EU.”

GDPR Has Decimated Small- and Mid-Sized Ad Tech

“One study suggests that small- and medium-sized ad tech competitors have lost up to one-third of their market position since the GDPR took effect. The GDPR does not bode well for cutting-edge firms, as scientists describe it as fundamentally incompatible with artificial intelligence and big data. This is indeed a perverse outcome for a regulation that promised to level the playing field.”

GDPR Raises Costs Prohibitively, Acting as a Trade Barrier

“To do business in the EU today, the average firm of 500 employees must spend about $3 million to comply with the GDPR. Thousands of US firms have decided it is not worthwhile and have exited. No longer visible in the EU are the Chicago Tribune and the hundreds of outlets from Tribune Publishing. This is concerning because the EU is the destination of about two-thirds of America’s exports of digital media, goods and services. Indeed, the GDPR can be examined as a trade barrier to keep small American firms out so that small European firms can get a foothold.”

GDPR Denies Valuable Content to European Citizens

“Of course, $3 million, or even $300 million, is nothing for Google, Facebook and Amazon (the Fortune 500 firms have reportedly earmarked $8 billion for GDPR upgrades), but it would bankrupt many online enterprises in the US. Indeed, less than half of eligible firms are fully compliant with the GDPR; one-fifth say that full compliance is impossible. The direct welfare loss is estimated to be about €260 per European citizen.”

What if the US Enacted GDPR Here … Oh, the Costs

“If a similar regulation were enacted in the US, total GDPR compliance costs for US firms alone would reach $150 billion; twice what the US spends on broadband network investment and one-third of annual e-commerce revenue in the US.”

Dr. Layton, in her testimony, also questioned the California Consumer Privacy Act, which may create even more enterprise requirements than GDPR. She suggested more pragmatic paths need to be forged.

A Better Way: Privacy by Design

“Ideally, we need a technologically neutral national framework with a consistent application across enterprises. It should support consumers’ expectations to have same protections on all online entities. The law should make distinctions between personally identifiable information which deserves protection, but not require same high standard for public data, de-identified, and anonymized data which do not carry the same risks. Unlike the GDPR, the US policy should not make it more expensive to do business, reduce consumer freedom or inhibit innovation.”

Data ‘Seat Belts and Air Bags’ for Privacy

In a second hearing, before the Senate Committee on Commerce, Science and Transportation, Interactive Advertising Bureau (IAB) CEO Randall Rothenberg provided a spirited statement of data’s role in the U.S. economy and the benefits that continue to accrue. He, too, drew from another industry’s history, which he believes offers a helpful analogy and cooperative blueprint:


Internet’s Profound Communication Power

“The Internet is perhaps the most powerful and empowering mode of communication and commerce ever invented. It is built on the exchange of data between individuals’ browsers and devices, and myriad server computers operated by hundreds of millions of businesses, educational institutions, governments, NGOs, and other individuals around the world.”

Advertising’s Essential Role Online: Much of It Data-Driven

“Advertising has served an essential role in the growth and sustainability of the digital ecosystem, almost from the moment the first Internet browsers were released to the public in the 1990s. In the decades since, data-driven advertising has powered the growth of e-commerce, the digital news industry, digital entertainment, and a burgeoning consumer-brand revolution by funding innovative tools and services for consumers and businesses to connect, communicate and trade.”

The Indispensable Ingredient: Trust

“Central to companies’ data-fueled growth is trust. As in any relationship, from love to commerce, trust underlies the willingness of parties to exchange information with each other; and thus, their ability to create greater value for each other. The equation is simple: The economy depends on the Internet; the Internet runs on data; data requires trust. IAB strongly believes that legislative and regulatory mechanisms can be deployed in ways that will reinforce and enhance trust in the Internet ecosystem.”

Universal Truth: Consumer Data Is Good

“We recommend Congress start with a premise that for most of American history was self-evident, but today seems almost revolutionary: consumer data is a good thing. It is the raw material of such essential activities as epidemiology, journalism, marketing, business development, and every social science you can name.”

The Auto Industry Offers Us a Proactive Model

“We believe our goals align with the Congress’ decision to take a proactive position on data privacy, rather than the reactive approach that has been adopted by Europe and some states. We believe we can work together as partners in this effort with you to advance consumer privacy. Our model is the partnership between government and industry that created the modern concept of automotive safety in the 1960s. Yes, the partnership began as a shotgun wedding. Yes, the auto industry resisted at first. But an undeniable consumer right to be safe on the highways met well-researched solutions, which the Congress embedded in well-crafted laws that were supported by the states.”

Auto Safety and Digital Wellness

“The result has been millions of lives and billions of dollars saved. We believe the analogy holds well here. Americans have a right to be secure on the information superhighway. Well-researched solutions and well-crafted laws can assure their ‘digital wellness.’ We should be thorough, practical and collaborative. Our goal should be to find the three or five or 10 practices and mechanisms — the seat belts and air bags of the Internet era — that companies can implement and consumers can easily adopt that will reinforce privacy, security and trust.”

Notice and Choice Bombardment, or Predictable Rules of the Road

“Together, based on our members’ experience, we can achieve this new paradigm by developing a federal privacy law that, instead of bombarding consumers with notices and choices, comprehensively provides clear, even-handed, consistent and predictable rules of the road that consumers, businesses and law enforcers can rely upon.”

One Federal Standard in Harmony

“Without a consistent, preemptive federal privacy standard, the patchwork of state privacy laws will create consumer confusion, present significant challenges for businesses trying to comply with these laws, and ultimately fall short of consumers’ expectations about their digital privacy. We ask the Congress to harmonize privacy protections across the country through preemptive legislation that provides meaningful protections for consumers while allowing digital innovation to continue apace.”

It is worth reading the testimonies of the privacy advocates present at these two hearings, as well. These GDPR fans have many sympathetic voices in the media and Congress, and truly need to be part of any conversation where consensus ought to be built. It is my hope that the right federal legislation will result. The early evidence from Europe, where advocates won out over reason, portends the punitive risks of getting it wrong.

New Privacy Regulations Coming Your Way: California Consumer Privacy Act (CCPA)


Editor’s Note: While this piece is directed at publishers, CCPA also will be something marketers will have to be compliant with, just like GDPR.

Have you recovered from last spring’s GDPR adrenaline rush yet? Everybody in publishing was nervous about finding the right way to comply with new European privacy regulations. It did not seem like there was one clear path to compliance.

As much anxiety as GDPR regulations provoked, that may soon look like the good old days. At least in the EU, 27 countries came together with one edict. They also spent the time necessary to be smart and coherent, whether or not you agree with all the details.

Now California has passed a privacy initiative you will be expected to follow starting Jan. 1, 2020. In many industries, as goes California law, so go U.S. standards. This will be, in practice, a new national standard. California is too dominant a market, larger than most countries on the globe. Add to that a quirk in the drafting of the law, which says you must treat anyone who has left California and intends to return as a Californian. What?

Newly minted California Governor Gavin Newsom hailed the “first-in-the-nation digital privacy law” in his first State of the State address, according to reporting by Wendy Davis in MediaPost. “Companies that make … billions of dollars collecting, curating, monetizing our personal data also have a duty to protect it. Consumers have the right to know and control how their data is being used.”

CCPA Is Not Like GDPR

“The California law was written in five days, and really shows,” says Christopher Mohr, VP of intellectual property and general counsel at SIIA. “It is an extraordinarily complicated and poorly written statute.” Adding insult to injury, it is grammatically inconsistent and difficult to understand. I can’t imagine what compelled them to rush such important legislation through. It sounds irresponsible when you consider the EU worked on GDPR for more than three years.

“This is not the same as GDPR — it’s much broader.” Not a statement the already GDPR-fearing publishing industry wants to hear. Mohr continues, “In GDPR the information is tied to a data subject, for example, an individual. The CCPA covers ‘households’ as well as individuals. In addition, the CCPA’s potential ban on the use of information extends not only to the information but to the ‘inferences’ you might draw from it.” Inferences? Yikes! The law goes on to explain what is meant, but the idea of inferring conclusions sounds ripe for misinterpretation to me.

The main goal of the law is to regulate the collection and sale of personally-identifiable (PI) consumer data to third parties and service providers. You do not need to get paid for the data. If you disclose it to another party, it is considered a transaction. Using outside vendors to help manage your data is not a problem, because you are the controlling party.

Everyone will now have the “right to delete.” I asked Mohr to confirm that means deleting people from your database, not from your articles. “That’s the intent, I think. Whether the words match the intent is a completely different issue, and it’s not as clear as it could be. Personal information covers any information that could be associated with an individual.”

Anyone can tell you to cease disclosing their data to others, and you must comply. You cannot deny goods or services to anyone because of their data opt-out. That becomes the new Catch-22: In order to know you are not supposed to have data on an individual, you must have that individual in your database. And since it is likely you must have data on an individual in order to do business with him or her, how do you conduct business with data exceptions? For those rare European GDPR complainants, admittedly, some American publishers will simply delete the record; good-bye. In the Hotel California, “you can check out any time you like, but you can never leave.”
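One common operational answer to that Catch-22 (an industry practice, not something the CCPA text prescribes) is to delete the record but keep a salted, one-way hash of the identifier on a suppression list: enough to recognize and block the person if a later data feed tries to re-add them, without holding onto the personal details themselves. A minimal, hypothetical sketch:

```python
# Illustrative only: honor a deletion request, but remember a salted one-way
# hash of the identifier so later imports don't quietly re-add the person.
import hashlib

SUPPRESSION_SALT = b"rotate-and-protect-this-value"   # hypothetical; manage as a real secret

def suppression_token(email):
    normalized = email.strip().lower().encode("utf-8")
    return hashlib.sha256(SUPPRESSION_SALT + normalized).hexdigest()

suppressed = set()

def handle_deletion_request(email, database):
    """Delete the live record, but keep its hash so the deletion sticks."""
    database.pop(email.strip().lower(), None)
    suppressed.add(suppression_token(email))

def can_import(email):
    return suppression_token(email) not in suppressed

db = {"casey@example.com": {"orders": 3}}
handle_deletion_request("Casey@Example.com", db)
print(db, can_import("casey@example.com"))   # {} False
```

Whether a hashed identifier itself still counts as personal information under the statute is a question for counsel; the sketch only illustrates the mechanics.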

Preventing a Privacy Tower of Babel

Fortunately, enforcement is by state attorney general, not by individuals. In other words, thank God this is not an invitation to everyone in California to sue. Of course this law will be challenged in court. It may be too vague, according to some. It may be discriminatory, since non-profits (and government agencies) can ignore it and do what they want, the way it is written.

Living in this hyper-intrusive world, it’s hard to disagree with the intent of CCPA, since we are all being personally data mined. But play this out. Imagine what mischief the other 49 states can do. Davis reports that Washington state “lawmakers are considering a bill that would not only give consumers the right to learn what data is collected about them, but would also allow them to prevent their personal data to be used for ad targeting.”

Federal legislation is coming on this after the recent grillings on Capitol Hill of some of the leading big-tech luminaries. Typically federal legislation trumps local law, which is what makes interstate commerce work. Hopefully there will be one law of the land, so any company handling data can maintain sanity versus bowing to every state, city, or county passing a law. But in these Alice in Wonderland times we are in, I will leave that speculation to you.

You have complied with GDPR, so that means you now have a DPO (data protection officer). The CCPA gives your DPO a little more to do.

I’m no lawyer, so I’ll provide the usual disclaimer on all the above. On the other hand, I am a member of and advocate for the Specialized Information Publishers Association, part of SIIA, whose general counsel Chris Mohr was invaluable in enabling me to share an understanding of this law. I believe it makes great sense to occasionally be involved with your peers and work on common problems like privacy laws. As a member of SIPA or Connectiv, you won’t need to call your lawyer every time there is a question about the new privacy landscape. You can take advantage of knowledgeable experts in your corner.

Do I have you pining for the muddy clarity of GDPR yet?

Our Digital Selves: Living Without the ‘Big 5’ — And 7,000 Others



During the past couple of weeks, I’ve been enjoying a thorough attempt by one Gizmodo editor, Kashmir Hill, to live life one week at a time without the so-called “Big 5” — Amazon, Apple, Facebook, Google and Microsoft — and then to do so all at once.

“It was hell,” she reported.

Well, that statement alone could be interpreted as “unpleasant” or “impossible” or “really inconvenient” or “unenjoyable,” or maybe all of the above. Hill’s attempts to quit cold turkey appeared to be very earnest and objectively pursued, though her editorial approach is not without a point of view: “The tech giants, while troubling in their accumulation of data, power and societal control, do offer services that make our lives a hell of a lot easier.”

Do I feel powerless with no control? I do not, but that’s a personal choice.

There, once again, is the age-old privacy paradox, which predates our digital selves. Do we — individually, as a society, as a matter of policy — understand the data-for-value exchange that is inherent not just on the commercial Internet, but in practically every business arrangement we have?

To shut off all data flows might be thought of as a Luddite’s exercise. Every individual can choose to live life this way, at least in some measure. Or perhaps it’s an exercise in being jaded: Among us, there are those who believe social media’s popular “10-year challenge” is a not-so-secret plot to update everyone’s likeness for facial recognition software.

Take a Regular Digital Break, Please

I, too, pursue and relish a weekend where I put my devices away, and go off the digital grid for hours or even one day at a time. A walk in the woods, or park, or beach, with no device in reach — and with just my thoughts — is an empowering and recharging experience (for me). It can drive my friends and family nuts, wondering where I am — but they’re used to it by now.

Mom:

“You didn’t play ‘Words with Friends’ with me yesterday. Is everything OK?”

On the other hand, every day, I observe fellow citizens who seem unable to navigate a sidewalk, or ride an elevator, or even sit at a bar or restaurant, without having their heads down in smartphones. Kudos to them for processing digital information constantly … I think. I certainly can’t do that.

Yet to have a bias — either in practice or in policy — that blocks responsible data flows truly is an exercise in masochism. As participants in the marketing data supply chain, we have ethical (and some legal) obligations to be capable stewards of data. We have associations, self-regulatory codes, and regulators that teach and tell us what to do.

Beyond the Big 5, we also have thousands of companies in the adtech/martech ecosystem — at last count, nearly 7,000. Any could be the next “big thing,” as investment flows seem to indicate.

Slide source: “Outlook For Data Driven Marketing: First Look 2019,” The Winterberry Group, 2019.

On top of these, we have brands and agencies using information, responsibly, to attract (discover), create (convert) and retain (serve) customers. This is not evil. This is innovation — and we shouldn’t fault a data-flow framework that facilitates commerce, consumer choice and diversity of content. We should scrutinize it for harmful data usage — and regulate the harm.

In short, every information use should be vetted. Wisdom, rather than fear, must be our starting point in such examination, with a healthy dose of data reverence. In advertising, we can (and must) have both consumer privacy protection and digital innovation. Such dual, laudable outcomes cannot be achieved, however, if we are required simply to shut down.

 

 

Thankful for Being ‘Reasonable’ With Data-Driven Marketing


Marketers were given an early Thanksgiving: a recognition by the Federal Trade Commission that “data” is indeed the fuel of the digital economy, and that most consumers are pragmatic toward how data, and data-driven marketing, finances the online content they rely upon and enjoy.

Some might call such a view logical. Some factual. Some realistic. Let’s call it all of these and “reasonable,” as well.

On Nov. 9, the FTC, in comments to the U.S. Department of Commerce’s National Telecommunications and Information Administration regarding the Administration’s approach to consumer privacy, said:

“The FTC supports a balanced approach to privacy that weighs the risks of data misuse with the benefits of data to innovation and competition. Striking this balance correctly is essential to protecting consumers and promoting competition and innovation, both within the U.S. and globally.”

The comments articulate how the FTC has pursued enforcement action under its existing privacy authority, drawing a bright line around various consumer harms: financial injury, personal injury, reputational injury and unwanted intrusion, the latter encompassing the sanctity of consumers’ homes and intimate lives.

A Succinct Recognition of Responsible Data Usage

The comments call out the benefits of responsible data flows in our economy (note: footnotes are omitted in excerpts):

“In addition to considering the risks identified above, any approach to privacy must also consider how consumer data fuels innovation and competition. The digital economy has benefited consumers in many ways, saving individuals’ time and money, creating new opportunities, and conferring broad social and environmental benefits. For example, recent innovations have enabled:

  • Better predictions about and planning for severe weather events, including updated flood warnings, real-time evacuation routes, and improved emergency responses and measures, that can allow people to plan for and avoid dangerous conditions.
  • Improved consumer fraud detection in the financial and banking sector, as institutions can obtain insights into consumers’ purchasing and behavior patterns that will allow them to proactively identify and immediately stop fraudulent transactions when they are discovered.
  • Free or substantially discounted services, including free communications technologies (email, VoIP, etc.), inexpensive and widely available financial products, and low-cost entertainment.
  • Safer, more comfortable homes, as IoT [Internet of Things] devices detect flooding in basements, monitor energy use, identify maintenance issues, and remotely control devices, such as lights and ovens.
  • Better health and wellness, as a variety of diagnostics, screening apps and wearables enable richer health inputs, remote diagnosis by medical professionals, and virtual consultations.
  • More convenient shopping, as retail stores track both sales and inventory in real-time via shopping data to optimize product inventory in each store.
  • More relevant online experiences, as retailers provide customized offers and video services recommend new shows.
  • Easier-to-find parking, as cities deploy smart sensors to provide residents with real-time data about available parking spots.
  • Increased connectivity, as consumers can get immediate answers to questions by asking their digital voice assistants and can remotely operate devices, such as lights and door locks, with a voice command or single touch on a phone.

“Privacy standards that give short shrift to the benefits of data-driven practices may negatively affect innovation and competition. Moreover, regulation can unreasonably impede market entry or expansion by existing companies; the benefits of privacy regulation should be weighed against these potential costs to competition.”

While we may believe the FTC is stating the obvious here, such matter-of-factness about marketplace observations cannot be taken for granted. An entirely new Internet regulation and regimen emanating from Europe (one with its own U.S. fan base among some academics and privacy fundamentalists) would take direct aim at these social and economic outcomes through cumbersome, inflexible, rigid consent schemes. These must be resisted, not because privacy protections are not worth pursuing (they are), and not because consent is unimportant (it is important), but because, as the FTC comments also show, effective privacy enforcement is already soundly in place in America. And where new regulations are enacted, they ought to be flexible, measured and balanced. “Reasonable” is the concept in play here.

A Risk-Based Approach

Thankfully, “a risk-based approach is in the FTC’s institutional DNA,” the FTC reports. For example, in this important area of consumer control, the commission writes (again, footnotes omitted):

“The FTC has long encouraged a balanced approach to control. Giving consumers the ability to exercise meaningful control over the collection and use of data about them is beneficial in some cases. However, certain controls can be costly to implement and may have unintended consequences. For example, if consumers were opted out of online advertisements by default (with the choice of opting in), the likely result would include the loss of advertising-funded online content.”

This is a pivotal moment. In effect, this is a recognition of two decades of responsible data collection and use at work in the Internet economy, and perhaps another 100 years of similar data use in the offline economy. In both cases, advertisers and marketers have implemented effective self-regulatory codes of conduct (disclosure: my professional relationships support such codes) that are backed by enforcement and accountability, including referral of non-compliant companies to government agencies. The FTC actually used the NTIA comments to call out enforcement cases where private firms purportedly failed to follow self-regulatory codes of conduct.

As we debate public policy for privacy and security in the next Congress (and in state legislatures, too), and among ourselves as citizens and industry participants, it’s wise to understand and appreciate what responsible data collection and use have brought forth in our economy, and how reasonable, risk-based approaches to policymaking can best serve us all.

While I say “thank you” to the FTC for recognizing this, I’m also thankful for an industry of practitioners who recognize and understand how and why data stewardship matters.

Warning: Marketing Data Policy-Making Ahead in the U.S.


U.S. data policy-making efforts make certain assumptions about marketing. It’s as if there’s a sign coming, saying: “Data Is a Weapon.” But what if lawmakers instead assumed data was a force for good?

Certainly, when dealing with the European data protection community — which may seek fines of up to 4 percent of your global annual revenue — it is wise to be deferential, even complimentary.

Apple CEO Tim Cook, in his speech last week to European data commissioners, one that hearkened back to President Eisenhower’s 1961 warning about the “military-industrial complex,” identified commercial data collection interests as a “data-industrial complex” that has “weaponized” the collection and monetization of data with great efficiency.

Reading this, one might extrapolate that all data collection is worrisome, and that this so-called trade in data amounts to “surveillance” that is inherently harmful.

To some, this might be 1961 all over again — or 1984, for that matter.

https://youtu.be/axSnW-ygU5g

In reality, some may be singing from the choir book brought to us by European Parliamentarians. Every time I see a cookie notice during my U.S. website visits, I’m reminded, perhaps gently, that our sovereignty is being visited upon by foreign lawmakers. Europe’s leaders are trying to remake the Internet in their image — while China’s leaders do the same — and the world may be a lot less friendly as a result.

Considerations of a Healthful Policy Debate

As consumers, we may welcome privacy and security in our nation’s Internet public policy debate. All is not the same, however. We must handle our own policy-making with utmost care. Europe’s General Data Protection Regulation (GDPR) is one model — but is this European law really the right fit for the United States or, for that matter, other regions of the world?

In the private sector:

  • Consider the role that ad-financing (read, digital data) plays in ensuring quality journalism necessary for a healthy democracy.
  • Consider what role consent restrictions (read: opt-in) would play in diminishing the ability of start-ups and mid-sized companies to compete with established companies — that is, competition in the digital economy.
  • Consider an appreciation of the long-tail of the Internet — and the diversity of content and niche interests that meet consumer demands, made available through small publishers.
  • Focus on who is at the center of privacy restrictions — the citizen, digital user and the consumer. In every aspect, what are the trade-offs that individuals would experience when responsible data flows are effectively shut down?
  • Appreciate that all data are not the same. Are there data collection scenarios where there is a greater likelihood for harm? Are there categories of personal and user data that are more harmful than others — to the interests of that individual? In the United States, we already highly and wisely regulate such data as credit, health, children’s data, government identification numbers and more.
  • And importantly, understand how private sector use of data — and public sector use of data — differ. How should the two exchange, and not exchange, data between them?

Globally, and certainly here in the United States, data enables commerce, consumer choice and diversity of content. Truly, the commercialization of data drives powerfully beneficial social aims. Such aims deserve recognition as policymakers weigh measured regulation.

Some global business leaders, for whatever motivations, heap praise on GDPR, but there’s danger in adopting “one size fits all” regulation. “Surveillance,” too, is a very loaded word — especially where responsible data collection and use represent an unparalleled force for good in the private sector: jobs, the economy, competition, ad-financed content and services, and much more. Even governments package public records for beneficial use in the private sector. Remember, the only reason businesses exist is to create and serve a customer.

Where Surveillance Is a Material Concern

On the other hand, where surveillance truly is not a loaded word is where the public sector gathers and uses digital and mobile information to monitor citizens. Or where a government, foreign or domestic, demands the handover or censorship of such information from and of the private sector.

Here, I applaud close — very close — attention to what our government, or any other government, does with digital data, including data that exists in the private sector. Within the U.S., warrants, court orders and subpoenas should be demanded before private sector entities satisfy any government requests for information (and/or deletion of information). Where government indeed has honest objectives — combatting fraud, terrorism and other crimes, or advancing public safety or health, for example — it is wise to provide for independent judicial review as a necessary check and balance to validate such laudable goals.

Data is a weapon only when it’s perversely used to disserve a consumer, a voter or a democracy. Let the private sector freely use information responsibly for all else, for it unleashes forces for good that serve consumers, the economy and robust discussion.