Why Everyone Benefits When Marketing and Privacy Are Aligned


Privacy is one of the most pressing issues facing organizations today. And it’s not just affecting the companies that are making headlines over it, like Facebook, Google, Capital One, and Experian. The recent passage of privacy laws in the United States and abroad, and the resulting potential fines for mistakes, have been a wake-up call for many. Marketing teams have always needed to consider privacy, but now it’s imperative, and there are significantly higher stakes (ahem…billions of dollars).

Noga Rosenthal, Chief Privacy Officer and General Counsel at NCC Media, believes that far too often, marketing and privacy teams unknowingly work against each other, or in silos. Yet it is essential for these two departments to be closely aligned.

She points out:

“Every company is a data company, whether or not they realize it. You have CRM data and employee data. You’re collecting data off your website. Nearly everybody will be impacted by legislation like CCPA [California Consumer Privacy Act] and needs to be paying attention.”

What’s at Stake

The stakes are high, and the risks are greater when there’s a disconnect between the marketing and communications teams and the privacy and legal teams. There are two common vulnerabilities:

1. Corporate marketing and advertising aren’t taking privacy into account when promoting products and services.

  • Corporate advertising is creepy to customers.
  • Your company is using new and trendy technology vendors that haven’t been properly vetted by privacy teams.
  • You’re using terminology in your marketing, like “tracking” and “anonymous,” that will draw scrutiny from lawmakers.

2. Marketing and communications teams aren’t involved in security and privacy breach preparedness and response.

  • Marketing and communications haven’t contributed to the company’s incident response plan.
  • Marketing is looped in too late during a breach and is not given the resources needed to respond to stakeholders and meet disclosure requirements.

Companies that falter can be subject to hefty fines. They could alienate their customers. And they’ll likely find themselves in the middle of a PR nightmare.

The Benefits of Collaboration

“Marketing should have a seat at the table in all things data governance. It’s mission-critical,” says Peg Kuman, Chief Privacy Officer of V12.

Bringing privacy and marketing together benefits everyone. If you’ve ever tried to read a privacy policy, you know that privacy and legal speak needs to be more accessible and consumer-friendly. Disclosures and policies written by privacy teams would surely benefit from a marketing and communications lens.

If marketers are more in tune with privacy, your company can protect its brand reputation and avoid the painful privacy missteps in advertising that we’ve seen with Netflix, Spotify, Tinder, and countless others. For companies that face an incident, collaboration can ensure a proper response, such as how Twitter recently owned up to its privacy mistakes, used consumer-friendly language, and apologized succinctly.

Also, with an overall heightened interest in privacy, companies can provide value to clients and their customers by proactively sharing relevant and easy-to-understand privacy updates. A client outreach strategy can only be effective if it couples the expertise and knowledge of the privacy team with the creativity, strategy, and reach of the marketing team.

According to Rosenthal:

“At times, it feels like marketing and privacy are at odds with each other. But as privacy becomes more important to consumers, and companies like Apple use it as a way to bring in customers and differentiate from competitors, there’s more of a need to lean on each other.”

Where to Begin

There are several ways to open the lines of communication and foster a stronger partnership between privacy and marketing teams.

Establish a Cross-Functional Team

Don’t wait for something bad to happen to get closely aligned. Proactively create a team consisting of privacy, legal, marketing, and communications focused on cross-functional initiatives. Meet regularly to discuss legislation, strategize, and surface ideas.

Use these meetings as a forum for education and awareness. Like Kuman and Rosenthal, most privacy leaders are involved in industry organizations and coalitions. Through their participation, they get vital information that can help their marketing teams.

Commit to Privacy Principles

Privacy principles should align with the company’s mission, vision, and purpose. A great place to start is thinking about what trust and transparency mean for your industry and organization.

Once you’ve determined what privacy means for your organization, make sure it’s clear in everything you and your employees say and do. Better yet, put some marketing power behind those principles, so they become synonymous with your brand.

Prioritize Policies, Protocol, and Incident Response

Your privacy and marketing teams will need to jointly decide where to focus efforts across your various stakeholders, including employees, clients, consumers, prospects, partners/vendors, the media, lawmakers, and investors.

There should be a clear protocol for how marketing and privacy work together, and all parties should understand the role that they play in protecting corporate reputation and respecting consumers.

If your organization doesn’t have a breach response strategy, privacy and marketing should champion the development of one, in conjunction with other parts of the organization, such as technology, information security, and client services. Simulation exercises are valuable ways to identify vulnerabilities and prepare without the intense pressure of an actual crisis.

Raise Awareness Through Education

Privacy is likely not top-of-mind for the majority of your marketing staff, but awareness is critical. Education increases awareness. Curriculum specific to marketing helps the full marketing organization understand their role in supporting the company’s privacy principles. Training can also address when it’s necessary to engage your privacy resources.

Kuman prefers the term “socialization” over training.

She adds:

“Companies should socialize the notion that privacy is how we protect our customer, employee, and business assets.”

Privacy Is Everyone’s Job

Regardless of where privacy laws are headed next in the United States and abroad, we all play a role in privacy protection, and we’ll be more successful if we’re working closely together.

What’s the Price on ‘My Data’? Let the Marketplace Set the Rate

A bipartisan bill in Congress would task the U.S. Securities and Exchange Commission with determining what consumer data is worth, at least when it comes to Big Digital giants. So what’s my data worth?


On the face of it, having the government mirror the private sector, and recognize that consumer data is a valuable asset, is actually quite wise. Data is worth something — and accounting rules, risk management, capitalism, and a reverence for asset protection — all point to a need to understand data’s worth and secure it accordingly. But should the government come up with the arithmetic? Really? And why limit this to Big Digital? Data drives all sectors of the economy!

If this is about commerce and productivity, and facilitating next-generation accounting and capitalism, then I’d be all gung-ho. If it’s about setting the stage for just being punitive, then perhaps we can and must do better.

Take privacy. I’m already getting click fatigue — with permission notices on every site I visit, as well as the apps I use, it’s no wonder people are questioning whether laws like GDPR and CCPA really afford any meaningful privacy protection at all, as well-intended as they may be. Privacy is personally defined — though universal principles must apply. Again, I think we can and must do better.

Recognizing data’s value — as the fuel for today’s economy — means recognizing data’s limitless beneficial uses (and encouraging such uses and further innovation), while putting a no-go ring around unreasonable uses (like throwing elections).

Business Efforts to Calculate Data’s Worth

“My data” is a misnomer. On the data valuation front, we from the direct marketing world — purveyors of personally identifiable information (PII) — have been putting a price on data for years … and understand data’s value, intrinsically. Big reveal: It’s not about me. (Sorry, Taylor Swift.)

Worldata, for example, has been tracking list prices for decades, and dutifully reporting on this. In the world of direct response, there’s “sweat equity” in both response and compiled lists. For response lists, some enterprise built a list of customers (or donors). The value of that list is derived from the shared attribute those customers have – and not, as some privacy advocates would have it, from the sum of one individual after another appearing on that list. With compiled lists, observable data is also harnessed and staged for marketing use – providing a more complete view of prospects and customers. Again, the value is derived from the attributes that data subjects share.

Even in digital data driving today’s media placement for advertising (more accurately, audience placement) — the algorithms deployed in search, social, and display — the values of these formulae are derived from affinities in these proprietary calculations, much of it anonymized from a traditional PII perspective. Yes, there are lots of data — nearly $21.2 billion in U.S. trade alone — but it’s not hoarding; it’s being put to productive use — in effect, 1:1 at mass scale.

With any innovations, there are bound to be mistakes by good companies, and some bad players, too. But it’s amazing to see how the marketplace weeds these out over time, separating the wheat from the chaff. The industry comes up with brand safety, privacy, security, chain-of-trust, and other initiatives to help facilitate more transparency and control. And testing shows which data sources are timely and reliable — and which ones have questionable data quality.

Predict This: Data Unleashed for Responsible Use Unleashes Consumer Benefits

Recently, I heard a current federal official say that data may be fuel — but it’s not like oil. Oil is finite. Data, on the other hand, is a limitless resource — like fusion. And it can be replicated. In fact, he went on to say, the more it is shared for responsible data use, the more consumers, citizens, commerce, and the economy benefit. This is correct. The commercialization of the Internet, indeed, gave us today’s global Digital Economy, giving billions access to information from which they can derive limitless benefits.

That’s why potential breaches of data do need to be risk-assessed, prevented, and understood for likelihood of harm — with data governance and employee training thoroughly implemented. That’s also why government should investigate significant breaches to detect lax practices, and to instruct enterprises how to better protect themselves from bad actors. Here, I can see a viable SEC role, one where all publicly held companies, and privately held ones too, are held to account – not just one type of company.

Where privacy is concerned … don’t just divide Big Digital revenue by the number of users with social accounts — and start speculating about what data about me online may be worth. That immediately starts off with a false assumption, fails to recognize information’s exponential value in the economy, and denies the incredible social benefits afforded by the digitization of information.

The Digital Advertising Alliance (a client) conducted a study in 2016, and found that consumers assign a value of nearly $1,200 a year to the “free” ad-financed content they access and rely upon via digital and mobile. However, if they were forced to pay that amount, most would not be willing (or able) to pay such a premium.

This research shows why we need to protect and facilitate ad-financed content. But it’s part of a larger discussion. It’s about why the commercialization of the Internet has been a 25-year success (happy birthday, October 24), and why we must keep that moving forward. As consumers, we all have prospered! Let’s start our discussion on data valuation here.


2019 to 2022: The Evolution of Consumer Consent and How to Adapt

By opening a dialogue with audiences about data collection and processing, as well as empowering them to decide how their data is used, marketers and publishers can enhance their relationships with consumers.

Editor’s Note: While this piece was originally written for the publishing audience, privacy legislation and consumer consent are still very important topics for marketers to navigate.

The EU’s General Data Protection Regulation (GDPR) laid the foundation for privacy legislation in 2018. One of its key aims was to give consumers more control over their personal information and make users understand they have a choice when it comes to providing consent to data collection and processing.

In the US, initial responses to the GDPR ranged from criticism to paywalls or completely blocking EU visitors from accessing content. While many companies made sure their legal teams were up to speed with the newly introduced regulation, multiple factors are now making US publishers sit up and further address data privacy, associated regulations, and the importance of consumer consent.

Google’s €50 million fine — issued by French regulator CNIL, for lack of valid consent in ad personalization — was one of the first to grab attention. Fines for data breaches might not have been as frequent as some expected, but regulators have had time to build cases and more large fines are expected to come, according to The Wall Street Journal.

In addition, US states are working to implement their own regulations. The California Consumer Privacy Act (CCPA) is due to come into force in January 2020, Vermont’s new legislation is already in place, and a consumer privacy bill has been proposed in New York. The exact requirements of each regulation may differ from the GDPR — the CCPA relies on the user opting out rather than opting in, whereas opt-in is expected as the New York privacy bill develops. There may even be a federal regulation requiring opt-in on the horizon, with privacy advocates such as Apple CEO Tim Cook, as well as members of the Federal Trade Commission, calling for a stricter national privacy law.

Collecting user consent may not yet be a legal obligation, but publishers are realizing that it makes good business sense as consumers demand more control over their personal data. According to GlobalWebIndex, more than 70% of US consumers say they are both more aware of, and more concerned about, how companies use their information than they were 12 months ago, while less than half feel they are in control of their personal data online.

By opening a dialogue with audiences about data collection and processing, as well as empowering them to decide how their data is used, publishers can enhance their relationship with consumers. Demonstrating transparency and responsible data use builds trust, but also educates users via one-to-one communications about the necessity of data to their business models and the inherent value exchange.

Publishers that go beyond the one-size-fits-all approach to create meaningful consent experiences right now will reap the rewards in the long term. The next two years will be critical for consent and there are a number of practical considerations to take into account when implementing consent programs.

Ensure Usability

The consent interface is often the first point of contact between publisher and consumer, so care needs to be taken in its design and functionality. Consent requests need to achieve the perfect balance between giving the user the details they need to make an informed choice and not alienating them with complex jargon they won’t understand or unnecessarily disrupting their user experience – all while ensuring legal compliance. Consent requests should be highly explicit, giving users the power to opt in or out of data collection for specific purposes or by particular companies. Ideally, publishers should test a variety of messaging formats to deliver the best possible experience.
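As a sketch of what “highly explicit” consent can mean in practice (the purpose names and data structure below are hypothetical, not drawn from any particular law or framework), a per-purpose consent record might look like this:

```python
# Hypothetical sketch of per-purpose consent preferences.
# Purpose names and structure are illustrative only.

from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    user_id: str
    # Each purpose maps to an explicit choice: opted in (True) or out (False).
    purposes: dict = field(default_factory=dict)

    def set_choice(self, purpose: str, opted_in: bool) -> None:
        self.purposes[purpose] = opted_in

    def allows(self, purpose: str) -> bool:
        # Absent an explicit opt-in, treat the purpose as refused.
        return self.purposes.get(purpose, False)


record = ConsentRecord(user_id="reader-123")
record.set_choice("ad_personalization", False)
record.set_choice("analytics", True)

print(record.allows("analytics"))           # True
print(record.allows("ad_personalization"))  # False
print(record.allows("email_marketing"))     # False (never asked, so refused)
```

The key design choice is the default: a purpose the user was never asked about is treated as refused, so silence never counts as consent.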

Execute Consent Seamlessly

Once a user submits their consent preferences, publishers need to make sure they are integrated across the advertising supply chain. The IAB released its Transparency and Consent Framework (TCF) as a tool to help publishers and other participants in the digital advertising ecosystem comply with their obligations. The updated second version of the TCF increases the importance of consent by enabling users to object to data collection under legitimate interest, an alternative legal basis for data processing. It’s also important that consent preferences can be communicated across non-IAB vendors.  

Apply Preferences Across Devices

With consumers frequently switching between laptops, smartphones and TVs to consume content, it is best practice for publishers to share choices across multiple devices. The ability to connect user preferences to an authenticated profile and apply these everywhere the user interacts with a publisher’s content saves the user from having to supply consent every time they log in through a different device. An authenticated profile is a tool that allows users to manage their preferences across site, browser, and devices, and allows publishers to collect consent signals based on identity rather than cookies.
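To illustrate the idea (names invented for illustration; a real system would layer this over an identity provider), keying preferences to an authenticated user ID rather than a device cookie means every device sees the same choices:

```python
# Illustrative sketch: consent stored by authenticated identity, not by
# per-device cookies, so one choice follows the user everywhere.
# All names here are hypothetical.

preferences_by_user = {}  # user_id -> {purpose: opted_in}


def save_preferences(user_id: str, purposes: dict) -> None:
    preferences_by_user[user_id] = dict(purposes)


def preferences_for(user_id: str, device: str) -> dict:
    # The device is irrelevant to the lookup: identity, not cookie, is the key.
    return preferences_by_user.get(user_id, {})


save_preferences("reader-123", {"ad_personalization": False, "analytics": True})

# Same answer whether the user shows up on a laptop, a phone, or a TV.
laptop = preferences_for("reader-123", device="laptop")
tv = preferences_for("reader-123", device="smart-tv")
print(laptop == tv)  # True
```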

Publishers in the US that aren’t yet legally obliged to implement a consent program might be hesitant, fearing it will be disruptive to their business or that they will lose advertising revenue if their audiences fail to give permission. However, they need to consider that across the EU, the publishers that experienced the least disruption were those that adapted early and gave themselves plenty of time to get the consent process right.

By 2021, with GDPR well established, the CCPA in force, and other regulations underway, consent will be a user expectation if not a legal obligation. Publishers should start implementing consent programs now to build trusting relationships with their audiences, increase transparency around data processes, and put themselves in a good position to deal with the regulatory changes ahead.

Do Marketers Understand Privacy? Is That Why It’s Disappearing Before Our Eyes?

We should ask: What’s the nature of the “violation” of “privacy”? How much do we really want it, and at what price (money or benefits or both) are we prepared to make believe that the violation didn’t really matter so much in the first place?

For many people, privacy is about being able to close and lock the bathroom door.

It is about documents or events which are meant to be “Private and Confidential,” but seldom stay that way. A wise friend, the CEO of a large and successful company, joked that the only way to keep something truly private was to tell no one, not even his wife, and to post it on the office bulletin board under an instruction:


“It’d be cool to talk about how marketers can create relevant marketing in a way where you don’t feel violated?” wrote a Target Marketing reader.

Or, put somewhat differently, we should ask: What’s the nature of the “violation” of “privacy”? How much do we really want it, and at what price (money or benefits or both) are we prepared to make believe that the violation didn’t really matter so much in the first place?

Like it or not, as marketers, part of our job (the bottom line of the plethora of articles headed “9 Ways to Penetrate the Inbox”) is to discover the best ways — nine or 29 of them — to get past the privacy filter we think we have erected around ourselves. “But then how would I know about that great 24-hour deal on the camera I’ve always wanted?” asks the shutterbug, oblivious to having been tracked to the mall and bombarded with seductive cell phone messages just as he was approaching the camera store.

How many of us give more than a passing thought to the degree to which our targeted marketing initiatives may represent a violation of the privacy of our prospects? Perhaps we should think of it as one of those unquantifiable #MeToo questions captured by the song:

“You must remember this
A kiss is just a kiss
A sigh is just a sigh
The fundamental things apply
As time goes by”

Dooley Wilson

Are we as marketers aware of and sensitive to the “fundamental things,” and have we any measure of how much each of our prospects values privacy? We should know that no two prospects are really alike, no matter how we read the tea leaves of our statistical analysis. And one of the critical differences is in their unique perceptions of what does and does not constitute an invasion of their privacy.

One would have imagined that all the negative noise about Facebook’s violation of its subscribers’ privacy, and the $3 billion to $5 billion fine it is going to have to pay to the Federal Trade Commission, would focus its attention on this issue. But that’s yesterday, not today.

The fine was described by Kara Swisher in The New York Times as “a parking ticket. Not a speeding ticket. Not a DUI — or a DUI(P), data under the influence of Putin. A parking ticket. … That’s why its stock rose significantly after the news … In other words, the privacy concerns raised loudly by politicians and the media have not hurt Facebook’s growth … they’re going to need a bigger fine if they actually want to stop Facebook from violating its users’ privacy.”

Today, I wouldn’t bet that this would make all that much difference. The website Popular Info asked: “ … what is the point of the Facebook policies if they are not enforced in advance of publication?”

Facebook is rightly getting blasted for failing to prevent Trump 2020 ads, as well as white supremacist, anti-Muslim false news, and anti-LGBT content, all of which would surely have reached their target audiences at least once before being taken down and the advertisers forced to replace the offensive content. And that involves money, something Facebook has shown little eagerness to give up.

There are only two things that might make platforms play by the rules:

  1. Platforms such as Facebook could demand that each campaign or ad buy (even electronic) carry a certification from the advertiser that the content does not violate the platform’s published rules, with a significant financial penalty to the advertiser for non-compliance and a prohibition from using the platform for, say, three months for the first violation, six for the second, and so on. This would awaken advertisers to the cost of breaking the rules, and would cost the platforms considerably more than a parking ticket in lost advertising revenue — certainly enough to make them take notice.
  2. The only other thing that might change the current acceptance of privacy violations is a noisy revolution from consumers — not impossible to imagine, but hardly yet to be seen marching toward our doorsteps.

Penalizing the advertiser and the platform for usage violations would almost certainly work. As Deep Throat advised in “All the President’s Men,” “follow the money.”

Turning the rising tide of privacy issues is more problematic. A recent study published by Iterable reported:

  • 48% of shoppers will share data for more personalized service (Deloitte)
  • 11 hours a day is spent by Americans engaging with electronic media (Nielsen)
  • 84% of marketers use customer data to inform their marketing (eMarketer)
  • 76% of marketers are prioritizing customer loyalty over customer acquisition in 2018 (IDG)

This research answers our reader’s question (how can marketers create relevant marketing in a way where the customer doesn’t feel violated?), a difficult conundrum. Put simply: If today’s 11-hour electronic media addicts want all the promised goodies waiting for them out there in the digital universe, and they don’t rate more than “bathroom door privacy” as high on their priority lists — 48% don’t seem to care — we are going to have to face this reality: For those who want to live in the digital world, our privacy is disappearing before our eyes. Should we be up in arms? Probably. But in fact, we are not. And there is very little we seem to want to do about it. Whether it could rightly be called a “violation” is a question more of perception than substance.

The rest of us, those to whom privacy is important, will just have to bow to the majority, do without privacy and take solace in this perceptive limerick penned by a late biomedical electrical engineer who worked at that by day and who wrote science fiction (and limericks) by night:

Was there no Life before there was Twitter?
Was it stodgy, lackluster or bitter?
I find Life too fleeting
To spend time in Tweeting,
I’m a face-to-face kind of a critter!

Don’t Be a Data Hoarder — Why Data Governance Matters in Marketing


They say data is an asset. I say it, too. If collected data are wielded properly, they can definitely lead to financial gains, either through a revenue increase or cost reduction. But that doesn’t mean that possessing large amounts of data guarantees large dollar figures for the collector. Data governance matters, because the operative words in my statement are “wielded properly,” as I have been emphasizing for years through this column.

Plus, collecting data also comes with risks. When sensitive data fall into the wrong hands, the result is often a direct financial burden for the data collector. In some countries, an assumed guardian of sensitive data may face legal charges for mishandling it. Even in the United States, which is known as the “freest” country for businesses when it comes to data usage, a data breach or clear abuse of data can lead to a publicity nightmare for the organization, or worse, large legal settlements after long and costly litigation. Even in the most innocuous cases, mistreatment of sensitive data may lead to serious damage to the brand image.

The phrase is not even cool in the business community anymore, but “Big Data” worked like a magic word only a few years ago. In my opinion, that word “big” in Big Data misled many organizations and decision-makers. It basically gave a wrong notion that “big” is indeed “good” in the data business.

What is “good,” in a pure business sense? Simply, more money. What was the popular definition of Big Data back then? Three Vs, as in volume, velocity and variety. So, if varieties of data in large volumes move around really fast, it will automatically be good for businesses? We know the answer by now, that a large amount of unstructured, unorganized and unrefined data could just be a burden to the holder, not to mention the security concerns listed earlier.

Unfortunately, with the popularity of Big Data and the emergence of cloud computing, many organizations started hoarding data in the hope that collected data would turn into gold one day. Here, I am saying “hoarding” with all of the negative connotations that come with the word.

Hoarders are the people who are not able to throw away anything, even garbage. Data hoarders are the same way. Most datasets are huge because the collector does not know what to throw out. If you ask any hoarder why he keeps so many items in the house, the most common answer would be “because you never know when you need them.” Data hoarders keep every piece of data indefinitely for the same reason.

Only Keep Useful Data

But if you are playing with data for business purposes, you should know what pieces of data are useful for decision-making. The sponsor of any data activity must have clear objectives to begin with. Analysts would then find out what kind of data are necessary to meet those goals, through various statistical analyses and cumulative knowledge.

Actually, good analysts do know that not all data are created equal, and some are more useful than others. Why do you think the notion of a Data Lake became popular following the Big Data hype? Further, I have been emphasizing the importance of an even more concise data environment (I call it an “Analytics Sandbox”), because the lake water in the Data Lake is still not drinkable. Data must get smaller, through data refinement and analytics, to be beneficial for decision-makers (refer to “Big Data Must Get Smaller”).

Nonetheless, organizations continue to hoard data, because no one wants to be responsible for purging data that may be useful someday. Government agencies may have some good reasons to maintain large amounts of data, because the cost of losing or misplacing data about, say, terrorist activities is too high. Even in that case, however, we should collectively be concerned if the most sensitive data about us — such as our biometric data — reside on some government agency’s server somewhere, without clear and immediate purposes. In cities like London or Paris, cameras are on every street corner, linked to facial recognition algorithms. We tolerate that because the benefit outweighs the risk (or so we think). But that doesn’t mean we need not be concerned with data breaches or abuse.

Hoarding Data Gives Brands the Temptation to Be Creepy

If the data are collected by businesses for their financial gains, then the subjects of such data collection (i.e., consumers) should question who gave them the right to collect data about every breath we take, every move we make and every claim we stake. It is one thing to retain data about mutual transactions, but it is quite another to collect data on our movement or whereabouts, unilaterally. In other words, it is one thing to be remembered (for better service and recommendation in the future), but it is another to be stalked (remember “Every Breath You Take” is a song about a stalker).

Have you heard a story about a stalker who successfully courted the subject as a result of stalking? Why do marketers think that they will sell more of their products by stalking their customers and prospects? Since when did being totally creepy – as in “I know where you are and what you’re doing right now” – become an acceptable marketing tactic? (Refer to “Don’t Do It Just Because You Can.”)

In fact, even if you do possess such data, in the interest of “not” being creepy, you must make your message more innocuous. For example, don’t act like you are offering an item because you “know” that the target looked around similar items recently. That kind of creepy approach may work once in a while, but let’s not call that a good sales tactic.

Instead, sellers should make gentle nudges. Don’t say “I know you are looking for this particular skin care item.” The response to that would be “Who the hell are you, and how do you know that?” Instead, do say “Would you be interested in our new product for people with sensitive skin?” The desirable response would be “Hey, I was just looking for something like that!”

The difference between creepy stalking and gentle nudging is huge from the receiving end.

Through many articles about personalization, I have been emphasizing the use of model-based personas, as they pack so much information in the form of answers to questions and cover the gaps left by missing data (as we’d never know everything about everyone). If I may add one more benefit of modeling, it converts data into probabilities. Raw data is about “I know she is looking for a particular high-end skin care item,” where coverage of such data is seriously limited, anyway. Conversely, model scores are about “Her score for high-end beauty products is 8 on a 10-point scale,” even if we may not have concrete data about that specific interest.

Now, users who only have access to the model score — which is “dull” information, in comparison to “sharp” data about some verified behavior — would be less tempted to say “Oh, I know you did this.” Even for non-geeky types, the difference between “is” and “likely to be” is vast.
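As a minimal sketch of that conversion (the feature names, weights, and product category below are entirely hypothetical; a real model would be fit on historical response data), turning sharp behavioral flags into a dull 1-to-10 score might look like:

```python
import math

# Hypothetical weights for a "high-end beauty products" propensity model.
# In practice these would be learned from historical response data.
WEIGHTS = {"visited_beauty_pages": 1.2, "past_premium_purchase": 2.0, "email_opens_90d": 0.4}
BIAS = -2.5

def propensity(profile: dict) -> float:
    """Logistic model: raw behavioral signals -> probability in (0, 1)."""
    z = BIAS + sum(w * profile.get(k, 0) for k, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

def decile_score(p: float) -> int:
    """Collapse the probability into a 'dull' 1-10 score for marketing users."""
    return min(10, int(p * 10) + 1)

engaged = {"visited_beauty_pages": 1, "past_premium_purchase": 1, "email_opens_90d": 3}
cold = {}
```

A marketer who sees only `decile_score(...)` can say “likely to be interested,” never “I know you looked at this.”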

If converting sharp data into innocuous probability scores through modeling is too much to start with, then at least categorize the data, and expose data points to users that way. Yes, we are living in the world of SKU-level product suggestions (like Amazon’s), but as a consumer, have you ever “liked” such blunt suggestions, anyway? Marketers do it because such personalization performs better than doing nothing at all, but the practice is hardly ideal, for many reasons (being creepy is one; refer to “Personalization Is About the Person”).

The saddest part of all this is that most marketers don’t even know how to fully utilize what they have collected. I’ve seen too many organizations that are still stuck using a few popular data variables repeatedly, while hoarding data indiscriminately. If that is the case, why risk all of those privacy and security concerns, not to mention the data maintenance cost?

Have a Goal for All of That Data

If analytics is part of the process, then the analysts will tell you with conviction that you don’t need all those data points for certain types of prediction. For instance, why risk losing a batch of credit card numbers, when the card type or payment method is all you need to predict responses and propensities on a customer level?
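That kind of data minimization can be illustrated with a deliberately simplified sketch (real card-brand BIN tables are far more detailed than these few prefixes): derive the predictive feature once, then discard the raw card number.

```python
def card_brand(pan: str) -> str:
    """Derive only the card brand from a card number, so the number itself
    need not be stored. Simplified BIN prefixes, for illustration only."""
    if pan.startswith("4"):
        return "visa"
    if pan[:2] in {"51", "52", "53", "54", "55"}:
        return "mastercard"
    if pan[:2] in {"34", "37"}:
        return "amex"
    return "other"

def minimize(record: dict) -> dict:
    """Keep the predictive feature (brand), drop the sensitive raw value."""
    out = dict(record)
    out["payment_brand"] = card_brand(out.pop("card_number"))
    return out
```

The record that reaches the analytics environment now contains the feature with predictive value and nothing worth stealing.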

Of course, the organization must first decide what types of models and predictions are necessary to meet its goals. But that is the beginning of the whole analytics game, anyway. Analytics is not about answering the wishful thinking of data hoarders; it should be a goal-oriented activity, with carefully selected and refined data serving clear purposes.

A goal-oriented mindset is even more important in the age of machine learning and automation, because we should never automate bad behaviors. Imagine a powerful marketing automation engine in the hands of data hoarders. Forget about organizational inefficiency. As a consumer, don’t you get a chill down your spine just imagining how creepy the outcome would be? Well, maybe we don’t really have to imagine it, as we all get bombarded with ineffective and not-so-personal offers every day.


So, marketers, have clear purposes in your data activities, and do not become mindless data hoarders. If you do possess data, wield them properly with analytics. And while you’re at it, purge pieces of data that do not fit your goals. That “you never know” attitude really doesn’t help anyone. And you are supposed to know your own goals and what data and methodologies will get you there.

As Amped-Up Ad, Data Privacy Laws Near, Self-Regulated Programs Matter More

As we prepare ourselves for federal (and state) legislation around privacy and advertising, it’s worth taking account of our own industry’s self-regulated programs — both those here at home and worldwide.


Why? Because even in an age of regulation, self-regulation — and adherence to self-regulatory principles and ethics codes of business conduct — matter. One might argue that legal compliance in industry is good enough, but business reputations, brand equity and consumer trust are built on sterner stuff.

Having a code of conduct is exemplary in itself, but I’d like to address a vital component of such codes: enforcement.

Transparency & Accountability in Advertising Self-Regulation Matter Greatly. | Credit: Chet Dalzell

Credibility in Codes Requires Peer Review & Accountability

Behind the scenes, every day, there are dozens of professionals in our field who serve — as volunteers and as paid professionals — to monitor the ethical practice of advertisers, who devise and update the codes we adhere to, who educate companies that proactively reach out to them, who work with companies and brands that go astray to reach resolution, and who enforce the codes and refer non-compliant companies to government agencies, when necessary.

They may take complaints directly from consumers, competitors and industry observers. They may employ technologies and their own eyes and ears to monitor the marketplace. They may meet regularly as volunteer juries to deliberate on any need for corrective action. And, usually, they have a “contact us, before we contact you” effect: brands and businesses can proactively ask ethics programs about the “right” way (by the consumer) to execute a marketing practice, so a mistake doesn’t prompt a formal inquiry after the fact.

Importantly, credibility depends, too, on reporting publicly on outcomes — potentially to “name and shame,” but most often to work cooperatively with businesses and to serve as an industry education vehicle in the reporting of correction and the resolution process. Generally, “punitive” is when a non-cooperative company is referred to a government agency for further action. Government agencies, for their part, tend to wholeheartedly welcome any effective effort to keep the marketplace aligned with the consumer. It helps when brands and consumer interests are in sync.

Accountability Programs Deserve Our Industry’s Expertise & Ongoing Financial Support

All told, these important players in our field serve us well, even as we face what might be referred to as co-regulation (government regulation on top of self-regulation). While any potential business mishap — for example, in the handling of consumer data or the questionable content of an ad — has its own set of facts and ramifications, a demonstration of good-faith efforts to adhere to ethical business practices might be seen as a mitigating factor, even as a brand finds itself needing to take a corrective action.

Agility, flexibility and responsiveness … these are all attributes of successful self-regulation — as well as successful accountability. Effective self-regulation serves to keep pace with innovations in our field, and “point the way” for other companies, as issues arise. (The rigidity of laws rarely can accommodate such innovations.)

While industry professionals may serve as volunteers on juries and review panels — it can be fascinating to serve on such panels — there is almost always an infrastructure of programs and staffs underpinning self-regulation success. Trade associations may finance some of these efforts with membership dollars — but usually businesses can lend their own resources directly, too. It’s great to have a seat at the table.

Marketing Ethics & Self-Regulation Programs — A Partial Listing

There are likely many more codes of conduct — particularly in vertical fields (pharma, travel, non-profit, retail, etc.) — but here is a brief listing of advertising-related codes and programs that may be helpful to catalog, bookmark, research and support, with some of which I’ve had the honor to be associated:

Please feel free to use the Comments section to suggest others. And thank you to every volunteer and staff person who serves or has served in an industry accountability capacity. It makes a world of difference, with marketplace trust of advertising and advertisers being the ultimate goal.

How Push Notifications Are Exposing Consumers to a New Breed of Cyber Attack

Push notifications, powerful engagement tools for marketers and publishers on mobile, are exposing readers to new types of cyber attacks known as push lockers. Marketers need to recognize these risks and ensure they’re taking the steps necessary to protect their audience.

Editor’s Note: This article was originally written for the publishing industry, but a number of marketers also employ push notifications, and should be aware of possible cyber security threats.

It should come as no surprise to anyone involved in the digital advertising ecosystem that fraudsters are always looking for new methods to target users with sophisticated digital attacks. As soon as innovative new ways of engaging with users are developed, cyber criminals aren’t far behind with a method for exploiting these innovations, particularly when there’s money to be made. Now, as push notification ads grow in popularity, a new threat to user security is growing within the format: push lockers.

Upon identifying these push-notification-specific lockers between February and March, AdSecure — an ad security verification tool and my employer — saw a 563% increase in the detection of browser locker attacks; at the time of writing, we have protected our partners from more than 20 unique push lockers in under 24 hours.

While push notifications are a popular way for publishers to engage their readers, publishers must recognize the growing risk and take the necessary steps to ensure that their readers are protected. That includes working with ad partners who have the necessary technology to identify these cyber security threats and thwart them.

What Is a Push Notification Ad?

Push notification ads are simple clickable messages, accompanied by a small image, that are delivered to desktop browsers or mobile devices, but only once a user has consented to receiving them. This is a key point: because users have agreed to see the ads, they are perceived as less intrusive than traditional formats and tend to draw higher engagement.

Push notifications work by displaying an initial permission request — managed by the browser — when a user is visiting a site for the first time. Once the user agrees to receive these push notifications, they will receive them based on the frequency set out by the advertiser. Should a user opt not to see push notifications, the browser logs this choice as well, and they won’t be asked to subscribe to them again.

What Is a Push Locker?

The push notification format, while relatively new, is growing in popularity within the online marketplace for all the reasons mentioned previously: users have to opt in to see them at all, and with that consent comes a higher rate of engagement. Brands using push notifications are seeing increased click-through rates, and just as marketers are seeing the clear benefits the format provides, cyber criminals are becoming wise to the potential for driving malicious campaigns straight to users’ screens. What has developed out of these sinister intentions is a new form of browser locker specifically designed around the natural behavior of a push ad.

When you make the choice to opt in, or out, of receiving push notifications on a particular site, the browser manages the request and saves the choice. However, it’s the way the browser saves this choice — by domain or by subdomain — that can expose the user to trouble.

What happens if the user opts out, but the website redirects him automatically to another subdomain? Can you guess what’s coming? This allows the user to be prompted again to accept the push notification. So naturally, he declines this new request, and then he’s sent to yet another subdomain and asked again, and again, and again. Suddenly he is trapped in an endless looping push notification nightmare, and he can only escape it by giving in and “consenting” to receive the push notification.

Incredibly annoying, right? But this is tame compared to what other push lockers are capable of.
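The keying problem at the heart of the loop can be shown with a toy simulation (an assumption on my part: real browser permission storage is more involved, but the per-origin keying is the crux):

```python
permission_store = {}  # origin -> "granted" | "denied"

def visit(origin: str, user_answer: str) -> bool:
    """Return True if the browser had to prompt (no stored decision for this origin)."""
    if origin in permission_store:
        return False
    permission_store[origin] = user_answer
    return True

# The user declines every time, but each redirect lands on a fresh subdomain,
# so the stored "denied" for the previous origin never applies.
prompts = sum(visit(f"sub{i}.creepy-ads.example", "denied") for i in range(5))

def registrable_domain(origin: str) -> str:
    """Naive eTLD+1: if decisions were keyed this way, one 'denied'
    would cover every subdomain and the loop would collapse."""
    return ".".join(origin.split(".")[-2:])
```

Under per-origin keying the user is prompted on all five redirects; keyed by registrable domain, all five origins collapse to one entry and only the first visit would prompt.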

What Types of Push Lockers Are Out There?

There are various types of push lockers, some more sophisticated than others. Here are two examples:

Browser Hijacking

Cryptocurrency mining is a popular reason for cyber criminals to hijack a user’s browser: the user is unaware that his computer’s power is being secretly used to mine cryptocurrency for the hijacker. A push locker will keep the user locked on the consent page until he accepts the push, all the while quietly mining cryptocurrency in the background.

Users who opt in are then redirected to a new offer page, which also launches the cryptocurrency miner, leaving the user with no safe option. When this type of push locker is implemented on a mobile browser, the entire device is rendered useless for the owner, again until he consents. In all cases, the looping push notification locks the user into an action that he absolutely does not want to take, and puts him at severe risk of exposure to exploits or other security breaches.

Full-Screen Hijacking

If a user clicks somewhere on the page other than the buttons to allow or block the push notification, the browser switches to full-screen mode. That prevents the user from doing anything else until he accepts the push notification, which in turn leads him to a scam offer, a forced malware download, or a similar security threat.

What’s the Solution? 

The relative speed at which push lockers have appeared on the scene has caught some ad verification providers off guard. They either weren’t aware of the problem quickly enough, or they aren’t using the modern technology needed to detect push lockers with any degree of consistency and precision.

Push lockers are sophisticated and pernicious, and in order to catch them early and often, the ad verification scanning technology being used needs to be based on the most modern browser technology available, particularly a crawler powered by Chrome, as Google’s browser is the most commonly used.

As more publishers and ad platforms begin to work with the push notification ad format, push locker attacks will spread across the digital ads landscape. As a publisher, make sure that your partners are working with an ad verification provider that has the resources and the knowledge needed to track down push lockers and keep them from hurting your end users.


Marketers Doing the Data Privacy Balancing Act Ask What ‘I Want My Privacy’ Means

It’s not just policymakers who are trying to figure out how to act on consumer sentiments toward data privacy. We all, overwhelmingly, want it — business and consumer.


We are all seeking a U.S. federal privacy law to “repair” what may be broken in Europe (hey, the toaster needs fixing), and to correct any perceived privacy shortcomings in California’s new law (scheduled to take effect in January). Will such a federal law pass this year?

One of the ongoing challenges for policy in this area is what’s been called the privacy paradox. The paradox? Privacy in the form of consumer attitudes, and privacy in the form of consumer demands and behaviors, rarely are in sync. Sometimes, they are polar opposites, simultaneously!

  • Should law be enacted on how we feel, or respectful of what we actually do?
  • How do we define privacy harms and focus regulation only on what is harmful, while going light (or very light) on, or even fostering, wholly beneficial uses?
  • Should private sector controls and public sector controls be differentiated?
  • Do existing laws and ethical codes of conduct apply, and how might they be modified for the digital age?

On top of this, consumer expectations around data and technology are not fixed. Their comfort levels with how information is used, at least in the advertising sector, change over time. In fact, some marketers can’t keep pace with consumer demands to be identified, recognized and rewarded across channels. Generations, too, differ in attitudes and behaviors.

What’s creepy today may in fact be tomorrow’s consumer-demanded convenience.

Case in point: It used to be that people complained about remarketing, the ad that followed them around the Net as they browsed. (All the same, remarketing works; that’s why it was so pervasive.) Today, in a role reversal, consumers sound off when the product they purchased is the same product they still see in the display ad. The consumer has little patience when brand data is locked in silos: in this scenario, the transaction database doesn’t inform the programmatic media buy.

The marketing and advertising business has been trying to solve the privacy paradox since the Direct Marketing Association assembled its first code of ethics in the 1960s and introduced the Mail Preference Service (MPS) in 1971. (Today, MPS is known as dmaChoice, and DMA is now part of the Data Marketing & Analytics division of the Association of National Advertisers.) During the 1970s, consumers could use MPS both to add their names to marketing lists and to remove their names from marketing lists for direct mail. At that time, far more consumers sought to add their names. Later, MPS devoted itself strictly to offering consumers an industry-wide opt-out for national direct mail, with add-ons for sweepstakes and for halting mail to the deceased.

During the ’70s, DMA also required its member mailers (and, later, telemarketers and emailers) to maintain their own in-house suppression lists. These ethics practices were codified, to some extent, when the U.S. government enabled the Do-Not-Call Registry and enacted the CAN-SPAM Act to complement these efforts.

Fair Information Practice Principles: A Framework That Still Works Wonders

So here we are in the digital age, where digital display and mobile advertising are among addressable media’s growing family. Again, the marketing community rose to the challenge, enacting the Digital Advertising Alliance’s YourAdChoices program (disclaimer: a client) and offering consumers an opt-out for data collection used for interest-based advertising across Web browsing (desktop and mobile) and mobile applications.

Over and over again, the pattern is the same: Give consumers notice, give consumers control, prevent unauthorized uses of marketing data, and protect sensitive areas. Recognize advertising’s undeniable social and economic power, enable brands to connect with consumers through relevance and trust, and act to prevent real harms rather than micromanage minor annoyances. Allow marketing innovations that create diversity in content, competition and democratization of information. Let the private sector invest in data where no harms exist.

‘I own my data!’

Data ownership is a dicey concept. Isn’t there sweat equity when a business builds a physical or virtual storefront and you choose to interact with it? Is there not some expectation of data being contributed in fair exchange for the digital content we freely consume and the apps we download and enjoy? And once we elect to become a customer, isn’t it better for the brand to know you better, to serve you better? Shouldn’t loyalty over time be rewarded? That’s an intelligent data exchange, and the economy grows with it.

The demand for access to everything free, without ads, without data exchange, and without payment to creators, is a demand for intellectual property theft. Sooner rather than later, the availability and diversity of that content would be gone. And so would democracy: if you put everything behind an ad-free paywall, then only the elites would have access.

‘But I pay for my Internet service. I pay for my phone service!’

Sure you do, and that pays for the cell towers, the tech and Web infrastructure, and union labor, with some profit for the provider. But unless you’re also paying for subscriptions and content, it’s advertising that foots the bill for the music you listen to, the news you read, the apps you use, and so on. All the better when ads are relevant.

At the end of the day, the consumer is always right and privacy is personally defined.

I’m all for limits on what governments can do with data when it comes to surveillance, and how it goes about maintaining our safety and security (a paradox of its own).

On the private sector side, policymakers might best act to set a privacy floor (do no harm) and, where economic benefits accrue (serving consumers without harm), allow consumers freely accessible tools to set their own privacy walls, using browser settings, industry opt-outs, brand preference centers and other widely available no-cost filters. It’s a wise society that can encourage responsible data flows while blocking irresponsible ones altogether. Get it right, and we all participate in a thriving 21st Century Information Economy. Get it wrong, and Europe and China will set the global rules. With some luck and deliberation, we’ll get this right.

Getting ‘Facebook Sober’? What Marketers Should Know About Consumers’ Attitudes and Social Data


I thought I was pretty clever when someone told me they hadn’t been on Facebook in over a year and I said, “Wow, you’re one-year Facebook sober.” They laughed. The next day, another person said they’d been off for two years — same comment by me, same reaction. But later, I found the term “Facebook sober” on Urban Dictionary — so much for my right to claim ownership of the term.

It’s unlikely that a new 12-step program is going to keep a significant percentage of the more than 2 billion people off of the social media platform any time soon, even though they know Facebook is exploiting their personal data for profit. While studies show that consumers believe the economic benefit of Facebook to them is about $1,000 per year, based on how much they would need to be paid to stay off the platform for that period of time, most will not pay anything to keep a company from tracking their data.

A study published by PlosOne in December 2018 quantified the monetary value that users assigned to participating on Facebook, using an auction experiment design.

Though the populations sampled and the auction design differ across the experiments, we consistently find the average Facebook user would require more than $1,000 to deactivate their account for one year. While the measurable impact Facebook and other free online services have on the economy may be small,* our results show that the benefits these services provide for their users are large.
* (Of course, this statement neglects the $40 billion Facebook realizes in annual advertising revenue.) 

While people claim to be concerned about privacy, they’re not willing to pay for it. A Survey Monkey poll done for the news site Axios earlier this month shows that three-fourths of people are willing to pay less than $1 per month in exchange for a company not tracking their data while using their product — 54% of them are not willing to pay anything.

Researchers at Stanford and NYU sought to determine the effects that Facebook deactivation would have on people’s knowledge, attitudes, moods, and behaviors. “This Is Your Brain Off Facebook,” published by the New York Times on Jan. 13, reports on this study.  A portion of the study participants were paid $102 to stay off Facebook for one month. The researchers stated:

Using a suite of outcomes from both surveys and direct measurement, we show that Facebook deactivation (i) reduced online activity, including other social media, while increasing offline activities such as watching TV alone and socializing with family and friends; (ii) reduced both factual news knowledge and political polarization; (iii) increased subjective well-being; and (iv) caused a large persistent reduction in Facebook use after the experiment.

Despite these findings, the Times reported that “some participants said that they had not appreciated the benefits of the platform until they had shut it down”:

“What I missed was my connections to people, of course, but also streaming events on Facebook Live, politics especially, when you know you’re watching with people interested in the same thing,” said Connie Graves, 56, a professional home health aide in Texas, and a study subject. “And I realized I also like having one place where I could get all the information I wanted, boom-boom-boom, right there.”

As I noted in my post last month, “Gen Z College Students Weigh-in on Personal Data Collection,” some GenZers don’t mind giving up their personal data in exchange for the convenience of targeted ads and discounts; others are uneasy, but all are resigned to the inevitability of it. One student summed up our mass acquiescence, saying:

“I do not feel it is ethical for companies to distribute our activities to others. Despite my feelings on the situation, it will continue — so I must accept the reality of the situation.”

The reality of the situation is that people are not willing to go cold turkey on Facebook.

New Privacy Regulations Coming Your Way: California Consumer Privacy Act (CCPA)

Have you recovered from last spring’s GDPR adrenaline rush yet? As much anxiety as GDPR regulations provoked, that may soon look like the good old days. Now California passed a privacy initiative you will be expected to follow starting Jan. 1, 2020.

Editor’s Note: While this piece is directed at publishers, CCPA also will be something marketers will have to be compliant with, just like GDPR.

Have you recovered from last spring’s GDPR adrenaline rush yet? Everybody in publishing was nervous about finding the right way to comply with new European privacy regulations. It did not seem like there was one clear path to compliance.

As much anxiety as GDPR regulations provoked, that may soon look like the good old days. At least in the EU, 27 countries came together with one edict. They also spent the time necessary to be smart and coherent, whether or not you agree with all the details.

Now California has passed a privacy initiative you will be expected to follow starting Jan. 1, 2020. In many industries, as goes California law, so go U.S. standards. This will be, in practice, a new national standard. California is too dominant a market, larger than most countries on the globe. Add to that a quirk in the drafting of the law, which says you must treat anyone who has left California and intends to return as a Californian. What?

Newly minted California Governor Gavin Newsom hailed the “first-in-the-nation digital privacy law” in his first State of the State address, according to reporting by Wendy Davis in MediaPost. “Companies that make … billions of dollars collecting, curating, monetizing our personal data also have a duty to protect it. Consumers have the right to know and control how their data is being used.”


“The California law was written in five days, and it really shows,” says Christopher Mohr, VP of intellectual property and general counsel at SIIA. “It is an extraordinarily complicated and poorly written statute.” Adding insult to injury, it is grammatically inconsistent and difficult to understand. I can’t imagine what compelled lawmakers to rush such important legislation through. It sounds irresponsible when you consider that the EU worked on GDPR for more than three years.

“This is not the same as GDPR — it’s much broader,” says Mohr. That’s not a statement the already GDPR-fearing publishing industry wants to hear. He continues, “In GDPR the information is tied to a data subject, for example, an individual. The CCPA covers ‘households’ as well as individuals. In addition, the CCPA’s potential ban on the use of information extends not only to the information but to the ‘inferences’ you might draw from it.” Inferences? Yikes! The law goes on to explain what is meant, but the idea of inferring conclusions sounds ripe for misinterpretation to me.

The main goal of the law is to regulate the collection and sale of personally identifiable (PI) consumer data to third parties and service providers. You do not need to get paid for the data: if you disclose it to another party, it is considered a transaction. Using outside vendors to help manage your data is not a problem, because you remain the controlling party.

Everyone will now have the “right to delete.” I asked Mohr to confirm that means deleting people from your database, not from your articles. “That’s the intent, I think. Whether the words match the intent is a completely different issue, and it’s not as clear as it could be. Personal information covers any information that could be associated with an individual.”

Anyone can tell you to cease disclosing their data to others, and you must comply. You cannot deny goods or services to anyone because of their data opt-out. That becomes the new Catch-22: in order to know you are not supposed to have data on an individual, you must have that individual in your database. And since you likely must have data on an individual in order to do business with him or her, how do you conduct business with data exceptions? For those rare European GDPR complainants, admittedly, some American publishers will simply delete them; good-bye. In the Hotel California, “you can check out any time you like, but you can never leave.”
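One common industry workaround for that Catch-22 (an assumption on my part, not anything the CCPA prescribes, and note that even salted hashes may still count as personal information under some readings) is to retain only a salted fingerprint of the opted-out identifier, so the opt-out can be honored without keeping the address itself:

```python
import hashlib

SALT = b"rotate-and-protect-me"  # hypothetical salt; manage as a secret in practice

def fingerprint(email: str) -> str:
    """Salted hash of a normalized email; the raw address is never stored."""
    return hashlib.sha256(SALT + email.strip().lower().encode()).hexdigest()

suppression = set()  # holds fingerprints only, not addresses

def record_opt_out(email: str) -> None:
    suppression.add(fingerprint(email))

def may_contact(email: str) -> bool:
    return fingerprint(email) not in suppression

record_opt_out("Jane@Example.com")
```

Future marketing sends check `may_contact(...)` first; the database no longer contains the person, only an irreversible token of their refusal.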

Preventing a Privacy Tower of Babel

Fortunately, enforcement is by the state attorney general, not by individuals. In other words, thank God this is not an invitation for everyone in California to sue. Of course, this law will be challenged in court. It may be too vague, according to some. And it may be discriminatory since, the way it is written, non-profits (and government agencies) can ignore it and do what they want.

Living in this hyper-intrusive world, it’s hard to disagree with the intent of the CCPA, since we are all being personally data-mined. But play this out: imagine what mischief the other 49 states can do. Davis reports that Washington state “lawmakers are considering a bill that would not only give consumers the right to learn what data is collected about them, but would also allow them to prevent their personal data from being used for ad targeting.”

Federal legislation is coming after the recent grillings on Capitol Hill of some of the leading big-tech luminaries. Typically, federal legislation trumps local law, which is what makes interstate commerce work. Hopefully there will be one law of the land, so any company handling data can maintain sanity rather than bowing to every state, city, or county that passes a law. But in these Alice in Wonderland times, I will leave that speculation to you.

You have complied with GDPR, so that means you now have a DPO (data protection officer). The CCPA gives your DPO a little more to do.

I’m no lawyer, so I’ll provide the usual disclaimer on all of the above. On the other hand, I am a member of and advocate for the Specialized Information Publishers Association, part of SIIA, whose general counsel, Chris Mohr, was invaluable in helping me share an understanding of this law. I believe it makes great sense to occasionally get involved with your peers and work on common problems like privacy laws. As a member of SIPA or Connectiv, you won’t need to call your lawyer every time there is a question about the new privacy landscape. You can take advantage of knowledgeable experts in your corner.

Do I have you pining for the muddy clarity of GDPR yet?