How Long Should Your Content Marketing Articles Be?

How long your content marketing articles are is critical to their success, but there is no one right length. How long any particular article should be depends on what that article’s purpose is, who you’re trying to reach, and where they are in the buying process.

If you’re like most marketers, you’ve got two very different voices whispering in your ears about length for your content marketing materials. They may not be devil and angel exactly, but they are most certainly not in agreement.

On the one hand, er, shoulder, you’ve got a voice telling you that nobody reads anymore, everyone scans, so don’t bother making long-form content. Keep it short and digestible.

On the other shoulder, there is a voice (perhaps in the form of your SEO expert) telling you that every article needs to run at least 300 words — ideally, 500 — to rank well.

As you try to decide which voice to heed, here are a few things to consider.

What Data Tells Us About Content Length

A quick Google search will give you all sorts of information about how long your content marketing pages should be.

Plenty of sources will cite the 300- to 500-word minimum mentioned above.

Neil Patel says that he focuses on content in the 2,000- to 3,000-word range. (While, at the same time, advising us not to write content that is too in-depth!)

Seth Godin seems to be doing quite well for himself with much shorter content.

So who’s right? Everyone and no one. Patel is doing what works for him. Godin has found a different path. You could — and should — argue that those aren’t really fair comparisons, as both of those marketers are “stars” on some level, and have much larger followings than you might.

That’s the point, though; there are always mitigating circumstances. And what’s right for you won’t necessarily work for someone else. Which means what the data should tell you is that you need to gather your own data.

Start with whatever you’re comfortable doing. If more frequent, shorter pieces feel right, dive right in. If you feel that longer-form articles are more your speed, that’s great. In either case, track what you’re doing, monitor the results, and experiment with content at other lengths. (And in other formats, for that matter.)

That’s the only way to find out what your audience wants from you.
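If it helps to make that experiment concrete, here is a minimal sketch (in Python) of how you might compare engagement across length buckets. The file name and column names (article_stats.csv, word_count, avg_time_on_page, conversions) are hypothetical placeholders for whatever your CMS or analytics export actually provides:

```python
# Hypothetical sketch: compare engagement by article length bucket.
import pandas as pd

# Assumed export from your CMS/analytics: title, word_count, avg_time_on_page, conversions
articles = pd.read_csv("article_stats.csv")

# Bucket articles by length so short, medium, and long pieces can be compared.
bins = [0, 500, 1000, 2000, float("inf")]
labels = ["<500", "500-1,000", "1,000-2,000", "2,000+"]
articles["length_bucket"] = pd.cut(articles["word_count"], bins=bins, labels=labels)

# Average engagement per bucket -- your audience's data, not someone else's benchmark.
summary = articles.groupby("length_bucket", observed=True)[["avg_time_on_page", "conversions"]].mean()
print(summary)
```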

What Is Your Article Designed to Do?

The next question you should be asking is, “What is my goal for this content?” Presumably, you’ll publish content of different types and with different goals in mind. Long-form content may be just the ticket for prospects who are close to making a buying decision, while shorter pieces that link to a lead magnet of some kind are the right way to gain trust with prospects who are just discovering you.

Similar differences might exist for different audience segments or for different product/service lines you may be marketing. Be sure you match the length and format of your content to its intended purpose and audience.

How to Use Varying Content Lengths to Your Advantage

Once we come to understand that different content lengths will work for us in different ways, we can layer on the ways in which our content elements should relate to one another. One popular way of thinking about this is the solar system model.

As you’d imagine, the idea here is to have a variety of “smaller” content elements orbiting around a bigger piece of cornerstone content. Not all of those orbiting pieces will necessarily be shorter, but there will be a general progression of large to small as you move away from the center.

For example, a how-to guide in the form of an eBook might be your cornerstone content. Each chapter of that book could perhaps be developed into a presentation (and slide deck) of its own. Many of the slides in that deck might work well as individual short videos.

Don’t Forget the Common Sense

What’s important to keep in mind is that while copy length does matter for your content marketing, there is no ideal length for all content marketing articles. There are many ideal lengths.

If you’re just starting out — or are clearing the decks and making a fresh start — and aren’t sure what lengths will work, it may be helpful to think about the conversations your sales, marketing, and customer service teams have with your prospects and clients. There will be an arc to those conversations that should guide the depth of your content for prospects at various places in the buying process. Your content length should match that arc.

When you’ve got it right, your data will let you know, and you would be wise to match your ongoing work to your data — while still experimenting to find the next great sweet spot for your content marketing.

Stop Expecting Data Scientists to Be Magical: Analytics Is a Team Sport

Many organizations put unreasonable expectations on data scientists. Their job descriptions and requirements are often at a super-human level. “They” say — and who are they? — that modern-day data scientists must be good at absolutely everything. Okay, then, what’s “everything,” in this case?


First, data scientists have to have a deep understanding of mathematics and statistics, covering regression models, machine learning, decision trees, clustering, forecasting, optimization, etc. Basically, if you don’t have a post-graduate degree in statistics, you will fail at “hello.” The really bad news is that even people with statistics degrees are not well-versed in every technique and subject matter. They all have their specialties, like medical doctors.

Then data scientists have to have advanced programming skills and deep knowledge of database technologies. They must be fluent in multiple computer languages in any setting, easily handling all types of structured and unstructured databases and files in any condition. This alone is a full-time job, requiring expert-level experience, as most databases are NOT in analytics-ready form. It is routinely quoted that most data scientists spend over 80% of their time fixing the data. I am certain that these folks didn’t get an advanced degree in statistics to do data plumbing and hygiene work all of the time. But that is how it is, as they won’t see what we call a “perfect” dataset outside of school.

Data scientists also have to have excellent communication and data visualization skills, being able to explain complex ideas in plain English. It is hard enough to derive useful insights out of mounds of data; now they have to construct interesting stories out of them, filled with exciting punchlines and actionable recommendations at the end. Because most mortals don’t understand technical texts and numbers very well — many don’t even try, and some openly say they don’t want to think — data scientists must develop eye-popping charts and graphs, as well, using the popular visualization tool du jour. (Whatever that tool is, they’d better learn it fast).

Finally, to construct the “right” data strategies and solutions for the business in question, the data scientist should have really deep domain and industry knowledge, at the level of a management and/or marketing consultant. On top of all of that, most job requirements also mention soft skills — as “they” don’t want some data geeks with nerdy attitudes. In other words, data scientists must come with kind and gentle bedside manners, while being passionate about the business and boring stuff like mathematics. Some even ask for child-like curiosity and the ability to learn things extremely fast. At the same time, they must carry authority like a professor, being able to influence non-believers and evangelize the mind-numbing subject of analytics. This last part about business acumen, by the way, is the single most important factor that divides excellent data scientists, who add value every time they touch data, from data plumbers who just move data around all day long. It is all about being able to give the right type of homework to themselves.

Now, let me ask you: Do you know anyone like this, having all of these skills and qualities in “one” body? If you do, how many of them do you personally know? I am asking this question in the sincerest manner (though I am quite sarcastic, by nature), as I keep hearing that we need tens of thousands of such data scientists, right now.

There are musicians who can write music and lyrics, determine the musical direction as a producer, arrange the music, play all necessary instruments, sing the song, record, mix and master it, publish it, and promote the product, all by themselves. It is not impossible to find such talents. But if you insist that only such geniuses can enter the field of music, there won’t be much music to listen to. The data business is the same way.

So, how do we divide the task up? I have been using this three-way division of labor — as created by my predecessors — for a long time, as it has been working very well in any circumstance:

  • A Statistical Analyst will have deep knowledge in statistical modeling and machine learning. They would be at the core of what we casually call analytics, which goes way beyond some rule-based decision-making. But these smart people need help.
  • A Master Data Manipulator will have excellent coding skills. These folks will provide analytics-ready datasets on silver platters for the analysts. They will essentially take care of all of the “before” and “after” steps around statistical modeling and other advanced analytics. It is important to remember that most projects go wrong in data preparation and post-analytics application stages.
  • A Business Analyst will need to have a deep understanding of business challenges and the industry landscape, as well as functional knowledge in modeling and database technologies. These are the folks who will prescribe solutions to business challenges, create tangible projects out of vague requests, evaluate data sources and data quality, develop model specifications, apply the results to businesses, and present all of this in the form of stories, reports, and data visualization.

Now, achieving master-level expertise in one of these areas is really difficult. People who are great in two of these three areas are indeed rare, and they will already have “chief” or “head” titles somewhere, or have their own analytics practices. If you insist on procuring only data scientists who are great at everything, good luck to you.

Too many organizations that are trying to jump onto this data bandwagon hire just one or two data scientists, dump all kinds of unorganized and unstructured data on them, and ask them to produce something of value, all on their own. Figuring out what type of data or analytics activity will bring monetary value to the organization isn’t a simple task. Many math geeks won’t be able to jump that first hurdle by themselves. Most business goals are not in the form of logical expressions, and the majority of data they will encounter in that analytics journey won’t be ready for analytics, either.

Then again, strategic consultants who develop a data and analytics roadmap may not be well-versed in actual modeling, machine learning implementation, or database constructs. But such strategists should operate on a different plane, by design. Evaluating them based on coding or math skills would be like judging an architect based on his handling of building materials. Should they be aware of values and limitations of data-related technologies and toolsets? Absolutely. But that is not the same as being hands-on, at a professional level, in every area.

Analytics has always been a team sport. It was like that when the datasets were smaller and the computers were much slower, and it is like that when databases are indeed huge and computing speed is lightning fast. What remains constant is that, in data play, someone must see through the business goals and data assets around them to find the best way to create business value. In executing such plans, they will inevitably encounter many technical challenges and, of course, they will need expert-level technicians to plow through data firsthand.

Like any creative work, such as music producing or movie-making, data and analytics work must start with a vision, tangible business goals, and project specifications. If these elements are misaligned, no amount of mathematical genius will save the day. Even the best rifles will be useless if the target is hung in the wrong place.

Technical aspects of the work matter only when all stakeholders share the idea of what the project is all about. Simple statements like “maximizing the customer value” need a translation by a person who knows both business and technology, as the value can be expressed in dollars, visits, transactions, dates, intervals, status, and any combination of these variables. These seemingly simple decisions must be made methodically and with a clear purpose, as a few wrong assumptions by the analyst at hand — who may have never met the end-user — can easily send the project in the wrong direction.
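To make that translation problem concrete, here is a minimal sketch, assuming a hypothetical transactions.csv file, of how the same phrase “customer value” can become two very different modeling targets. Every file and column name here is illustrative, not taken from the article:

```python
# Illustrative sketch: "customer value" translated into two different targets.
import pandas as pd

# Assumed (hypothetical) transaction table: customer_id, order_date, revenue
tx = pd.read_csv("transactions.csv", parse_dates=["order_date"])

# Definition 1: value in dollars -- total revenue per customer over the last 12 months.
cutoff = tx["order_date"].max() - pd.DateOffset(months=12)
value_in_dollars = tx[tx["order_date"] >= cutoff].groupby("customer_id")["revenue"].sum()

# Definition 2: value as engagement -- number of transactions per customer, any amount.
value_in_visits = tx.groupby("customer_id")["order_date"].count()

# A model built to predict one of these will not automatically optimize the other;
# choosing between them is exactly the translation work described above.
print(value_in_dollars.head())
print(value_in_visits.head())
```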

Yes, there are people who can absolutely see through everything and singlehandedly take care of it all. But if your business plan requires such superheroes and nothing but such people, you must first examine your team development roadmap, org chart, and job descriptions. Pushing those poor, unfortunate recruiters who must find unicorns within your budget won’t get you anywhere; that is not how you’re supposed to play this data game in the first place.

The Data-Inspired Big Idea: Why That Matters in the Ad Business

We are amid an age where consumers are royalty — and it’s the brands that serve them. Yes, data science is required to uncover insights and inform the creative strategy, for both prospecting and retention. But that big idea still lies in the creative execution.

I just got schooled this past week at the Association of National Advertisers Masters of Marketing Conference in Orlando, along with 3,000-plus industry colleagues.

You see, I’m a data- and direct-marketing junkie. Advertising is worthless if it’s not accountable and measurable (check and check). As I was reminded repeatedly this week, it also must be memorable (not always checked).

What does this mean? That in today’s always-on but distracted consumer marketplace, the ad message must tell a story. It needs compelling creative, a message that resonates, and a big idea that’s transparent and authentic and unique to a brand.

We are amid an age where consumers are royalty and it’s the brands that serve them. Yes, in the customer experience mix, data plays a pivotal role. Yes, data science is required to uncover insights and inform the creative strategy, for both prospecting and retention. But that big idea still lies in the creative execution; that’s the clincher. If it doesn’t hook, then it’s not going to stick.

Brand-Building Requires Purpose and Perspective

Consider some of these executions showcased at the conference, and look for how the brand creates an emotional connection:

Disney | The Little Duck

Target | Design for All

Chipotle | Bee For Real

Ally | Banksgiving

Dunkin | Fuel Your Destiny

https://youtu.be/31A1EsTZlHA

The Data Play in ‘Brand Crave’

Then ask yourself, what role does data play in these brand stories?

At the conference, there were plenty of CMOs discussing first-party data, customer journey mapping, personas, net promoter scores, operational data, transactional data, and sentiment scoring, among other metrics and inputs. Even second- and third-party data were mentioned (albeit briefly) in the context of how to expand reach, discover new customers, and deepen understanding of existing customers. These data points also inform the creative brief, as well as shape the media strategy.

Researchers report that consumers still base many of their buying decisions on impulse and emotion. According to Kirk Perry, president of global client and agency solutions at Google, as much as 70% of advertising success depends on creative; and Kai Wright, lecturer at Columbia University, reported on how emotion weighs into consumer consideration and purchase behavior (see Image 1).

Image 1: Emotion & Experiential Data Motivate Consumer Behavior, Perhaps More Than Audience Data
Credit: Kai Wright, Columbia University, ANA Masters of Marketing Conference, 2019.

SAP CMO Alicia Tillman reports that humans experience (and act upon) 27 emotions (Image 2). “Any one can make or break a brand or category.”

Image 2: Lots of Sentiment Scoring
Credit: Alicia Tillman, SAP, at ANA Masters of Marketing Conference, 2019

“Nobody can differentiate on data! It’s data-inspired storytelling that is going to win the future,” said Rishad Tobaccowala, chief growth officer at Publicis Groupe.

We are great at curating audience data. For a next-generation data ecosystem, what are we doing to help create more effective marketing by finding innovative ways to score emotion at scale? What are we doing to include these consumer motivators in our business rules and algorithms, and to help enhance creative prowess in authentic ways? Solve for these opportunities, and there are many brand leaders and CMOs likely ready to talk to you.

It’s time to help brands tell their data-inspired stories.

 

Purging (and Blocking) Bot Traffic From Email Reporting Metrics

How many “fake” email metrics are out there — spurious traffic measured in opens, clickthroughs, and other engagement metrics? How many of these email reporting metrics may be built into service-level guarantees offered by some email service providers (ESPs)? And what should we do about it?


For those of us who pay attention to such metrics (thank you for reading this far), perhaps we need to do more data investigation, working closely with our ESPs to make sure there’s nothing “fake” in our marketing performance reporting.

This was essentially the point of Stirista Global CEO Ajay Gupta in a blog post he shared after a competitor’s operations were reportedly shut down by its new parent company this summer. I’m using this post to share some of his observations, which may be helpful as we look to our email campaigns and read the engagement data in order to ascertain accuracy. [Disclaimer: Stirista is a continuing client. My interest in amplifying this content is intended to serve email marketers, at large.]

A Cautionary Tale: Take 5 Media Group Shutdown

Gupta gave permission to share his Aug. 9 post:


Stirista Global CEO Ajay Gupta has something to say about email reporting fraud.

“Tongues have been wagging in the marketing world ever since the New York Times’ shocking exposé in early 2018 about how easy it is to buy social followers. And, how most of the followers you buy turn out to be ‘bots’ or fake accounts, and not real people.

“I was not surprised, because I work in digital media and knew about this practice. So, I cried, screamed, and wrote about an even bigger epidemic in the world of email. My articles were received with polite applause and not much more in terms of action.

“But then last week happened. One of our competitors, Take 5 Media Group, shut down operations with a ‘ceased operations’ message on its website. While details are still murky, one of our partners shared an email from them that mentioned the parent company had completely shut down the business after discovering inconsistencies in how open and clickthrough rates were inaccurately reported to its clients.

“The parent company did the right thing: after discovering these inconsistencies, it took immediate action to first take responsibility, and subsequently to offer its clients reimbursement for payment of services already rendered. Kudos to them for standing up for the right thing, but there are still at least a half dozen companies masquerading as legitimate entities that continue the practice.

“This incident is but a sobering reminder that bots remain a big problem in email marketing today. Sadly, when you order up a prospecting campaign from an email service provider, chances are that a large part of the campaign is being sent to fake bot accounts. And nobody seems to care.

“We have, as an industry, created a fake floor of 10% open on acquisition emails. When marketing managers of Fortune 1000 companies ask Stirista to guarantee 10% open just because some guy from Florida said so, we know we have a problem.

“Now, it should be clear to any marketer worth his or her salt that if the bulk of the clicks come through bots, conversion rates will be dismal. So, I can only assume that the marketers ordering up these campaigns aren’t keeping their eyes on conversions. They must judge them on clicks and opens. Or, maybe they don’t care. We are here today because many large data companies that outsource email campaigns have subsidized fraud.

“Let Take 5 serve [as] a cautionary tale, but realize that this is not an isolated incident. The pressure to deliver fake opens, fake clicks, and fake form fills transcends one company and one incident. Collectively, this industry has turned a blind eye to fraud, just because ‘so and so’ is a nice guy and a vegetarian who loves animals.

“These fraudulent providers often work quietly, behind the scenes, for a reputable agency or data provider. Many times, marketers are shielded from the dirty dealings underneath the hood. But all parties involved — the providers, their partners, and the marketers themselves — should be ashamed of themselves. And, the FCC should be on their case. Until then, we must all be responsible for fighting back against bot fraud.

“I urge all marketers to shun this practice. It’s wasting your company’s money. And it’s given honest, transparent providers like me a bad name. Open rates are a terrible metric to track, as you can’t track them that well.

“So, if you hear a guarantee that sounds too good to be true, very likely it is. Walk, make that RUN, the other way, FAST.”

Back to Chet. I remember the first time I saw a data provider advertise a way to “buy” 5,000 followers on this-or-that social platform for some CPM, some 10 to 12 years ago, and I thought then, “Here we go again with the shysters living on and off the fringes of direct marketing.” Wherever data, and the compensation that flows from it, is in play, we must guard ourselves against the “fake” and the “fraud.” Better to measure conversions, sales, and metrics that are real.

When You Fail, Don’t Blame Data Scientists First — or Models

The first step in analytics should be “formulating a question,” not data-crunching. I can even argue that formulating the question is so difficult and critical that it is the deciding factor dividing analysts into seasoned data scientists and junior number-crunchers.

Last month, I talked about ways marketing automation projects go south (refer to “Why Many Marketing Automation Projects Go South”). This time, let’s be more specific about modeling, which is an essential element in converting mounds of data into actionable solutions to challenges.

Without modeling, all automation efforts would remain at the level of rudimentary rules. And that is one of the fastest routes to automate wrong processes, leading to disappointing results in the name of marketing automation.

Nonetheless, when statistically sound models are employed, users tend to blame the models first when the results are less than satisfactory. As a consultant, I often get called in when clients suspect the model performance. More often than not, however, I find that the model in question was the only thing that was done correctly in a long series of processes, from data manipulation and target setting to model scoring and deployment. I guess it is just easier to blame some black box, but most errors happen before and after modeling.

A model is nothing but an algorithmic expression measuring likelihood of an object resembling — or not resembling — the target. As in, “I don’t know for sure, but that household is very likely to purchase high-end home electronics products,” only based on the information that we get to have. Or on a larger scale, “How many top-line TV sets over 65 inches will we sell during the Christmas shopping season this year?” Again, only based on past sales history, current marcom spending, some campaign results, and a few other factors — like seasonality and virality rate.
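As a small illustration of that likelihood idea (one common form, a logistic regression, rather than anything prescribed by this article), here is a hedged sketch with hypothetical feature and file names:

```python
# Minimal sketch of a "likelihood" model: logistic regression on hypothetical
# household-level data. Feature and file names are placeholders only.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

data = pd.read_csv("households.csv")  # assumed columns used below
X = data[["income_band", "past_electronics_buyer", "home_owner"]]
y = data["bought_highend_tv"]  # 1 if the household bought, 0 otherwise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The model answers one specific question -- "How likely is this household to buy,
# given only these inputs?" -- and nothing else. Ask a different question,
# and you need a different target and probably different data.
likelihood_scores = model.predict_proba(X_test)[:, 1]
print(likelihood_scores[:5])
```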

These are made-up examples, of course, but I tried to make them as specific and realistic as possible here. Because when people think that a model went wrong, often it is because a wrong question was asked in the first place. Those “dumb” algorithms, unfortunately, only provide answers to specific questions. If a wrong question is presented? The result would seem off, too.

That is why the first step in analytics should be “formulating a question,” not data-crunching. Jumping into a data lake — or any other form of data depository, for that matter — without a clear definition of goals and specific targets is often a shortcut to the demise of the initiative itself. Imagine a case where one starts building a house without a blueprint. Just as a house is not a random pile of building materials, a model is not an arbitrary combination of raw data.

I can even argue that formulating the question is so difficult and critical that it is the deciding factor dividing analysts into seasoned data scientists and junior number-crunchers. Defining proper problem statements is challenging, because:

  • business goals are often far from perfectly constructed logical statements, and
  • available data are most likely incomplete or inadequate for advanced analytics.

Basically, good data players must be able to translate all those wishful marketing goals into mathematical expressions, using only the data handed to them. Such a skill goes far beyond knowledge of regression models or machine learning.

That is why we must follow these specific steps for data-based solutioning:

Image: Data scientists use this roadmap (Credit: Stephen H. Yu)
  1. Formulating Questions: Again, this is the most critical step of all. What are the immediate issues and pain points? For what type of marketing functions, and in what context? How will the solution be applied, and how will it be used, by whom, and through what channel? What are the specific domains where the solution is needed? I will share more details on how to ask these questions later in this series, but having a specific set of goals must be the first step. Without proper goal-setting, one can’t even define success criteria against which the results would be measured.
  2. Data Discovery: It is useless to dream up a solution with data that are not even available. So, what is available, and what kind of shape are they in? Check the inventory of transaction history; third-party data, such as demographic and geo-demographic data; campaign history and response data (often not in one place); user interaction data; survey data; marcom spending and budget; product information, etc. Now, dig through everything, but don’t waste time trying to salvage everything, either. Depending on the goal, some data may not even be necessary. Too many projects get stuck right here, not moving forward an inch. The goal isn’t having a perfect data depository — CDP, Data Lake, or whatever — but providing answers to questions posed in Step 1.
  3. Data Transformation: You will find that most data sources are NOT “analytics-ready,” no matter how clean and organized they may seem (they are often NOT well-organized, either). Disparate data sources must be merged and consolidated, inconsistent data must be standardized and categorized, different levels of information must be summarized onto the level of prediction (e.g., product, email, individual, or household levels), and intelligent predictors must be methodically created. Otherwise, the modelers would spend the majority of their time fixing and massaging the data. I often call this step creating an “Analytics Sandbox,” where all “necessary” data are in pristine condition, ready for any type of advanced analytics. (A compressed sketch of Steps 3 through 5 follows this list.)
  4. Analytics/Model Development: This is where algorithms are created, considering all available data. This is the highlight of this analytics journey, and key to proper marketing automation. Ironically, this is the easiest part to automate, in comparison to previous steps and post-analytics steps. But only if the right questions — and right targets — are clearly defined, and data are ready for this critical step. This is why one shouldn’t just blame the models or modelers when the results aren’t good enough. There is no magic algorithm that can save ill-defined goals and unusable messy data.
  5. Knowledge Share: The models may be built, but the game isn’t over yet. It is one thing to develop algorithms with a few hundred thousand sample records, and it’s quite another to apply them to millions of live data records. There are many things that can go wrong here. Even slight differences in data values, categorization rules, or the missing data ratio will render well-developed models ineffective. There are good reasons why many vendors charge high prices for model scoring. Once the scoring is done and proven correct, the resultant model scores must be shared with all relevant systems, through which decisions are made and campaigns are deployed.
  6. Application of Insights: Just because model scores are available, it doesn’t mean that decision-makers and campaign managers will use them. They may not even know that such things are available to them; or, even if they do, they may not know how to use them. For instance, let’s say that there is a score for “likely to respond to emails with no discount offer” (to weed out habitual bargain-seekers) for millions of individuals. What do those scores mean? The lower the better, or the higher the better? If 10 is the best score, is seven good enough? What if we need to mail to the whole universe? Can we differentiate offers, depending on other model scores — such as, “likely to respond to free-shipping offers”? Do we even have enough creative materials to do something like that? Without proper applications, no amount of mathematical work will seem useful. This is why someone in charge of data and analytics must serve as an “evangelist of analytics,” continually educating and convincing the end-users.
  7. Impact Analysis: Now, one must ask the ultimate question, “Did it work?” And, “If it did, what elements worked (and didn’t work)?” Like all scientific approaches, marketing analytics and applications are about small successes and improvements, with continual hypothesizing and learning from past trials and mistakes. I’m sure you remember the age-old term “closed-loop” marketing. All data and analytics solutions must be seen as continuous efforts, not some one-off thing that you try once or twice and forget about. No solution will just double your revenue overnight; that is more wishful thinking than a data-based solution.
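Here is the compressed sketch of Steps 3 through 5 promised above: transformation, modeling, and scoring, assuming a hypothetical transactions file and a previously defined target flag. It only shows how the steps hand off to one another; it is not a production pipeline.

```python
# Compressed, hypothetical sketch of Steps 3-5: transform, model, score.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Step 3: Data Transformation -- summarize transaction-level data up to the
# household level (the "Analytics Sandbox"), matching the level of prediction.
tx = pd.read_csv("transactions.csv", parse_dates=["order_date"])
sandbox = tx.groupby("household_id").agg(
    total_spend=("revenue", "sum"),
    order_count=("order_date", "count"),
    last_order=("order_date", "max"),
)
sandbox["days_since_last"] = (tx["order_date"].max() - sandbox["last_order"]).dt.days

# Step 4: Analytics/Model Development -- fit against a previously defined target
# flag (e.g., "responded to last year's campaign"). The flag file is hypothetical.
targets = pd.read_csv("targets.csv", index_col="household_id")  # column: responded
train = sandbox.join(targets, how="inner")
features = ["total_spend", "order_count", "days_since_last"]
model = LogisticRegression(max_iter=1000).fit(train[features], train["responded"])

# Step 5: Knowledge Share -- score the full universe using the exact same feature
# definitions as training, then hand the scores to the systems that deploy campaigns.
sandbox["response_score"] = model.predict_proba(sandbox[features])[:, 1]
sandbox[["response_score"]].to_csv("scored_households.csv")
```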

As you can see, there are many “before” and “after” steps around modeling and algorithmic solutioning. This is why one should not just blame the data scientist when things don’t work out as expected, and why even casual users must be aware of basic ins and outs of analytics. Users must understand that they should not employ models or solutions outside of their original design specifications, either. There simply is no way to provide answers to illogical questions, now or in the future.

4 Tips Aimed at Defending Digital Marketing’s Value

For B2B marketing, it isn’t always as easy to quantify success as we would like, even with the near-infinite measurability of digital marketing. Here are ideas for defending your digital marketing’s value.

“Half the money I spend on advertising is wasted; the trouble is, I don’t know which half.”

John Wanamaker’s famous quip may be less true today than it was when he said it — we have so many ways to track and assess advertising and marketing performance. And yet, those same tools — largely digital tools — have also created unrealistic expectations for many marketers. This is especially true for B2B marketers, for whom sales aren’t consummated after a website click.

So we’re left in a state where the data available to us (and boy, there’s a lot of data!) doesn’t tell the whole story. This can often put marketers at a disadvantage when talking to the C-suite crowd.

Their interest is in profit and loss. Clicks, likes, and follows aren’t a currency they care about.

The question is, what can you do as a marketer to demonstrate the value your team’s work delivers?

Tie Digital Marketing to Business Outcomes

Begin by admitting that you can’t rely on process metrics alone – the clicks, likes, and follows I mentioned above. You must tie your work to business metrics. Ideally, that’s profit, but you can also demonstrate a positive return if your work impacts other key performance indicators, like revenue, cost savings, lead quality, or lead volume.

Admit to Marketing’s Uncertainties

Get your peers and upper management to buy into the fact that nearly all B2B marketing includes some amount of uncertainty. As noted earlier, our sales are more complex and there’s rarely a “Buy” button for prospects to click after consuming a piece of your content or connecting with you via social media.

Make Metrics Work for You

For many of us, this is the holy grail. Unfortunately, it’s not always easy.

You may have to work backward by, for example, diving into your CRM data to examine the profiles of converted prospects.

  • How much content have they consumed?
  • Where have they interacted with you on social media?
  • Are they email subscribers?
  • Have they attended industry events at which your executives have presented?

This won’t necessarily paint a causal effect, but can help you make the case that your marketing work is making a difference.
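For those who want to try it, here is a minimal sketch of that backward-looking profile, assuming hypothetical CRM exports (crm_contacts.csv and crm_touchpoints.csv); every file and column name is a placeholder:

```python
# Hypothetical sketch: profile the marketing touchpoints of converted prospects.
import pandas as pd

contacts = pd.read_csv("crm_contacts.csv")    # assumed columns: contact_id, converted (0/1)
touches = pd.read_csv("crm_touchpoints.csv")  # assumed columns: contact_id, touch_type

# Count touchpoints of each type (content, social, email, event) per contact.
touch_counts = touches.pivot_table(
    index="contact_id", columns="touch_type", aggfunc="size", fill_value=0
).reset_index()

profile = contacts.merge(touch_counts, on="contact_id", how="left").fillna(0)

# Average touch counts by conversion status -- correlation rather than causation,
# but useful evidence when making the case for marketing's contribution.
print(profile.drop(columns="contact_id").groupby("converted").mean())
```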

Seek Ongoing Incremental Improvement

Though this again will require metrics data that can be hard to establish with confidence, it’s worth tracking your progress any way you can. For example, is the percentage of converted leads who began their relationship with your firm via the website increasing or decreasing, compared to other methods? If you don’t know, can you create the tools you need to gather this information?

Ideally, we’d all spend 100% of our resources on reaching and converting our ideal prospects. But don’t shy away from investing in the systems that will let you do so more consistently, and with more accountability.

How to Use Google Analytics to Improve Google Ads Performance

Google Analytics can be a treasure trove of information to help improve the performance of your Google Ads campaigns. However, trying to figure out all of the various metrics within Google Analytics can be a big stumbling block for advertisers.


The sheer volume of numbers and data available can quickly get overwhelming.

The Key to Finding Value in Google Ads Metrics

Both Google Analytics and Google Ads metrics and reports should be looked at in the context of your business. Are you using the platform effectively enough in ways that benefit your business? What is it you value most, when it comes to your company?

These are a couple of questions you might want to focus on as you comb through your Google Analytics metrics. Understanding what you want to accomplish with your ad campaign can help you narrow down metrics that matter to your bottom line.

  1. What audience demographics do you wish to attract?
  2. Are visitors able to find the thing they are looking for after clicking your ad?
  3. Is your landing page delivering the type of conversions you are after?
  4. From which channels would you like to direct most of your traffic?

Let’s look at how certain Google Analytics metrics and reports can help with Google Ads.

Give Visitors a Great Experience

Do you know what visitors hate the most about clicking on an ad? Not finding what they need. This can ultimately hurt your brand, if your Google Ads campaigns are frustrating prospective customers.

Sure, your Google Ads conversion rate can help give you this insight, but it doesn’t give you the full story. If your ads are not converting as well as you’d like, then you need to dive into Google Analytics to see what’s going on.

First, take a look at your landing page bounce rate. That’s the percentage of visitors who see your landing page and then leave without clicking through to a second page. A high bounce rate means your landing pages are not living up to the promises you’re making in your ads.
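If you ever want to sanity-check that number outside the Google Analytics interface, the calculation itself is simple: single-page sessions divided by total sessions per landing page. A minimal sketch, assuming a hypothetical session-level export, might look like this:

```python
# Hypothetical sketch: bounce rate per landing page from a session-level export.
import pandas as pd

sessions = pd.read_csv("sessions_export.csv")  # assumed columns: landing_page, pages_viewed

bounce_rate = (
    sessions.assign(bounced=sessions["pages_viewed"] == 1)
    .groupby("landing_page")["bounced"]
    .mean()  # share of single-page sessions = bounce rate
    .sort_values(ascending=False)
)
print(bounce_rate.round(3))
```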

Gain Insight Into Your Website Design

What good does it do to drive prospective customers to your website if they have a difficult time with the navigation?

If you are having difficulty getting your conversion rate up to where you would like, it could be an issue with website design. Part of the problem might be that your website design makes completing the path to a conversion overly tedious.

You can review this using the Google Analytics Users Flow report. The Users Flow report will show you how people are navigating through your website, starting with your landing page. You may see that prospective customers are getting distracted and clicking to pages that are not in your sales funnel. Use this information to redesign your landing pages and subsequent pages in the sales funnel to reduce drop off and increase the overall conversion rate of your Google Ads campaign.

Find Your Top Performing Audience Demographics & Interests

The Google Analytics Demographics and Interests reports can give you great insight into your top-performing audiences. Review these reports to see which audiences are performing best.

Then use the audience data to improve your Google Ads campaigns. Modify your demographic targeting, adjust bids, and even launch new campaigns to target the audiences you know perform best based on the Google Analytics data.
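One simple heuristic for turning that audience data into bid adjustments is to scale each segment’s bid by how its conversion rate compares to the account average. This is just one possible approach, sketched with a hypothetical export, not a formula Google prescribes:

```python
# Hypothetical sketch: derive bid adjustments from per-segment conversion rates.
import pandas as pd

# Assumed export: one row per demographic segment, with clicks and conversions.
segments = pd.read_csv("ga_demographics.csv")  # columns: segment, clicks, conversions

segments["conv_rate"] = segments["conversions"] / segments["clicks"]
overall_rate = segments["conversions"].sum() / segments["clicks"].sum()

# Simple heuristic: bid up segments that convert above average, bid down the rest,
# capped at +/-30% so one noisy segment can't swing the whole campaign.
segments["bid_adjustment_pct"] = (
    ((segments["conv_rate"] / overall_rate) - 1) * 100
).clip(-30, 30).round()

print(segments[["segment", "conv_rate", "bid_adjustment_pct"]])
```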

Summary

Being successful with Google Ads often requires using data that’s not available within the Google Ads reports. One of the best sources of performance-enhancing advertising data is Google Analytics.

Review your landing page bounce rates to see how well you’re matching your landing page message to your ad copy. Use your Users Flow reports to see if your prospective customers are getting distracted on your website. And use your demographics and interests reports to improve the targeting in your Google Ads campaigns.

Want more tips to improve your Google Ads performance? Click here to grab a copy of our “Ultimate Google Ads” checklist.

 

 

Dare to Scare: What If ‘They’ Closed the Internet?

But what if “they” — starting with policymakers in this country — took the extreme step of mimicking Europe, eschewing third-party data collection and use, destroying all of the free content such data transfers pay for, and effectively put today’s open Web behind pay walls and data walls?

The fragmentation of the Internet is marching along.

Europe went all “opt-in” — effectively halting a significant part of the Internet’s financing mechanism all in the name of privacy, without fairly considering the social and economic ramifications on competition, diversity, and democracy. (Or worse, they considered these aspects — and shut it down, anyway.)

China (and most despotic countries) bar access to much Western content. Will Hong Kong be next? Meanwhile, many of these “closed” countries are active players in using digital channels to stoke up social division and to meddle in free nations’ democratic processes.

And then there’s the rest of the global Internet — and the organic, disruptive, and innovative way it is built, maintained, and paid for: simply allowing data to flow to responsible uses, and enabling such exchanges to finance news, apps, games, email, social platforms, video, niche content, and so much other content and convenience that it would be impossible to list it all.

But what if “they” — starting with policymakers in this country — took the extreme step of mimicking Europe, eschewing third-party data collection and use, destroying all of the free content such data transfers pay for, and effectively put today’s open Web behind pay walls and data walls?

Sound very elitist? It is. Sound anti-progressive? It’s that, too. Anti-commercial? You bet. Anti-competitive? Very much so. Anti-consumer? Oh yes, it’s that, too. The deleterious effects may be already underway.

And if we’re not careful, it may just happen in the country that is most responsible for building the Global Information Economy as we know it. What a travesty it would be to throw such leadership away.

A recent study — just looking at the app world — gives a glimpse of what’s at stake. Looking at just nine top-used mobile apps, consumers state they would value access to such content at approximately $173 billion per year — content that is free to them today, thanks to ad financing. Wow! Further, current ad revenue for these apps is a tiny fraction of these assigned values. So, net, there is a huge economic dividend to consumers (and the economy) because these funds stay in consumer pockets, or are spent elsewhere.

As we march forth on privacy-first, we must consider what could happen if such responsible data uses were shut down by short-sighted public policy. What if the result were a “dumb” Internet? There’s still time for U.S. leadership, pragmatism, and a sensible way forward.

Why Everyone Benefits When Marketing and Privacy Are Aligned

Privacy is one of the most pressing issues facing organizations today. And it’s not just affecting the companies that are making headlines over it, like Facebook, Google, Capital One, and Experian.

Privacy is one of the most pressing issues facing organizations today. And it’s not just affecting the companies that are making headlines over it, like Facebook, Google, Capital One, and Experian. The recent passage of privacy laws in the United States and abroad, and the resulting potential fines for mistakes, have been a wake-up call for many. Marketing teams have always needed to consider privacy, but now it’s imperative, and the stakes are significantly higher (ahem… billions of dollars).

Noga Rosenthal, Chief Privacy Officer and General Counsel at NCC Media, believes that far too often, marketing and privacy may be unknowingly working against each other or in silos. However, it is essential for these two departments to be closely aligned.

She points out:

“Every company is a data company, whether or not they realize it. You have CRM data and employee data. You’re collecting data off your website. Nearly everybody will be impacted by legislation like CCPA [California Consumer Privacy Act] and needs to be paying attention.”

What’s at Stake

The stakes are high, and the risks are greater when there’s a disconnect between the marketing and communications teams and the privacy and legal teams. There are two common vulnerabilities:

1. Corporate marketing and advertising aren’t taking privacy into account when promoting products and services.

  • Corporate advertising is creepy to customers.
  • Your company is using new and trendy technology vendors that haven’t been properly vetted by privacy teams.
  • You’re using terminology in your marketing, like “tracking” and “anonymous,” that will draw scrutiny from lawmakers.

2. Marketing and communications teams aren’t involved in security and privacy breach preparedness and response.

  • Marketing and communications haven’t contributed to the company’s incident response plan.
  • Marketing is looped in too late during a breach and is not given the resources needed to respond to stakeholders and meet disclosure requirements.

Companies that falter can be subject to hefty fines. They could alienate their customers. And they’ll likely find themselves in the middle of a PR nightmare.

The Benefits of Collaboration

“Marketing should have a seat at the table in all things data governance. It’s mission-critical,” says Peg Kuman, Chief Privacy Officer of V12.

Bringing privacy and marketing together benefits everyone. If you’ve ever tried to read a privacy policy, you know that privacy and legal speak needs to be more accessible and consumer-friendly. Disclosures and policies written by privacy teams would surely benefit from a marketing and communications lens.

If marketers are more in tune with privacy, your company can protect its brand reputation and avoid the painful privacy missteps in advertising that we’ve seen with Netflix, Spotify, Tinder, and countless others. For companies that face an incident, collaboration can ensure a proper response, such as how Twitter recently owned up to its privacy mistakes, used consumer-friendly language, and succinctly apologized.

Also, with an overall heightened interest in privacy, companies can provide value to clients and their customers by proactively sharing relevant and easy-to-understand privacy updates. A client outreach strategy can only be effective by coupling the expertise and knowledge of the privacy team with the creativity, strategy, and reach of the marketing team.

According to Rosenthal:

“At times, it feels like marketing and privacy are at odds with each other. But as privacy becomes more important to consumers, and companies like Apple use it as a way to bring in customers and differentiate from competitors, there’s more of a need to lean on each other.”

Where to Begin

There are several ways to open the lines of communication and foster a stronger partnership between privacy and marketing teams.

Establish a Cross-Functional Team

Don’t wait for something bad to happen to get closely aligned. Proactively create a team consisting of privacy, legal, marketing, and communications focused on cross-functional initiatives. Meet regularly to discuss legislation, strategize, and surface ideas.

Use these meetings as a forum for education and awareness. Like Kuman and Rosenthal, most privacy leaders are involved in industry organizations and coalitions. Through their participation, they get vital information that can help their marketing teams.

Commit to Privacy Principles

Privacy principles should align with the company’s mission, vision, and purpose. A great place to start is thinking about what trust and transparency mean for your industry and organization.

Once you’ve determined what privacy means for your organization, make sure it’s clear in everything you and your employees say and do. Better yet, put some marketing power behind those principles, so they become synonymous with your brand.

Prioritize Policies, Protocol, and Incident Response

Your privacy and marketing teams will need to jointly decide where to focus efforts across your various stakeholders including employees, clients, consumers, prospects, partners/vendors, the media, lawmakers, and investors.

There should be a clear protocol for how marketing and privacy work together, and all parties should understand the role that they play in protecting corporate reputation and respecting consumers.

If your organization doesn’t have a breach response strategy, privacy and marketing should champion the development of one, in conjunction with other parts of the organization, such as technology, information security, and client services. Simulation exercises are valuable ways to identify vulnerabilities and prepare without the intense pressure of an actual crisis.

Raise Awareness Through Education

Privacy is likely not top-of-mind for the majority of your marketing staff, but awareness is critical. Education increases awareness. Curriculum specific to marketing helps the full marketing organization understand their role in supporting the company’s privacy principles. Training can also address when it’s necessary to engage your privacy resources.

Kuman prefers the term “socialization” over training.

She adds:

“Companies should socialize the notion that privacy is how we protect our customer, employee, and business assets.”

Privacy Is Everyone’s Job

Regardless of where privacy laws are headed next in the United States and abroad, we all play a role in privacy protection and we’ll be more successful if we’re working closely together.

Why Many Marketing Automation Projects Go South

There are so many ways to mess up data or analytics projects, be they CDP, Data Lake, Digital Transformation, Marketing Automation, or whatever sounds cool these days. First off, none of these items are simple to develop, or something that you just buy off the shelf.

As a data and analytics consultant, I often get called in when things do not work out as planned or expected. I guess my professional existence is justified by someone else’s problems. If everyone follows the right path from the beginning and everything goes smoothly all of the time, I would not have much to clean up after.

In that sense, maybe my role model should be Mr. Wolf in the movie “Pulp Fiction.” Yeah, that guy who thinks fast and talks fast to help his clients get out of trouble pronto.

So, I get to see all kinds of data, digital, and analytical messes. The keyword in the title of this series “Big Data, Small Data, Clean Data, Messy Data” is definitely not “Big” (as you might have guessed already), but “Messy.” When I enter the scene, I often see lots of bullet holes created by blame games and traces of departed participants of the projects. Then I wonder how things could have gone so badly.

There are so many ways to mess up data or analytics projects, be they CDP, Data Lake, Digital Transformation, Marketing Automation, or whatever sounds cool these days. First off, none of these items are simple to develop, or something that you just buy off the shelf. Even if you did, someone would have to tweak more than a few buttons to customize the toolset to meet your unique requirements.

What did I say about those merchants of buzzwords? I don’t remember the exact phrase, but I know I wouldn’t have used those words.

Like a veteran cop, I’ve developed some senses to help me figure out what went wrong. So, allow me to share some common traps that many marketing organizations fall into.

No Clear Goal or Blueprint

Surprisingly, a great many organizations get into complex data or analytics projects with only vague ideas or wish lists. Imagine constructing a building without any clear purpose or a blueprint. What is the building for? For whom, and for what purpose? Is it a residential building, an office building, or a commercial property?

Just like a building is not just a simple sum of raw materials, databases aren’t sums of random piles of data, either. But do you know how many times I get to sit in on a meeting where “putting every data source together in one place” is the goal in itself? I admit that would be better than data scattered all over the place, but the goal should be defined much more precisely. How they are going to be used, by whom, for what, through what channel, using what types of toolsets, etc. Otherwise, it just becomes a monster that no one wants to get near.

I’ve even seen so-called data-oriented companies going out of business thanks to monstrous data projects. Like any major development project, what you don’t put in is as important as what you put in. In other words, the summary of absolutely everyone’s wish list is no blueprint at all, but the first step toward the inevitable demise of the project. The technical person in charge must be business-oriented, and be able to say “no” to some requests, looking 10 steps down the line. Let’s just say that I’ve seen too many projects that got hopelessly stuck, thanks to features that would barely matter in practice (as in “You want what in real-time?!”). Might as well design a car that flies, too.

No Predetermined Success Metrics

Sometimes, the project goes well, but executives and colleagues still define it as a failure. For instance, a predictive model, no matter how well it is constructed mathematically, cannot single-handedly overcome bad marketing. Even with effective marketing messages, it cannot just keep doubling the performance level indefinitely. Huge jumps in KPI (e.g., doubling the response rate) may be possible for the very first model ever (as it would be, compared to the previous campaigns without any precision targeting), but no one can expect such improvement year after year.

Before a single byte of data is manipulated, project champions must determine the success criteria for the project, in terms of coverage, accuracy, speed of execution, engagement level, revenue improvement (by channel), etc. Yes, it would be hard to sell the idea with lots of disclaimers attached to the proposal, but maybe not starting the project at all would be better than being called a failure after spending lots of precious time and money.

Some goals may be in conflict with each other, too. For instance, response rate is often inversely related to the value of the transaction. So, if the blame game starts, how are you going to defend the predictive model that is designed primarily to drive the response rate, not necessarily the revenue per transaction? Set clear goals in numeric form and, more importantly, share the disclaimers upfront. Otherwise, “something” will look wrong to someone.
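A tiny worked example makes that conflict visible. The numbers below are invented purely for illustration: a model tuned for response rate can win on responses and still lose on revenue per contact.

```python
# Invented numbers, for illustration only: response rate vs. revenue per transaction.
campaigns = {
    # name: (response_rate, avg_revenue_per_response)
    "model_tuned_for_response": (0.040, 55.0),
    "model_tuned_for_value": (0.025, 120.0),
}

for name, (resp_rate, avg_rev) in campaigns.items():
    expected_revenue_per_contact = resp_rate * avg_rev
    print(f"{name}: expected revenue per contact = ${expected_revenue_per_contact:.2f}")

# Output: $2.20 vs. $3.00 per contact. Which model "failed" depends entirely on
# the success metric everyone agreed on before the campaign went out.
```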

But what if your scary boss wants to boost rate of acquisition, customer value, and loyalty all at the same time, no matter what? Maybe you should look for an exit.

Top-Down Culture

By nature, analytics-oriented companies are flatter and less hierarchical in structure. In such places, data and empirical evidence win the argument, not the organizational rank of the speaker. It gets worse when the highest-ranking officer has very little knowledge of data or analytics, in general. In a top-down culture, no one would question that C-level executive in a nice suit. Foremost, the executive wouldn’t question his own gut feelings, as those gut feelings put him in that position in the first place. How can he possibly be wrong?

The trouble is that the world is rapidly changing around any organization. And monitoring the right data from the right place is the best way to stay informed and take action preemptively. I haven’t encountered any gut feeling — including my own — that stood the test of time better than data-based decision-making.

Sometimes, though, a top-down culture is a good thing. If the organizational goals are clearly set, and if the top executive refrains from blame games and supports the big data project (no pun intended here), then countless inter-departmental conflicts will be mitigated upfront (as in, “Hey, everyone, we are doing this, alright?”).

Conflicts Among Teams — No Buy-in, No Use

But no amount of executive force can eliminate all infighting that easily. Some may say “Yeah, yeah, yeah” in front of the CEO or CMO, but sabotage the whole project behind the scenes. In fact, I’ve seen many IT departments get in the way of the noble idea of “Customer-360.”

Why? It could be the data ownership issue, security concerns, or lack of understanding of 1:1 marketing or advanced analytics. Maybe they just want the status quo, or see any external influence on data-related matters as a threat. In any case, imagine the situation where the very people who hold the key to the source data are NOT cooperating with data or analytics projects for the benefit of other departments. Or worse, maybe you have “seen” such cases, as they are so common.

Another troublesome example would be on the user side. Imagine a situation where sales or marketing personnel do not buy into any new way of doing things, such as using model scores to understand the target better. Maybe they got burned by bad models in the past. Or maybe they just don’t want to change things around, like those old-school talent scouts in the movie “Moneyball.” Regardless, no buy-in, no use. So much for that shiny marketing automation project that sucked up a seven-figure sum to develop and deploy.

Every employee puts their continued employment ahead of any project, dumb or smart. Do not underestimate people’s desire to keep their jobs with minimal changes.

Players Haven’t Seen Really Messy Situations Before

As you can see, data or analytics projects are not just about technologies or mathematics. Further, data themselves can be a hindrance. I’ve written many articles about “good” data, but they are indeed quite rare in real life. Data must be accurate, consistent, up-to-date, and applicable in most cases, without an excessive amount of missing values. And keeping them that way is a team sport, not something a lone tech genius can handle.

Unfortunately, most graduates with degrees in computer science or statistics don’t get to see a real bloody mess before they get thrown onto a battlefield. In school, problems are nicely defined by the professors, and the test data are always in pristine condition. But I don’t think I have seen such clean and error-free data since my school days, which were indeed a lifetime ago.

Dealing with organizational conflicts, vague instructions, and messy data is part of the job of any data professional. It requires quite a balancing act to provide “the least wrong answers” consistently to all constituents who have vastly different interests. If the balance is even slightly off, you may end up with a technically sound solution that no one adopts into their practices. Forget about full automation of anything in that situation.

Already Spent Money on Wrong Things

This one is a heart-breaker for me, personally. I get onto the scene, examine the case, and provide step-by-step solutions to get to the goal, only to find out that the client company spent money on the wrong things already and has no budget left to remedy the situation. We play with data to make money, but playing with data and technology costs money, too.

There are so many snake oil salespeople out there, over-promising left and right with lots of sweet-to-the-ears buzzwords. Yeah, if you buy this marketing automation toolset armed with state-of-the-art machine-learning features, you will get actionable insights out of any kind of data in any form through any channel. Sounds too good to be true?

Marketing automation is really about the “combination” of data, analytics, digital content, and display technologies (for targeted messaging). It is not just one thing, and there is no silver bullet. Even if some other companies may have found one, will it be applicable to your unique situation, as is? I highly doubt it.

The Last Word on How to Do Marketing Automation Right

There are so many reasons why marketing automation projects go south (though I don’t understand why going “south” is a bad thing). But one thing is for sure: Marketing automation — or any data-related project — is not something that one or two zealots in an organization can achieve single-handedly with some magic toolset. It requires organizational commitment to get it done, get it utilized, and get it improved over time. Without understanding what it should be about, you will end up automating the wrong things. And you definitely don’t want to get to the wrong answer any faster.