A few days ago a fellow Android user recommended an app that let me see what risk each of my smartphone’s apps represents—what information it collects; what it shares; and what degree of security it exhibits in collecting, storing and transmitting that data. After installing it, I was flabbergasted to learn how reckless these companies are, and I promptly performed sweeping deletes.
On the heels of this disconcerting revelation, I started to think through our recent vetting of marketing-automation applications. Each vendor was quick to illustrate how its application—for instance—takes a Google Analytics embed, augments those data points with individual user information, and boasts additional features that build ever more complete profiles of visitors and subscribers. As a person hell-bent on keeping my private life private, I have to ask: have we marketers gone too far? Are we too nosy?
I think perhaps we have and we are.
With today’s technology, we are able to collect more information about individual users than we could possibly process, and certainly more than we could find useful—but that in itself isn’t irresponsible; the irresponsibility comes when we aren’t careful with this data, or don’t protect it to the extent we should. Yes, the big data bandwagon is here and many marketers are eager to jump on, but we also have an ethical responsibility to our constituents.
I see a great failure in our wastefulness: we collect data we don’t need and, in doing so, run the risk of eroding the trust of our constituents. When we send personalized campaigns referencing information our constituents have not knowingly provided, they become aware of our activities—and in a way that may not be welcome.
I received an email after visiting a website, and it started with, “Not to be creepy, but we saw that you visited our website …” Not to be creepy? The only possible reason for starting an email in such a manner is that the sender is acutely aware its clients probably do not know they can be personally tracked on a website. Most people believe their browsing activity is private and that, by clearing their browser history, no one is the wiser about where they’ve been. We, of course, know differently.
Just because we can collect data doesn’t mean we should. We should be careful, useful, respectful and protective in our collection and use of both implicit and explicit data.
When I tell people that I am a behavioral marketer—I create campaigns designed to reveal recipients’ behavior and interests—and describe how the process works, they are always shocked to learn how much information can be collected through drip and nurture marketing and website visits. They immediately draw parallels between the NSA and Google, and Spider Trainers or the clients for whom we work. They also swear off ever clicking another link—as if it were that simple.
Is full disclosure in our future? In the same way that website owners must adhere to what is commonly known as the cookie law, will we marketers be required to disclose what we are collecting about our subscribers and how it is used, and to allow them to opt out without unsubscribing? If we continue on our path of increased surveillance, wasteful collection and irresponsible use, yes, I believe we will.
Clueful is the Android app; a similar capability is included in iOS on iPhones.
The cookie law started as an E.U. directive, adopted by all E.U. countries in 2011, designed to protect online privacy by making consumers aware of how information about them is collected and used online and giving them a choice to allow it or not.