Did you follow the #PlaneBae saga as it jumped from social media channels to national news item? If so, you saw it turn from a lighthearted love story into a much darker reminder of the unexpected ways our digital footprints affect our (and others’) lives.
Unwanted social media exposure is just the tip of the iceberg when it comes to digital privacy concerns. With health insurers peeking at our online activities and AI algorithms analyzing online behaviors to assign everything from article preferences to loan or job suitability to prison sentences, protecting personal data is something everyone should care about.
This concern should also extend to the work we do as content marketers. Changing privacy laws in Europe and, now, the U.S. have given us compliance-driven motivation to be more vigilant and transparent in our handling of consumer data. (See the privacy law recently passed in California.) But it’s the trusted relationships we hope to build with our audiences that give us even more compelling reasons to care about those who willingly share their personal lives with our brands.
Here’s a look at some recent articles that have the CMI team (and, likely, other content marketers) pondering big questions about what data we can collect and what practices and processes we should follow to create trustworthy content experiences for our audiences.
Should digital privacy encompass more than just data?
Read: The Woman in the #PlaneBae Saga Breaks Her Silence
Social media fans may recall the recent viral story in which an airline passenger claimed a simple act of switching seats led to a love connection between strangers. All indications are that @roseybeeme (who reportedly gained 60,000 followers as a result of her posts) tweeted her ongoing observations of the new seatmates with the best of intentions. (The thread has since been deleted.) But the woman being watched, whom social media dubbed #PrettyPlaneGirl and who didn’t know about the tweets until after the fact, didn’t see the posts as a positive. She released a statement that referred to the incident as “a digital-age cautionary tale about privacy, identity, ethics, and consent.”
Consider: That seemingly innocuous public post you uploaded could be someone else’s private nightmare. The #PlaneBae saga played out without a brand’s intervention. But it’s not hard to imagine how easy it would be for an employee to post something similar on your brand’s behalf without considering the possible consequences. Before you (or your extended team) share someone else’s story in your brand’s content, exercise simple common sense – and common courtesy. Make sure your brand follows sound ethical practices by securing the appropriate permission. And don’t forget to verify that your take on the story is accurate.
You don’t want a brand employee to tweet a saga like #PlaneBae, says @joderama. Click To Tweet
What constitutes a valuable data exchange?
Read: Health Insurers Are Vacuuming Up Details About You
Your brand may be upfront about its intentions for using the audience data it collects through content, but that doesn’t mean situations can’t or won’t change. It also doesn’t guarantee that third-party partners or others with the power to mine that data will be equally transparent, responsible, or altruistic.
The possible uses of lifestyle data outlined in the article by ProPublica author Marshall Allen should raise some eyebrows. These examples show how health insurance companies, which partner with data brokers, can use tools to create predictive models based in part on an audience’s interactions with content (click on an ad for women’s plus-size clothing, for example, and you could be considered at risk for depression). And those lifestyle data models could be used, the piece suggests, to base the cost of someone’s health insurance policy on potentially incorrect or incomplete data collected without their knowledge.
Consider: Some things will always remain outside your business’s control, and the evolution of artificial intelligence and machine learning technology is chief among them. But it’s a good time to start asking: At what point does the digital industry’s increasing reliance on aggregated consumer data cross the line between delivering personalized value and exerting undue influence over consumers’ personal lives? And, following on that, are you adequately equipped to walk that line responsibly, interpret the data accurately, and assure your audiences you won’t exploit the information they entrust to you?
When does aggregated consumer data cross the line between personalized value and undue influence? asks @joderama. Click To Tweet
It’s a question Robert Rose touches on in his recent discussion of whether GDPR is a gift in disguise for marketers. In his post, he draws a critical distinction between data scraped from unwitting consumers to generate, nurture, and convert leads and the more valuable “emotional” data voluntarily supplied by customers and content consumers. But regardless of where your data comes from – or your ultimate intentions for it – his assertion that marketers need to become a “trusted source of interesting things” and his advice on instituting the role of “audience data shepherd” on your content teams still ring true.
Ethical ideas vs. ethical practices
Read: Doing Good Data Science
Speaking of ethical considerations, this thought-provoking discussion (the first in an ongoing series on the subject) reminds us that marketers aren’t the only players with a stake in the consumer data debate. Data scientists are tasked not only with developing the algorithms marketers use to gather and analyze customer insights, but also with determining the most effective and ethical ways to apply those algorithms and evaluate their impact on performance.
Though focused on data science, the article explores data usage from several angles directly relevant to content strategy, including:
- How to approach user experience design so that the need to preserve consumer privacy doesn’t render AI applications useless
- How the concept of “informed consent” can take on a different meaning at different points in a user’s journey
- What kinds of organizational changes it might take to balance our industry’s drive to “move fast” and “break things” with the ability to minimize the impact on those things we are likely to break
Consider: As the article’s authors wisely suggest, to put ethical ideas into practice, you need to empower your teams to figure out what those ideas really mean – for your brand, your audiences, and the content you create to serve them both. Critical components include cultivating an organizational culture that supports experimentation and implementing strategic processes built for agility and iteration. By systematically creating boundaries around your content efforts and committing to continual (if incremental) improvement, you make it easier for your teams to adapt when confronted with big changes like algorithm shifts, legislative trends, or emerging technologies.
Empower your team to figure out what your brand ethics mean for your audience and content, says @joderama. Click To Tweet
The content conclusion
Let’s face facts: Data breaches will likely continue; people will make ill-informed decisions about what to share online; the collective definition of “private information” may evolve; and corporate leadership or those in political power can’t be counted on to prioritize the interests of the average digital denizen over the potential to turn a profit.
Though content marketers may not have much control over many of the changes looming on the digital data horizon, you can do your part to engage with your audiences in thoughtful, responsible, and conscientious ways; own up when mistakes are made; and build trust consistently by keeping their needs top of mind.
Talk about the latest headlines affecting content marketers Sept. 4-7 in Cleveland, Ohio. Register today for Content Marketing World and use the code BLOG100 to save $100.
Cover image by Joseph Kalinowski/Content Marketing Institute