Editor’s Note: Who owns the data? For much of my career in survey research, it was an easy question to answer: the client owned the data, but not the names of the survey participants. That was confidential. Now, in the digital world, attaching all kinds of personal data to specific individuals for marketing purposes has become a staggeringly huge business. We all know about the blowback over this that has taken place over the past couple of years, and about efforts like GDPR to afford some protection. In this provocative piece, Sally Nicholls discusses a recent data scandal and implications for how we should be thinking about the answer to the question I opened with.
Since May 2018, GDPR has been well and truly embedded in data protection conversations and has become the ultimate safety net. People have the right to know what a company knows about them and why they hold it, which should really put the consumer in control of the information that they share.
But we still don’t read terms and conditions, and we probably never will. While we think we’re accepting website cookies on our terms, it’s become clear that our data still isn’t truly in our control.
Why do we Still Have Data Scandals?
The Facebook–Cambridge Analytica data scandal of 2018 was a significant news story. More recently, the period-tracking apps Maya and MIA Fem have sparked further controversy, having been exposed for “extensive sharing of sensitive personal data with third parties, including Facebook”.
For those unfamiliar with this scandal, Privacy International found that Maya and MIA Fem were sharing information about users with Facebook before those users had even agreed to a privacy policy, regardless of whether or not the user had a Facebook account. Sharing data this way allows developers to understand what the users of their app enjoy and to improve their apps over time in accordance with this insight. Developers may also use Facebook services to monetize their apps, and, supposedly subject to the Facebook user’s prior consent, Facebook may use this data to provide that user with more personalized ads.
Worryingly, these personalized ads are based on information that a user inputs into the app, e.g., information about a woman’s menstrual cycle, how she feels at different times of the month, the contraception she’s using and even when she last had protected or unprotected sex.
It feels wrong that we’re still not sure what data we’re handing over to companies, and that prominent organizations such as Facebook are using our data in immoral ways without our say-so.
We researchers have an ethical obligation to conduct research faithfully and objectively, which means we have to adhere to strict regulations such as GDPR and to the codes of conduct set out by organizations such as the Market Research Society (UK), Insights Association (US), and ESOMAR (Global). Participants provide their most personal data to market research companies in good faith, including information they wouldn’t even share with their friends (financial information, medical history, personal behaviors, etc.), and in return, we need to ensure that this data is handled with the utmost care and confidentiality. It’s a shame that this ethical way of operating hasn’t been incorporated into the core of some businesses.
Currently, Data is Viewed as a Hygiene Factor
Data is a necessity for businesses. It tells them where the customer is and provides essential moments of truth that are important for business growth. Having data is now the norm rather than a differentiator; it’s what you do with that data that sets your business apart. This was a hot topic at the ‘When Data Meets Creativity’ panel held by Dentsu Aegis this year. Their take is that thinking about data acquisition differently and using creativity to unlock ethically sourced data is the way forward for the industry. To quote a couple of panelists:
“Data is not owned by a company, people should own their data and can select which data they will give to a brand… People are wary about what technology companies are doing with their stored data. To use that data, you’ve got to do it in a very transparent way… that customers feel that you’re not stalking them or selling their information. It’s how you use it ethically that’s going to become interesting.”
Whilst this is a valid view, we can safely say that not all data is equal in value; the Maya/MIA Fem scandal has proven this. So perhaps treating data as a hygiene factor isn’t the best way of thinking about it. A number of issues follow from the perception this treatment perpetuates: data becomes a necessary but intangible concern, something everyone has to manage even though not everyone knows how to manage it well. This leads to a lot of misunderstandings that could otherwise be avoided with a change in perception.
Data Should be Treated like a Currency
Treating data as a currency rather than a hygiene factor will instigate a necessary shift in consumers’ perception of data, and will result in them truly seeing the worth that their data has to businesses like those behind Maya and MIA Fem. The notion of data as a type of currency is an interesting one. Trading personal information in return for more tailored experiences is almost becoming the model of success for companies that use data well, such as Amazon and Netflix, and even for third-party providers assisting big brands like Under Armour.
The idea presented in one Forbes article is that successfully using reams and reams of data (such as that produced by AI) will help to cultivate the “story of one”: the unique path any one consumer may follow, even as millions of consumers shop in the same place (think eBay). If this notion is to be believed, then the ethical responsibility is most definitely on businesses in this situation. Whilst personalization is a double-edged sword for consumers (if you turned off personalized advertising, you would certainly miss it), they trust that brands are using this information for good.
Essentially, data can be treated like money, in that it can be, and already is, exchanged for goods and services. We trust institutions such as banks with our money, but otherwise, as consumers, we’re very protective of it. Taking the same attitude towards data will make consumers extra careful about who they give their data to, and the change in perception will force businesses such as those behind Maya and MIA Fem to wake up and treat the data as carefully as GDPR already requires them to.
Changing our Perception of Data
If the Privacy International report is to be believed, the Maya/MIA Fem story is just about the biggest data intrusion women could be subjected to. But gender aside, data privacy should be an integral part of every business, and businesses must learn to treat data carefully and respectfully.
It’s 2020 – let’s learn from those who have made these monumental errors and be the champions of ethical data handling, an example for others to follow.