Facebook announced this week that it’s suspending advertisers’ ability to exclude racial groups from ad targeting while it investigates the issue. Again.
For the second time in the last year or so, an investigation has shown that Facebook’s intricate ad targeting options can be used to skirt federal law, specifically with regard to housing. Last October, a ProPublica report showed that it was possible to use Facebook’s ‘ethnic affinities’ demographic segmentation to eliminate specific racial groups from your ad reach, in violation of the federal Fair Housing Act.
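To make the mechanism concrete, here’s a rough sketch of the idea in Python. Everything in it is hypothetical – the field names, segment labels and matching logic are invented for illustration, and bear no relation to Facebook’s actual Marketing API:

```python
# Purely hypothetical targeting spec - field names and segment labels are
# invented for illustration, not Facebook's actual Marketing API schema.
housing_ad = {
    "geo": {"city": "Chicago"},
    "interests": ["real estate listings", "first-time home buyers"],
    # The problem ProPublica demonstrated: a housing advertiser could
    # attach an exclusion list built from 'ethnic affinity' segments,
    # removing entire racial groups from the ad's reach.
    "excluded_segments": ["ethnic_affinity_segment_a",
                          "ethnic_affinity_segment_b"],
}

def build_audience(users, spec):
    """Match users on geo and interests, then drop anyone in an excluded segment."""
    matched = [u for u in users
               if u["city"] == spec["geo"]["city"]
               and any(i in u["interests"] for i in spec["interests"])]
    return [u for u in matched
            if not set(u["segments"]) & set(spec["excluded_segments"])]
```

The point isn’t the code, it’s the shape of the operation: exclusion is a first-class step in audience building, so any segment the system can infer, an advertiser can subtract.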
Facebook vowed to fix it and outlined a range of measures it was taking to address the issue, but more recently, ProPublica went through the process again and found that it was still possible to exclude certain racial groups from ad targeting.
That’s led to Facebook’s latest suspension of the option, with Facebook COO Sheryl Sandberg noting that the company was determined to ‘do better’.
But there’s a question as to whether this is even possible – whether such profiling can ever be completely eliminated, given the depth of Facebook’s data-driven ad system.
For example, back in 2015, researchers from the University of Cambridge and Stanford University released a report which looked at how people’s Facebook activity could be used to infer their psychological profile.
What they found was pretty amazing. Using the results of a 100-question psychological study, completed by more than 86,000 participants through an app and mapped against their respective Facebook likes, the researchers developed a system which could then, based on Facebook activity alone, determine a person’s psychological make-up more accurately than their friends, their family – better even than their partners.
I interviewed one of the report’s co-authors, Dr. Michal Kosinski, who explained the depth of the data they found, and the insights they were able to glean.
“There are many intimate traits that are predictable from your digital footprint: smoking, drinking, taking drugs, sexual orientation, religious and political views, and so on. Actually, everything we tried predicting was predictable, to a degree, and quite often it was very accurate.”
Kosinski explained how your Facebook like profile, compared against a large enough sample, could be highly indicative. Aside from sexual orientation or personal habits (they could detect smokers even when they lied about smoking on the test), they could even predict whether your parents were divorced, based solely on your Facebook likes.
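For a rough sense of how that kind of prediction works mechanically, here’s a minimal sketch, assuming the general approach the researchers describe: treat each user as a binary vector of page likes, compress it, and fit a regression per trait. The data below is randomly generated and the parameters are invented; the actual study worked from the likes of its 86,000-plus participants.

```python
# Minimal sketch of likes -> trait prediction: users as binary like-vectors,
# dimensionality reduction, then a linear model per trait. All data here is
# randomly generated; parameters are invented for illustration.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

n_users, n_pages = 1000, 5000
rng = np.random.default_rng(0)

# Binary user x page matrix: 1 where a user has liked a page.
likes = (rng.random((n_users, n_pages)) < 0.01).astype(float)
# Trait scores from a questionnaire, e.g. openness on a 1-5 scale.
openness = rng.normal(3.5, 0.7, n_users)

# Compress the sparse like matrix into dense components, then regress.
model = make_pipeline(TruncatedSVD(n_components=50, random_state=0),
                      Ridge(alpha=1.0))
model.fit(likes[:800], openness[:800])
predicted = model.predict(likes[800:])
```

In the study itself, accuracy was measured by correlating the model’s predicted trait scores with participants’ questionnaire results – and that correlation beat the judgments of friends and family.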
Given this, it’s safe to assume that a level of discriminatory, intrusive ad targeting is always going to be possible via Facebook’s system, whether directly or indirectly.
Of course, it’s the direct targeting that’s the real problem: there shouldn’t be a conscious option for advertisers to target, or exclude, specific ethnic groups, or people based on certain personal traits. But given the complexity of Facebook’s data systems, that is, and always will be, possible to a degree for those who know how to use them.
This is how Russian operatives were able to home in on specific subsets of US voters with their 2016 election ads – with so many data options available, there are ways to narrow your focus and reach the people most receptive to your messaging. That’s a problem, as highlighted by reports of foreign actors inspiring protests and counter-protests in US cities to further drive division.
It’s concerning, and clearly harmful. But again, it’s entirely possible.
Facebook faced similar issues when it introduced its in-depth Graph Search option back in 2013.
Graph Search was Facebook’s big attempt at tapping into the search market – rather than showing you relevant search results based on overall web queries, Graph Search could provide you with contextual information, based on what your friends like. Search for ‘Restaurants in London that my friends have been to’ and Graph Search provides relevant, personalized results.
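Conceptually, a query like that is just a join across the social graph. Here’s a loose sketch of the idea in Python – the data structures are invented for illustration, and say nothing about how Facebook actually implemented it:

```python
# Loose sketch of a "restaurants in London my friends have been to" query.
# Data structures are invented; this just shows the contextual-join idea.
friends = {"me": {"ana", "ben", "cara"}}
checkins = {
    "ana":  [{"place": "Dishoom", "city": "London", "type": "restaurant"}],
    "ben":  [{"place": "Flat Iron", "city": "London", "type": "restaurant"}],
    "cara": [{"place": "MoMA", "city": "New York", "type": "museum"}],
}

def graph_query(user, city, place_type):
    """Places of a given type in a given city that the user's friends visited."""
    results = set()
    for friend in friends[user]:
        for visit in checkins.get(friend, []):
            if visit["city"] == city and visit["type"] == place_type:
                results.add(visit["place"])
    return sorted(results)

print(graph_query("me", "London", "restaurant"))  # ['Dishoom', 'Flat Iron']
```

The intrusive potential follows directly from the same shape of query: swap ‘restaurants’ for any other attribute the graph stores, and the join still works.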
And Graph Search was great; there were a heap of applications for it. But there were also a lot of privacy concerns, especially after users worked out intrusive ways in which graph queries could be used.
Graph Search has since been de-emphasized due to its potential for intrusive use, but the project again highlights just how much data Facebook holds on its 2 billion users, and how that information can be used in nefarious ways by those with enough know-how.
For advertisers, this is great; it’s what the platform was built for – the most in-depth targeting engine ever created. But in terms of privacy and potential discrimination – and violations of existing civil rights laws – there are clear concerns.
Facebook certainly needs to do all it can to address such issues, but specific ad options like this are really only a small part of the platform’s wider discriminatory capacity. You’re just less likely to hear about the others, because they’re not spelled out so clearly – yet they exist all the same.