Facebook’s Regulatory Reckoning



(This is a guest post by my coauthor on Modern Monopolies, Nicholas Johnson.)

What a mess.

Facebook once again finds itself at the heart of a media firestorm after revelations that Cambridge Analytica had unauthorized access to data on 50 million users.

To some extent, this issue has been massively blown out of proportion. This story sits at the intersection of two things much of the media loves to hate – Trump and Facebook.

It’s arguable whether this would ever have become a major news story if the subtext weren’t that this data had allegedly been used to elect Trump. Nobody seemed to care when President Obama’s campaign used the same loophole to collect data on and target tens of millions of voters. (Note: I am not a Trump supporter, and I voted for Obama.)

And when you combine that with Facebook, which the media loves to hate given its own heavy dependence on the platform, you have the perfect storm.

Still, where there’s smoke, there’s fire.

Facebook sits at an interesting intersection of private enterprise and public governance. As noted in our book Modern Monopolies, a more naive Zuckerberg once talked about Facebook’s governance standards as akin to making public policy. Former Twitter Head of Platform Ryan Sarver has expressed a similar view. Facebook’s platform is both a private company and a public good. In other words, to use a familiar term, Facebook has become systemically important.

Years ago, Facebook fixed the original loophole in its developer program that allowed Cambridge Analytica to gain access to this data. But when a single, intentional design choice by one company can cause tens of millions of users to lose control of their data, that’s a serious issue.

Data and the Brave New Online World

What the news cycle gets wrong is that this isn’t just an issue about Facebook. If you “delete Facebook,” you’ve still effectively changed nothing. Unless you stop using the Internet, you will still be exposed to the same problems almost everywhere you go.

Data security isn’t a challenge that’s unique to Facebook or to platform businesses. As Equifax and Yahoo showed, any online business needs to take data security seriously. Most don’t, because there is little oversight, few standards, and almost no enforcement around the issue.

With platforms, though, the issue is scale. They have far more data about users than smaller linear companies like Equifax and Yahoo; this is, in fact, the same thing that attracts advertisers to them. Not only do they have account-level metadata, they have a seemingly endless amount of behavioral data on every user. Again, this is exactly what enables the microtargeting that advertisers love.

Facebook had every incentive to be as open as possible with user data. The more data it could collect on users and share with advertisers, the more money it could make. Facebook may not have intentionally turned a blind eye to abuses like the one that occurred with Cambridge Analytica, but it certainly was incentivized not to look too closely if it didn’t have to.

This data ‘breach’ (even though Facebook is determined not to call it that) illustrates an important issue for all online businesses: the asymmetry of information between a business and its users, particularly regarding user data. Again, for platforms, scale just pours gasoline on the fire. Modern monopolies like Facebook and Google effectively own the supply of information on hundreds of millions of users.

When the concerns of the platform don’t align with those of its users, as has often been the case with user privacy, users have little effective recourse. Facebook’s cavalier attitude toward user privacy, with the default sharing setting almost always being “open,” has been great for its bottom line but, as is becoming clearer, not so great for users. It exposes them to misuse and manipulation of that very same data, a problem Facebook is now being forced to confront.

Should Facebook Be Regulated?

As with any systemically important company, the question isn’t really whether it should be regulated, but how. Even Mark Zuckerberg acknowledged as much last week in his very belated mea culpa to Facebook users.

So where should regulation start? Number one would be transparency around how data is used. Users should be able to see who holds what data on them online. Most would likely be shocked by the sheer amount of data about them that exists digitally. The challenge is that the private sector is unlikely ever to enforce this effectively. Time and again, users have shown that while they may voice concerns about privacy, they almost never act on them. There’s a serious mismatch between what people say and what they do when it comes to online privacy.

However, there are good public policy reasons to want to protect user data online. Election interference is only one example, but it’s a case in point. The idea of a “data protection agency” has been floated on social media, and it may be a good place to start. Such an agency would regulate all online businesses, much as the SEC regulates investment banks. But as with banks, most of the focus will likely start with the systemically important companies.

Data is the “capital” of the online world, and modern monopolies like Facebook probably deserve closer monitoring and tighter restrictions on what they can do with user data.

As declining user growth shows, Facebook may not be “too big to fail,” but it can certainly create a big mess whenever it does.


