Facebook’s Plan for 2020 Is Too Little Too Late, Critics Say


Mark Zuckerberg didn’t mince words on a call with reporters Monday: “The bottom line here is that elections have changed significantly since 2016, and Facebook has changed too.”

It’s true, the days of Zuckerberg arguing that filter bubbles are worse in the real world than on Facebook, and dismissing the notion that social media could influence the way people vote as a “pretty crazy idea,” are long gone. Facebook, he said, has gone from being “on our back foot” to proactively seeking out threats and fighting coordinated influence operations ahead of the 2020 US presidential election.

As proof, he pointed to the slew of new efforts the company announced Monday to combat election interference and the spread of disinformation, describing the initiatives as one of his “top priorities.” But critics say he’s missing the point.

Disinformation and media manipulation researchers say Facebook’s announcements Monday left them frustrated and concerned about 2020. Though the policy updates show that Facebook understands that misinformation is a serious problem that can no longer be ignored, that message was undercut by the company’s reluctance to fully apply its own rules, particularly to politicians. What’s more, they say the new election integrity measures are riddled with loopholes and still fail to get at many of the most pressing issues they had hoped Facebook would address by this time.

“All of the tactics that were in play in 2016 are pretty much still out there,” says Joan Donovan, head of the Technology and Social Change Research Project at Harvard Kennedy School’s Shorenstein Center.


Among the features announced Monday were new interstitials—notices that appear in front of a post—that warn users when content in their Instagram or Facebook feeds has been flagged as false by outside fact checkers. Donovan says it makes sense to use a digital speed bump of sorts to restrict access to inaccurate content, but the notices may have the opposite effect.

“The first accounts that they choose to enforce that policy on are going to get a lot of attention,” from both the media and curious users, she explained. “We have to understand there’s going to be a bit of a boomerang effect.” She says “media manipulators” will test the system to see how Facebook responds, “and then they will innovate around them.”

Facebook did not respond to inquiries about when or where the feature would be rolled out, or whether it would apply to all content that had been rated partly or completely false by third-party fact-checkers.

Donovan says she’s not sure if the feature’s potential benefits are worth the risks of amplification, particularly since Facebook may not be able to identify and flag misleading content before it reaches people. “Taking it down two days later isn’t helpful,” nor is hiding it behind a notice, she says, “especially when it’s misinformation that’s traveling on the back of a viral news story, where we know that the first eight hours of that news story are the most consequential for people making assessments and bothering to read what the story is even about.”

Also Monday, Facebook said it would attach new labels to pages or ads run by media outlets that it deems to be “state-controlled,” like Russia Today. It said it will require some pages with a lot of US-based users to be more transparent about who’s running them—this will at first apply only to verified business pages, and later include pages that run ads on social issues, elections, or politics in the US. In addition, ads that discourage people from voting will no longer be permitted.


But researchers say that these measures are too little too late. “Every announcement like this, and all the recent publicity blitz has an undercurrent of inevitability,” says David Carroll, an associate professor at Parsons School of Design known for his quest to reclaim his Cambridge Analytica data. “It shows that they still need to show that they’re doing things. One advantage to these cosmetic things is that they look like they’re significant moves, but they’re really just like pretty small user interface tweaks.” But that’s not enough at this stage, he says.
