U.S. law is well established on the question of liability for social media platforms that host inappropriate, infringing, or violent content. Under §230 of the Communications Decency Act, online platforms enjoy broad immunity from liability for content posted by third parties. While there is no indication that Congress will revisit §230 immunity anytime soon, social media and tech companies may still need to change their policies on what content they host. At least that is the position of Brad Smith, the president of Microsoft.
Pressure on Social Media Companies to Change
In the wake of live-streamed mass shootings and other content deemed dangerous, some countries have considered laws that would put the onus on social media platforms to regulate content posted on their services. For example, after the Christchurch massacre in New Zealand, Australia passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019, which makes it a criminal offense for a platform to fail to remove abhorrent violent material “expeditiously.” Punishments include heavy fines and even imprisonment.
Because tech companies are global enterprises, corporations such as Facebook and Twitter will need to police content more stringently to comply with laws like Australia’s. Brad Smith recently addressed this issue in an interview with Reuters: he believes that Microsoft and other tech companies will need to adapt regardless of what U.S. law requires.
Smith is currently promoting his book, “Tools and Weapons: The Promise and the Peril of the Digital Age,” which argues that tech companies must be regulated. He is not alone in calling for regulation, however. In a Washington Post op-ed last March, Facebook CEO Mark Zuckerberg called on governments to set standards for how harmful content should be handled and removed.
The Push to Police Content
While Facebook and Twitter may not face liability in the U.S. anytime soon, the pressure to police content is increasing. Both major political parties have criticized the platforms for failing to remove biased and false information, putting social media sites in the awkward position of balancing free speech considerations against safety concerns.
Of course, calling for government regulation is easier than agreeing on the specifics. Australia’s law, for example, has been criticized as overly vague, potentially exposing platforms to criminal charges whenever violent content appears on their sites.
There are no easy answers. Smith, along with others in the tech industry, believes that tech companies cannot be trusted to regulate themselves. But with mass shootings, terrorist threats, and other dangerous content a horribly frequent occurrence, they may have to.