On Tuesday, the Southern Poverty Law Center (SPLC) released its fall 2017 edition of the education magazine “Teaching Tolerance.” One of the articles in that magazine focused on the radicalization of the white supremacist Dylann Roof, who shot and killed nine members of a black church in Charleston, S.C., in 2015. The SPLC drew some rather odd conclusions from Roof’s history, however. The organization suggested that Google and Facebook were to blame for refusing to censor websites and for allowing an Internet algorithm to radicalize Roof.
“A dangerous coupling of digital media trends held Roof’s hand as he walked the path of radicalization,” Cory Collins wrote in “Teaching Tolerance.” Roof’s “first guide: Google’s search algorithm, which fails to find the middle ground between protecting free speech and protecting its users from misinformation.”
Yes, the SPLC just blamed Google for the results of an Internet search. “Google remains vulnerable to sites that peddle propaganda, especially those with heavy web traffic that utilize tricks of the SEO (search-engine optimization) trade,” Collins added.
“Google has claimed its autocomplete function filters out offensive terms—and asserted its updated search algorithm ‘will help surface more high quality, credible content’ and bump down sites with hate speech and disinformation,” the SPLC writer noted. But the SPLC is dissatisfied with these measures, which already suggest a terrifying trend toward censoring Internet speech.
“Google’s algorithm is deaf to ‘dog whistles,’ or the coded language of white nationalist messaging and pseudoscientific racism. As a result, it fails to stem the flow of disinformation to the top of its search results. Since Google changed its algorithm and autocomplete function, NPR found searches for ‘black on white crime’ continued to call up ‘multiple white supremacist websites,’” Collins noted.
The SPLC writer alleged that “this gateway to hate, provided inadvertently by Google, becomes more of a threat when coupled with a second phenomenon that can lead young people like Dylann Roof toward radicalization: the filter bubble that refuses to burst.”
“In short, search engines like Google and social media sites like Facebook take a user’s information and web history,” Collins explained. “With that information, online agency becomes a façade; behind the curtain, a formula controls what users see and steers them toward content that confirms their likes, dislikes and biases. Practically, this helps advertisers. Consequentially, it replaces the open internet of diverse perspectives with an open door to polarization and radicalization.”
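Collins never spells out the “formula” he has in mind, but the mechanism he describes, re-ranking content by its similarity to a profile built from a user’s history, can be sketched in a few lines. The topic labels, weights, and scoring rule below are invented for illustration and are not Google’s or Facebook’s actual ranking code.

```python
# Hypothetical sketch of history-based personalization (not any real ranking system).
# Items and user profiles are bags of topic weights; items most similar to the
# profile rank first, so past interests shape what is shown next.
from collections import Counter

def score(item_topics, profile):
    """Dot product of an item's topics with the user's accumulated history."""
    return sum(w * profile.get(topic, 0.0) for topic, w in item_topics.items())

def rank(catalog, profile):
    """Order candidate items so the ones closest to the user's history come first."""
    return sorted(catalog, key=lambda name: score(catalog[name], profile), reverse=True)

# Toy catalog and a profile built from prior clicks (all values invented).
catalog = {
    "mainstream_news": Counter(news=1.0),
    "niche_forum":     Counter(news=0.3, fringe=0.7),
    "extreme_site":    Counter(fringe=1.0),
}
profile = Counter(fringe=2.0, news=0.5)   # a history that already skews toward fringe topics
print(rank(catalog, profile))             # ['extreme_site', 'niche_forum', 'mainstream_news']
```

Under this kind of scoring, the more a history leans one way, the harder it is for anything outside that lean to surface, which is the “filter bubble that refuses to burst.”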
The SPLC writer quoted James Hawdon, director of Virginia Tech’s Center for Peace Studies and Violence Prevention. “Each link, every time you click on something, it’s basically telling the algorithm that this is what you believe in. The next thing you know … you just get led into this rabbit hole of increasingly extreme ideas.”
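Hawdon is describing a feedback loop: each click feeds back into the profile that produced the recommendation, so the next recommendation is pulled further in the same direction. A minimal drift simulation, built on the invented assumption that the system serves content just beyond the user’s current position and that each click moves the user toward it, shows how small steps compound.

```python
# Toy feedback-loop simulation (every number and rule here is invented for illustration).
# Position 0.0 stands for mainstream content, 1.0 for the most extreme content.
user_position = 0.1
for step in range(8):
    # Assumed behavior: the recommender serves content a bit beyond the user's position.
    recommended = min(1.0, user_position + 0.15)
    # Assumed behavior: clicking pulls the user's position halfway toward that content.
    user_position += 0.5 * (recommended - user_position)
    print(f"step {step}: recommended={recommended:.2f}, user={user_position:.2f}")
```

Each pass moves the simulated user a little further out, and after a handful of iterations the starting point is far behind; that compounding is the “rabbit hole” Hawdon describes.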