The suit is the latest to allege that YouTube’s software, which can automatically remove videos suspected of violating the company’s policies, discriminates against certain groups, such as LGBT people. It comes during a national reckoning over racial discrimination in which companies such as Google have promised to push for change.
YouTube uses its “absolute, and ‘unfettered’ control over access to approximately 95 percent of all video content that is available to the public,” the lawsuit alleges, to “rig the game, by using their power to restrict and block Plaintiffs and other similarly situated competitors, based on racial identity or viewpoint discrimination for profit.”
YouTube spokesman Farshad Shadloo said the company is reviewing the complaint. “We’ve gone to extraordinary lengths to build our systems and enforce our policies in a neutral, consistent way,” he wrote. He said the company’s automated systems do not discriminate based on race. In the past, YouTube has said that its algorithmic approach to content moderation is protected under the law.
Catherine Jones, creator of the YouTube channel Carmen CaBoom, said the platform removed the channel, alleging nudity. But none of her videos contained nudity, the lawsuit says. Other videos Jones produced were removed because of alleged hate speech, a designation the suit says is untrue.
Nicole Lewis, whose Nicole’s View channel earns $6,000 to $7,000 per year, says 17 of her videos were removed or archived for unknown reasons, according to the lawsuit. Kimberly Carleste Newman said 700 or more videos from her channel, the True Royal Family, have disappeared, and she doesn’t know why or how to get them back, the lawsuit says. And Lisa Cabrera says her 4,423 videos have generated 20 million views, but 68 of them were removed with no explanation, according to the suit.
On Thursday, YouTube CEO Susan Wojcicki rejected the allegations during a Washington Post Live event. “It’s not like our systems understand race or any of those different demographics,” she said.
But she said the fairness of machine learning algorithms is a “huge area of work” across the industry. “We always want to make sure that our machines haven’t by accident learned something that isn’t what we intended,” she said. “If we ever find that it did, then we will retrain our machines to make sure that they now have the right, that whatever that issue was has been removed from the training set of our machines.”
YouTube said in response to the suit filed by LGBT YouTube creators last summer that its algorithms don’t discriminate against people based on their gender or race. That suit is ongoing.
The suit filed Tuesday cites a sworn declaration by another YouTube creator, Stephanie Frosch, who says YouTube officials told her in 2017 that the company’s content moderation algorithms do discriminate based on race.
Frosch is a plaintiff in the suit filed last summer. In the declaration, Frosch says she was invited to YouTube’s headquarters in September 2017 to discuss alleged discrimination.
After asking Frosch to sign a nondisclosure agreement, YouTube representatives told Frosch that the company’s algorithms categorize creators based on their race, among other characteristics, she wrote. That information is used “when filtering and curating content and restricting access to YouTube services,” she says she was told. “The result is that the algorithm discriminates based on the identity of the creator or its intended audience when making what are supposed to be neutral content based regulations and restrictions for videos that run on YouTube,” company officials told her, according to the declaration.
Peter Obstler, an attorney at Browne George Ross, represents Frosch and the black creators who filed the suit.
“Our automated systems are not designed to identify the race, ethnicity, or sexual orientation of our creators or viewers, and our policies are global, enforced by a global team,” said Shadloo, the YouTube spokesman.