For more than two years, the notion of social media disinformation campaigns has conjured up images of Russia’s Internet Research Agency, an entire company housed on multiple floors of a corporate building in St. Petersburg, concocting propaganda at the Kremlin’s bidding. But a targeted troll campaign today can come much cheaper—as little as $250, says Andrew Gully, a research manager at Alphabet subsidiary Jigsaw. He knows, because that’s the price Jigsaw paid for one last year.
As part of research into state-sponsored disinformation that it undertook in the spring of 2018, Jigsaw set out to test just how easily and cheaply social media disinformation campaigns, or “influence operations,” could be bought in the shadier corners of the Russian-speaking web. In March 2018, after negotiating with several underground disinformation vendors, Jigsaw analysts went so far as to hire one to carry out an actual disinformation operation, assigning the paid troll service to attack a political activism website Jigsaw had itself created as a target.
In doing so, Jigsaw demonstrated just how low the barrier to entry for organized, online disinformation has become. It’s easily within the reach of not just governments, but private individuals. Critics, though, say that the company took its trolling research a step too far, and further polluted social media’s political discourse in the process.
“Let’s say I want to wage a disinformation campaign to attack a political opponent or a company, but I don’t have the infrastructure to create my own Internet Research Agency,” Gully told WIRED in an interview, speaking publicly about Jigsaw’s year-old disinformation experiment for the first time. “We wanted to see if we could engage with someone who was willing to provide this kind of assistance to a political actor…to buy services that directly discredit their political opponent for very low cost and with no tooling or resources required. For us, it’s a pretty clear demonstration these capabilities exist, and there are actors comfortable doing this on the internet.”
Trolls Behind the Counter
In early 2018, Jigsaw hired a security firm to sniff around Russian-language black market and gray market web forums for disinformation-for-hire services. (That company asked WIRED not to name it, to preserve its ability to work on underground forums.) Browsing sites like Exploit, Club2Crd, WWH, and Zloy, the security firm’s researchers say they didn’t find explicit offerings of trolling or disinformation campaigns for sale, but plenty of related schemes like fake followers, paid retweets, and black hat search engine optimization. The team guessed, though, that more awaited beneath the surface.
“If we look at this as window shopping, we hypothesized that if someone was selling fake likes in the window, there’s probably something else behind the counter they might be willing to do,” says Gully. When the hired security firm’s researchers started chatting discreetly with those vendors, they found that a few did in fact offer mass-scale social media posting on political subjects as an unlisted service.
Before it bought one of those paid trolling campaigns, Jigsaw realized that it first needed a target. So together with its hired security firm, Jigsaw created a website—seeded with blog posts and comments they’d written to make it appear more real—for a political initiative called “Down With Stalin.” While the question of Stalin’s image sounds like a decades-old debate, it tapped into a live, ongoing argument in Russia about whether Stalin should be remembered as a hero or a criminal. (Partly due to the Kremlin’s rehabilitation efforts, polls show positive sentiment toward Stalin is at its highest in years.)
“The idea was to create a tempest in a teacup,” says one of the security firm staffers who worked on the project, explaining the decision to focus on a historical figure. “We wanted to be very careful, because we didn’t want too much tie-in to real-life issues. We didn’t want to be seen as meddling.”
To attack the site it had created, Jigsaw settled on a service called SEOTweet, a fake follower and retweet seller that also offered the researchers a two-week disinformation campaign for the bargain price of $250. Jigsaw, posing as a political adversary of the “Down With Stalin” site, agreed to that price and tasked SEOTweet with attacking the site. In fact, SEOTweet first offered to remove the site from the web altogether, by filing fraudulent complaints with the site’s web host claiming that the site hosted abusive content. The cost: $500. Jigsaw declined that more aggressive offer, but paid SEOTweet $250 to carry out its social media campaign, providing no further instructions.
Down With Stalin, Up With Putin
Two weeks later, SEOTweet reported back to Jigsaw that it had posted 730 Russian-language tweets attacking the anti-Stalin site from 25 different Twitter accounts, as well as 100 posts to forums and blog comment sections of seemingly random sites, from regional news outlets to automotive and arts-and-crafts forums. Jigsaw says a significant number of the tweets and comments appeared to be original posts written by humans, rather than simple copy-paste bots. “These aren’t large numbers, and that’s intentional,” says Jigsaw’s Gully. “We weren’t trying to create a worldwide disinformation campaign about this. We just wanted to see if threat actors could provide a proof of concept.”
Without any guidance from Jigsaw, SEOTweet assumed that the fight over the anti-Stalin website was actually about contemporary Russian politics and the country’s upcoming presidential election. “You simply don’t understand all that the president does for our country so that people can live better, and you armchair analysts can’t do anything,” read one Russian-language tweet posted by a fake user named @sanya2un1995, which included a photo of Stalin but clearly referred to Russian president Vladimir Putin. Another fake account wrote a post on a forum accusing the anti-Stalin site of “writing all kinds of nasty things about our president, supposedly he has everyone on their knees and is trying to bring back the USSR, but personally I think that’s not how it is, he is doing everything for us, for the common man.”
Strangely, neither Jigsaw nor the security firm it hired for the experiment was able to provide WIRED with more than a couple of samples of the campaign’s posts, citing a lack of records from the year-old experiment. The 25 Twitter accounts used in the campaign have all since been suspended by Twitter.
WIRED tried reaching out to SEOTweet via its website, seo-tweet.ru, which currently advertises the services of a self-professed marketing and cryptocurrency entrepreneur named Markus Hohner. But Hohner didn’t respond to a request for comment.
Blowback
Even as it exposes the potential for cheap, easily accessible trolling campaigns, Jigsaw’s experiment has also drawn criticism of the company itself. Jigsaw, after all, didn’t just pay a shady service for a series of posts that further polluted political discourse online. It did so with messages in support of one of the worst genocidal dictators of the 20th century, not to mention the unsolicited posts in support of Vladimir Putin.
“Buying and engaging in a disinformation operation in Russia, even if it’s very small, that in the first place is an extremely controversial and risky thing to do,” says Johns Hopkins University political scientist Thomas Rid, the author of a forthcoming book on disinformation titled Active Measures.
Even worse may be the potential for how Russians and the Russian media could perceive—or spin—the experiment, Rid says. The subject is especially fraught given Jigsaw’s ties to Alphabet and Google. “The biggest risk is that this experiment could be spun as ‘Google meddles in Russian culture and politics.’ It fits anti-American clichés perfectly,” Rid says. “Didn’t they see they were tapping right into that narrative?”
But Jigsaw chief of staff Dan Keyserling stands by the research, pointing out that the actual content it generated represents an insignificant drop in the social media bucket. “We take every precaution to make sure that our research methods minimize risk,” Keyserling says. “In this case, we weighed the relatively minor impact of creating fake websites and soliciting this kind of small scale campaign against the need to expose the world of digital mercenaries.”
To what degree the Jigsaw experiment really exposed that practice, however, deserves scrutiny, says Alina Polyakova, a disinformation-focused fellow at the Brookings Institution. She supports the idea of the research in theory, but notes that Jigsaw never published its results—and still hasn’t, even now.
“I don’t think policymakers or your average citizen gets how dangerous this is, that the cost of entry is so low,” Polyakova says. “As an experiment, I don’t think this is a problem. What I do think is a problem is not actually publicizing it.” Jigsaw’s staff concedes that they didn’t publish their results, or even publicize the experiment until now—in part, they say, to avoid revealing anything about their research methodology that would inhibit their security firm partners’ ongoing work in Russian-language underground markets. But Jigsaw says it did use the experiment’s results to inform its work on detecting disinformation campaigns, as well as in a summit it held on disinformation in Ukraine in late 2018, ahead of the Ukrainian presidential election.
Jigsaw wouldn’t be the first to court controversy for flirting with the disinformation dark arts. Last year, the consultancy New Knowledge acknowledged that it had experimented with disinformation targeted at conservative voters ahead of Alabama’s special election to fill an open Senate seat. Eventually, internet billionaire Reid Hoffman apologized for funding the group that had hired New Knowledge and sponsored its influence operation test.
The Jigsaw case study has at least proven one point: The incendiary power of a disinformation campaign is now accessible to anyone with a few hundred dollars to spare, from a government to a tech company to a random individual with a grudge. That means you can expect those campaigns to grow in number, along with the toxic fallout for their intended victims, and in some cases for the actors caught carrying them out, too.