Russia Is Learning How to Bypass Facebook’s Disinfo Defenses
Credit to Author: Lily Hay Newman | Date: Thu, 05 Mar 2020 14:00:00 +0000
Since Russia’s stunning influence operations during the 2016 United States presidential race, state and federal officials, researchers, and tech companies have been on high alert for a repeat performance. With the 2020 election now just eight months away, though, newly surfaced social media posts indicate that Russia’s Internet Research Agency is adapting its methods to circumvent those defenses.
In September, University of Wisconsin researcher Young Mie Kim started analyzing posts on Facebook and Instagram from 32 accounts connected to the IRA. Within weeks, in October, Facebook announced page, group, and account takedowns related to Iranian and Russian disinformation efforts. And accompanying research from the social media analysis firm Graphika corroborated that 31 of the 32 accounts Kim had been observing were Russia-linked. But Kim's findings, published for the first time today, reveal more about how the IRA has evolved its tactics—and how it may be continuing to do so.
“Despite the increased transparency measures by top platforms, it looks like the Russians are taking advantage of loopholes to try and circumvent the tech platforms’ defenses,” Kim told WIRED. “They’ve improved their mimicry behaviors, and because of their evolving tactics, it’s increasingly more difficult to detect these foreign actors. So I think we should be very wary of that.”
Ahead of the 2016 election, the IRA built up pages with massive followings that often invented personas or grassroots organizations—complete with logos and other marketing material. As digital platforms began scanning for indicators of what Facebook calls “coordinated inauthentic behavior,” though, the IRA seems to have changed gears. The posts Kim analyzed in September were more focused on impersonating real domestic US organizations or claiming a connection to them, seemingly to borrow legitimacy and hide in plain sight.
A racially charged Instagram account called "iowa.patriot" posted an anti-Elizabeth Warren meme in August that said, "If white privilege existed, why did Elizabeth Warren have to spend decades lying about her ethnicity to get ahead?" Beneath the words was a banner logo taken from a US advocacy group. (Kim redacted references to real people and organizations.) In July, the same account also posted a map of the US made of bacon titled "Sharia Free Zone."
The accounts Kim looked at mainly targeted battleground states like Arizona, Florida, Michigan, Ohio, and Wisconsin.
The IRA focused most of its content on the same divisive issues as in 2016, like racial identity, anti-immigrant and anti-Muslim sentiment, nationalism, patriotism, religious topics, and gun rights. And similar to 2016, Kim noticed campaigns promoting a range of ideological views. But she also saw the campaigns evolve to stay current, including an increase in tailored feminist and anti-feminist content.
For example, an Instagram account called "feminist_agenda_" posted an illustration in September depicting disembodied hands with different skin tones giving the middle finger under the words "if your feminism doesn't include queer, black, poor, disabled, trans and muslim women, its not feminism. #womensmarch"
Kim noticed another pivot on commerce pages. In 2016, the IRA set up some accounts that claimed to sell items like T-shirts emblazoned with political slogans. But in September, she instead saw evidence of Russia-owned commerce pages hawking benign, neutral items. Instead of directly conducting influence operations through merchandise, the IRA seemed to be using the commerce pages as a way to legitimize and promote its other accounts and posts.
Facebook's October takedowns indicate that even as the IRA evolves its tactics, the platform can still eventually spot many of its nefarious campaigns. And the company said in October that the most novel thing about the network it took down was an expanded effort to conceal Russian links to the accounts themselves. But as far back as July 2018, Facebook also acknowledged the challenges it faces in staying ahead of the curve.
“We're glad to see researchers do further analysis on our past takedowns,” a Facebook company spokesperson said in a statement. “Last October, we removed this Russia-linked network, which appeared to be in its early stages using tactics we've observed before. We will keep evolving our defenses and announcing these foreign influence campaigns, as we did more than 50 times last year."
As good as the IRA has gotten at avoiding detection, those same steps make it harder for its accounts to stand out and gain followers. Ben Nimmo, director of investigations at Graphika, notes that by making their posts and personas more generic and homespun for the US audience, Russian trolls have limited their own reach.
“What would always give them away was bad English," Nimmo says. "So what we saw in October was they started copy/pasting an awful lot of their content from online sources or blogs or Wikipedia. On Instagram a lot of what they were posting were things like screenshots of tweets from real Americans to blend in. But the big difference was, before they had real personalities, they were snarky, they were really good at the internet. When you're copying someone else there's no personality. They created generic, flat personalities, so almost all of the accounts had a much lower following."
Nimmo also echoes Facebook’s point that part of the IRA’s effort to be more inconspicuous involved operational security improvements like connecting accounts to VoIP phone numbers, routing payments through the US, and masking the origin of web traffic.
Neither Kim nor Nimmo would speculate about what the IRA has been up to since October. Both say that the agency's influence operations are likely ongoing in some capacity, given past precedent and comments from US officials. But Nimmo points out that the October takedowns likely set the IRA back.
“If you have a bunch of assets that have been disrupted, the first thing you have to do is rebuild an audience. They have to grow their audience slowly enough that they don’t get noticed, but quickly enough that they have it in place in advance of Election Day,” he says. “The logical thing for them to do would be to try to work out how they got caught and try and work out what they can do differently.”
US officials see foreign meddling as an active threat leading up to Election Day and have warned both Congress and the public repeatedly about the risk. But when it came to Super Tuesday at least, the Department of Homeland Security said this week that it did not see a significant surge in election-related disinformation across digital platforms. Kim cautions, though, that more sophisticated tactics could still be slipping past social media platforms’ defenses every day.
“We have to think about the general approach the tech platforms are taking right now,” she says. “They’re basically fighting against foreign actors, but when the line between foreign and domestic is blurred, it’s really difficult to enforce that rule.”
All images are from Instagram (September 2019). The posts and identified accounts were later taken down by the company for links to the Internet Research Agency. Identifying details of non-IRA parties, including domestic political groups’ logos, the faces of ordinary citizens, and comments by non-IRA users, have been redacted.