By Patrick Warren and Darren Linvill
Illustrations by Chris Koelle





Here’s what Russia’s 2020 disinformation operations look like, according to two Clemson experts on social media and propaganda



Darren Linvill (left) and Patrick Warren (right)



Internet trolls don’t troll. Not the professionals, at least. Professional trolls don’t go on social media to antagonize liberals or belittle conservatives. They are not narrow-minded, drunk or angry. They don’t lack basic English language skills. They certainly aren’t “somebody sitting on their bed that weighs 400 pounds,” as the president once put it. Your stereotypical trolls do exist on social media, but the amateurs aren’t a threat to Western democracy.
Professional trolls, on the other hand, are the tip of the spear in the new digital, ideological battleground. To combat the threat they pose, we must first understand them — and take them seriously.
On August 22, 2019, @IamTyraJackson received almost 290,000 likes on Twitter for a single tweet. Put in perspective, the typical tweet President Trump sends to his 67 million followers gets about 100,000 likes. That viral tweet by @IamTyraJackson was innocent: an uplifting pair of images of former pro football player Warrick Dunn and a description of his inspiring charity work building houses for single mothers. For an anonymous account that had existed for only a few months, “Tyra” knew her audience well. Warrick’s former coach, Tony Dungy, retweeted it, as did the rapper and producer Chuck D. Hundreds of thousands of real users viewed Tyra’s tweet and connected with its message. For “Tyra,” however, inspiring messages like this were a tool for a very different purpose.
The purpose of the Tyra account, we believe, was not to spread heartwarming messages to Americans. Rather, the tweet about Warrick Dunn was really a Trojan horse to gain followers in a larger plan by a foreign adversary. We think this because we believe @IamTyraJackson was an account operated by the successors to Russia’s Internet Research Agency (IRA). Special Counsel Robert Mueller indicted the IRA for waging a massive information war during the 2016 U.S. election. Since then, the IRA seems to have been subsumed into Russia’s Federal News Agency, but its work continues. In the case of @IamTyraJackson, the IRA’s goal was two-fold: Grow an audience in part through heartwarming, inspiring messages, and use that following to spread messages promoting division, distrust and doubt.
We’ve spent the past two years studying online disinformation and building a deep understanding of Russia’s strategy, tactics and impact. Working from data Twitter has publicly released, we’ve read Russian tweets until our eyes bled. Looking at a range of behavioral signals, we have begun to develop procedures to identify disinformation campaigns and have worked with Twitter to suspend accounts. In the process we’ve shared what we’ve learned with people making a difference, both in and out of government. We have experienced a range of emotions studying what the IRA has produced, from disgust at their overt racism to amusement at their sometimes self-reflective humor. Mostly, however, we’ve been impressed.



Tweets are framed to serve Russia’s interests in undermining Americans’ trust in our institutions.



Professional trolls are good at their job. They have studied us. They understand how to harness our biases (and hashtags) for their own purposes. They know what pressure points to push and how best to drive us to distrust our neighbors. The professionals know you catch more flies with honey. They don’t go to social media looking for a fight; they go looking for new best friends. And they have found them.
Disinformation operations aren’t typically fake news or outright lies. Disinformation is most often simply spin. Spin is hard to spot and easy to believe, especially if you are already inclined to do so. While the rest of the world learned how to conduct a modern disinformation campaign from the Russians, it is from the world of public relations and advertising that the IRA learned their craft. To appreciate the influence and potential of Russian disinformation, we need to view them less as Boris and Natasha and more as Don Draper.
As good marketers, professional trolls manipulate our emotions subtly. In fall 2018, for example, a Russian account we identified called @PoliteMelanie re-crafted an old urban legend, tweeting: “My cousin is studying sociology in university. Last week she and her classmates polled over 1,000 conservative Christians. ‘What would you do if you discovered that your child was a homo sapiens?’ 55% said they would disown them and force them to leave their home.” This tweet, which suggested conservative Christians are not only homophobic but also ignorant, was subtle enough to not feel overtly hateful, but was also aimed directly at multiple cultural stress points, driving a wedge at the point where religiosity and ideology meet. The tweet was also wildly successful, receiving more than 90,000 retweets and nearly 300,000 likes.
This tweet didn’t seek to anger conservative Christians or to provoke Trump supporters. She wasn’t even talking to them. Melanie’s 20,000 followers, painstakingly built, weren’t from #MAGA America (Russia has other accounts targeting them). Rather, Melanie’s audience was made up of educated, urban, left-wing Americans harboring a touch of self-righteousness. She wasn’t selling her audience a candidate or a position — she was selling an emotion. Melanie was selling disgust. The Russians know that, in political warfare, disgust is a more powerful tool than anger. Anger drives people to the polls; disgust drives countries apart.
Accounts like @IamTyraJackson have continued @PoliteMelanie’s work. Professional disinformation isn’t spread by the account you disagree with — quite the opposite. Effective disinformation is embedded in an account you agree with. The professionals don’t push you away, they pull you toward them. While tweeting uplifting messages about Warrick Dunn’s real-life charity work, Tyra, and several accounts we associated with her, also distributed messages consistent with past Russian disinformation. Importantly, they highlighted issues of race and gender inequality. A tweet about Brock Turner’s Stanford rape case received 15,000 likes. Another about police targeting black citizens in Las Vegas was liked more than 100,000 times. Here is what makes disinformation so difficult to discuss: while these tweets point to valid issues of concern — issues that have been central to important social movements like Black Lives Matter and #MeToo — they are framed to serve Russia’s interests in undermining Americans’ trust in our institutions.





These accounts also harness the goodwill they’ve built by engaging in these communities for specific political ends. Consistent with past Russian activity, they attacked moderate politicians as a method of bolstering more polarizing candidates. Recently, former Vice President Joe Biden has been the most frequent target of this strategy, as seen in dozens of tweets such as, “Joe Biden is damaging Obama’s legacy with his racism and stupidity!” and “Joe Biden doesn’t deserve our votes!”
The quality of Russia’s work has been honed over several years and millions of social media posts. They have appeared on Instagram, Stitcher, Reddit, Google+, Tumblr, Medium, Vine, Meetup and even Pokémon Go, demonstrating not only a nihilistic creativity, but also a ruthless efficiency in volume of production. The IRA has been called a “troll farm,” but they are undoubtedly a factory.
While personas like Melanie and Tyra were important to Russian efforts, they were ultimately just tools, interchangeable parts constructed for a specific audience. When shut down, they were quickly replaced by other free-to-create, anonymous accounts. The factory doesn’t stop. They attack issues from both sides, attempting to drive mainstream viewpoints in polar and extreme directions.
In a free society, we must accept that bad actors will try to take advantage of our openness. But we need to learn to question our own and others’ biases on social media. We need to teach — to individuals of all ages — that we shouldn’t simply believe or repost anonymous users because they used the same hashtag we did, and neither should we accuse them of being a Russian bot simply because we disagree with their perspective. We need to teach digital civility. It will not only weaken foreign efforts, but it will also help us better engage online with our neighbors, especially the ones we disagree with.



They know what pressure points to push and how best to drive us to distrust our neighbors.



Russian disinformation is not just about President Trump or the 2016 presidential election. Did they work to get Trump elected? Yes, diligently. Our research has shown how Russia strategically employed social media to build support on the right for Trump and lower voter turnout on the left for Clinton. But the IRA was not created to collude with the Trump campaign. They existed well before Trump rode down that escalator and announced his candidacy, and we assume they will exist in some form well after he is gone. Russia’s goals are to further widen existing divisions in the American public and decrease our faith and trust in institutions that help maintain a strong democracy. If we focus only on the past or future, we will not be prepared for the present. It’s not about election 2016 or 2020.
The IRA generated more social media content in the year following the 2016 election than the year before it. They also moved their office into a bigger building with room to expand. Their work was never just about elections. Rather, the IRA encourages us to vilify our neighbor and amplify our differences because, if we grow incapable of compromising, there can be no meaningful democracy. Russia has dug in for a long campaign. So far, we’re helping them win.





Tips for Avoiding Online Disinformation

1   Be careful whose content you share.

Remember, bad actors aren’t always the account you disagree with. All too often, online trolls pretend to be your friend. Just because someone says they share your beliefs doesn’t necessarily mean it is true.

2   Be wary of anonymous strangers.

The rules in the digital world are the same as the real world. In the real world, strangers who hide their identity are probably less likely to have your best interest at heart. The same is true online.

3   Double check information before you share it.

There are many nonpartisan fact-checking websites you can visit, including snopes.com and politifact.com. Misinformation can’t spread if we all take a breath and check our sources. Also, reverse image search tools, such as tineye.com, can help you trace an image to see whether it has appeared anywhere else online.

4   Don’t rely on social media for your news.

Most Americans now get their news from social media, and this often means they are getting their news in a bubble (and bad actors rely on this!). Try visiting news aggregators, like Google News, that offer users a range of sources on every story.


Story Extras



Want to find out how good you are at spotting disinformation online? Take Linvill and Warren’s “Spot the Troll” quiz.







Warren and Linvill have been published and quoted extensively in national media.













Darren Linvill is an associate professor of communication at Clemson. His work explores state-affiliated disinformation campaigns and the strategies and tactics employed on social media. Patrick Warren is an associate professor of economics at Clemson. Warren’s research focuses on the operation of organizations in the economy such as for-profit and non-profit firms, bureaucracies, political parties, armies, and propaganda bureaus.




Reprinted from RollingStone.com published November 25, 2019.  © Rolling Stone LLC.  All rights reserved.  Used by permission.



Gina C Wilkie says:

    Great article with good advice. I remember the article that was printed in Charlotte Magazine from Jan 2019 about your work and it was fascinating too. I would like some advice if you have it: I have two (adult) friends who delve into government “conspiracy theories” and are constantly posting their findings on Facebook. I call these posts conspiracy theories, but I don’t know if it’s real information or not. I am concerned about the amount of time investment my friends are spending researching this stuff. One friend told me she spends about 5 hours a day researching and reposting. It actually makes her anxiety flare up and she is on medication because of it. I have talked to her twice now about the amount of time she spends on-line and how it takes away from her ‘real life’. She has told me that she has a fear of missing out on the news if she doesn’t spend that time researching. I believe she’s addicted (I think both my friends who do this are addicted). Anyway, other than a friendly, heart-felt talk, do you have any advice for me as to how to help them regain their time and refocus their attention away from the computer? Thank you!

Patrick Warren says:

      Hey Gina,
I’m not an expert in interventions that affect belief in conspiracy theories, but the academic literature indicates that both informational and emotional interventions work. Information should be presented coolly and rationally and is more likely to be received well if it comes from someone they trust. Emotional interventions should try to cultivate an increased sense of control over the world, since conspiratorial thinking often derives from a sense that the world is out of control. Focus your friends’ attention on things in their lives that they have control over and can actually make progress on. If you’re busy improving your own local community, family or self, it’s hard to have much time to think about shadowy conspiracies.
      Good luck!
      PLW

