News

How To Spot Russian Trolls Online Ahead Of 2020 Election


Posted on:


As the end of the year approaches, social media is flooded with inspirational memes and messages.

But have you ever thought about who is creating these messages? If not, Clemson University researchers suggest you start considering who is posting the content you share online and why.

Many of these inspirational posts are part of the latest attempt from Russia to sow discord and discontent among Americans ahead of the 2020 presidential election, researchers say.

Two researchers at Clemson — Darren Linvill, associate professor of communication, and Patrick Warren, associate professor of economics — have studied the strategy and tactics of professional trolls.

The duo has looked at the methods of Russia’s former Internet Research Agency, which has been absorbed by the country’s Federal News Agency.

“It’s not a drunk teenager in the basement. It’s really more like a Russian Don Draper,” Linvill says. “It’s an ongoing guerrilla marketing campaign. These are professionals. They know what they’re doing and they’re really good at their job.”

Americans shouldn’t count on the Department of Justice to stop Russian trolls, Warren says.

People are starting to understand that they need to question whether the information they read on social media is true, but he says that’s not enough to stop trolls from gaining influence.

To stop trolls from exploiting existing tensions in American society, he says people need to question why we’re seeing certain messages and the consequences of sharing them before hitting retweet.

“I think that there’s a lot that you can do,” Warren says. “If you’re mindful of the origins of the information you’re sharing, it can make a big difference.”

Interview Highlights

On how trolls gain influence and the Russian troll Twitter account @IamTyraJackson

Warren: “These trolls, they go through kind of a life cycle. And the first step in that life cycle is to introduce themselves. There’s some community out there they’re trying to become a part of in order to try to influence the members of that community. And so the way you introduce yourself is you’d post something that people in that community are going to find interesting. They’re going to be likely to share in order to then use that sort of clout that you’ve built within that community later on. And that account actually didn’t get that far. They were shut down before that happened.”

Linvill: “These Russian trolls, they don’t work to antagonize people like one might think. They’re entrenching people in ideology, not working to change ideology.”


On successful Russian troll Twitter account @PoliteMelanie

Linvill: “@PoliteMelanie built a brand. We saw her appear routinely in mainstream media. She was on CNN and Al Jazeera in posts online, and she even won the Chicago Tribune’s Tweet of the Week back in October of 2018 that was voted on by Chicago Tribune’s readers. And she often talked about important issues that Americans genuinely need to be talking about related to the Black Lives Matter movement and the #MeToo movement. But she would always frame these issues in particularly divisive ways. For instance, she and her friends encourage real Americans to go and find people. They might post a video with a very real incident of racism and they might advertise a phone number and certainly a name and encourage their followers to take action and possibly even violent action.”


On why it’s important to call out when troll accounts are Russian

Warren: “I think it matters because we care about people’s motivations. And so the goal of these accounts isn’t to have a conversation at all. The goal of these accounts is to cause social conflict online, to become real conflict in the world. And that seems dangerous.”


On how researchers determine when an account is a foreign troll

Linvill: “It’s exceedingly hard, but it started by reading Russian tweets until our eyes bled. And there’s a variety of signals that we look for from the aesthetics to more specifics and technical signals. For the layman, though, I think it doesn’t ultimately matter if it’s a Russian troll or an Iranian troll or a Chinese troll, I think one needs to be careful when you’re interacting with anonymous accounts not to retweet someone just because they use the same hashtag as you did and you agree with them, but also not accuse people of being Russian trolls just because you disagree with them. I think that’s one of the biggest impacts of Russian disinformation is that we don’t trust each other anymore and it’s really dangerous and it’s a lasting impact.”


On what people can do to combat Russian trolls

Warren: “I think it’s important to realize that when you share something on social media, you’re doing two things. You’re sharing a message, but you’re also bringing prominence to the account you’re sharing. And so the question you should be asking yourself often on social media, in addition to the obvious question that we all start with, which is: Is this real or not? The next question you should be asking yourself is, why am I seeing this? Algorithms kind of rule our lives on social media. And what these guys are trying to do is get people who shouldn’t be central to the conversation to become more central to the conversation due to their gaming of the algorithm.”

Linvill: “And we found that possibly the biggest impact that Russian disinformation had in the run-up to the 2016 election was not the words they were using, but in making real human users, making those accounts more prominent, because when the Russians retweeted accounts, those accounts became more active.”

Warren: “And that’s particularly important because Twitter has gone and they shut down 3,500 accounts that they identified as emanating from St. Petersburg. But those accounts that they managed to make more prominent remain more prominent and active than they would have been if the Russians hadn’t intervened. I mean, there’s literally nothing that platforms could do to go and undo that.”


On whether Special Counsel Robert Mueller’s indictment of the Russian Internet Research Agency slowed the spread of disinformation online

Linvill: “It is my understanding that the Internet Research Agency is no longer an official entity. It’s been folded into Russia’s Federal News Agency and it still churns along, but it’s worse than that. We’ve seen data released by Facebook recently that suggests that Iran is following the Russian playbook very closely. And Patrick [Warren] and I like to distinguish between defensive and offensive disinformation. And previously a lot of these other countries mostly engaged in defensive — talking about things that were important to those countries. So Saudi Arabia had a botnet that tweeted about Jamal Khashoggi and how he wasn’t killed by the Saudi Arabians. But you see both Iran and China getting involved in more offensive disinformation, which is messing with attitudes here in the United States about the things that we think are important.”


Chris Bentley produced and edited this interview for broadcast with Ciku Theuri and Todd Mundt. Allison Hagan adapted it for the web.

This article was originally published on WBUR.org.

Copyright 2019 NPR. To see more, visit https://www.npr.org.