Clemson Profs Unveil Russian Trolls’ Information War, Tweet By Tweet

A year ago, over beers, a pair of young instructors began a project that opened a window on a campaign against America that continues unabated


PHOTOGRAPHS BY LOGAN CYRUS

DARREN LINVILL AND PATRICK WARREN ALREADY UNDERSTOOD, to a degree matched by few in their country, that Russia was weaponizing Twitter in an ongoing campaign of information warfare against the United States. From behind computer monitors in a repurposed classroom at Clemson University, they had spent nearly a year unlocking, compiling, and analyzing the evidence, nearly two million tweets fired off by hundreds of paid operatives in an office building nearly 5,000 miles away. But in October, the campaign targeted a woman in New York, and the two professors felt a shiver, a fresh fear they hadn’t yet acknowledged.

On October 10, a 53-year-old white woman in New York named Teresa Klein called police and falsely accused a nine-year-old black boy of sexually assaulting her outside a deli in Brooklyn. A bystander recorded video, posted it to Facebook, and dubbed Klein “Cornerstore Caroline.” It was another in a series of nickname-generating incidents—“BBQ Becky,” “ID Adam,” and Charlotte’s own “SouthPark Susan”—in which white people were caught on camera calling law enforcement on racial minorities who had done nothing wrong. The video was viewed more than four million times in its first three days on Facebook.

From Clemson, Linvill and Warren monitored the Twitter activity of accounts that belong to social media saboteurs working from a nondescript office building in Saint Petersburg, Russia, home to a now-infamous “troll factory” called the Internet Research Agency. Since 2013, the IRA has used the web and social media platforms to disseminate pro-Russia propaganda and act on behalf of the Russian government and business interests. American intelligence analysts believe the operation is closely tied to Russian President Vladimir Putin and the Russian military and, starting in late 2015, worked to influence the U.S. presidential election on behalf of Donald Trump. A federal grand jury in Washington, D.C., indicted the IRA, two other entities, and 13 Russians in early 2018 on charges that they conspired to “defraud the United States ... for the purpose of interfering with the U.S. political and electoral processes, including the presidential election of 2016.”

In late 2017, the Clemson professors began to unlock much of the IRA campaign’s Twitter archive, and they continue to monitor accounts linked to the agency. So when the “Cornerstore Caroline” story hit The New York Times and other major media outlets in October, Linvill and Warren checked in with those accounts and discovered something new and unsettling: They were posting Klein’s personal information, including address and phone number—a malevolent online trolling practice known as “doxing.” The doxing tweets sometimes came with the casually menacing suggestion, “Twitter, do your thing.”

Neither man was eager to defend Klein, who clearly was in the wrong when she called the cops on a nine-year-old who had accidentally brushed against her. But that wasn’t the point. “That was the most recent example of something we had that made me take a step back and rethink what it is we’re working with,” Linvill tells me in late October. “Because it’s personal. They’re reaching out and touching individual Americans’ lives in a negative way.”

It’s easier to distance yourself emotionally from a campaign that “seeks to influence the public” or “interferes with the electoral process in the United States.” Those terms keep matters abstract enough for some comfort. But to see Russian trolls take aim at one person, regardless of what she did, brings the fear home. People who get “doxed” lose the protection of privacy and anonymity; suddenly, strangers everywhere know who they are, where they live, how to contact them, what they look like. One or more Internet trolls in an office in Russia, or the people supervising them, decided it would be fun or useful or disruptive to disclose Klein’s personal information to the world, with no apparent thought of or care for the consequences. If they could do that to her, they could do it to just about anyone.

LOGAN CYRUS

From the Social Media Listening Center at Clemson University, professors Darren Linvill (left) and Patrick Warren (right) compiled and analyzed nearly 2 million tweets from Russian troll accounts.

“I actually said the words, ‘The trolls are going to be so excited if they manage to get someone hurt or killed,’” Warren tells me. Linvill estimated afterward that as much as a quarter of the total Twitter activity about “Cornerstore Caroline” on October 11—hundreds of tweets and retweets—originated with trolls employed by the IRA.

The aim? “To encourage violence,” Warren says. “For sure.”

***

IN ITS 13 YEARS, Twitter has transformed the way people around the world communicate and share information—and disinformation. Anyone with a computer or smartphone and a valid email address can use it for free. That’s its blessing and curse.

It took less than a decade for authoritarian governments to begin using it and other social media as tools for spreading propaganda. In 2012, the Russian government began paying people, many in their 20s, to post pro-Kremlin tweets as part of their normal Twitter use. In 2013, a man with close ties to Putin bankrolled the founding of the Internet Research Agency in Saint Petersburg, according to a 2017 report by U.S. intelligence agencies. The IRA began hiring Russian citizens to post pro-Kremlin messages. The operation ramped up during the Russian invasion of Crimea, in neighboring Ukraine, in 2014. By the next year, the IRA had roughly 1,000 employees. Toward the end of 2015, according to the intelligence analysis, they began to tweet with a new focus: boosting Trump’s presidential campaign while disparaging Hillary Clinton’s—one wing of a Putin-backed operation to disrupt American public life.

The tweets Linvill and Warren have compiled, until recently the largest data set of Russian troll activity in existence, serve as a chronicle of that operation. Neither professor can be sure they’ve found them all. But they have collected and analyzed roughly two million tweets that the IRA produced from June 19, 2015, three days after Trump announced his candidacy for president, to the end of 2017. They continue to identify and analyze tweets from accounts they’re “99 percent sure,” Linvill says, come from IRA employees who easily dodged Twitter’s disabling of their original accounts by simply creating new ones with different names. The two men have shared their cache with members of Congress and multiple news organizations.

During a September hearing in Washington, U.S. Senator Susan Collins of Maine, a Republican member of the Senate Intelligence Committee, told Twitter founder and CEO Jack Dorsey, “I’ve learned, not from Twitter but from Clemson University, that I was one of those targeted leaders … So why doesn’t Twitter notify individuals like me that have been targeted by foreign adversaries?”

“I agree it’s unacceptable,” Dorsey replied.

The scale of the trolling operation continues to stun even the professors. “One former employee of the IRA described the feeling of working there as though ‘you were in some kind of factory that turned lying, telling untruths, into an industrial assembly line,’” Linvill and Warren wrote in July in the first formal paper based on their Russia research, which as of November had not been published. “The IRA is engaging in what is not simply political warfare, but industrialized political warfare.”

Neither Linvill nor Warren thought they’d ever stumble across something so momentous. “I fully expected nothing to come of it,” Linvill tells me. They’re careful, too, to emphasize what their data trove doesn’t show. To date, they’ve examined only Twitter, not other platforms such as Facebook, Instagram, or Reddit. The data don’t prove conclusively that Trump won the presidency because of the Russian trolling operation, or Russian assistance in general. There’s no way to know precisely how the volume of tweets, or any one tweet, turned a Clinton or Bernie Sanders voter into a Trump voter, or induced a Clinton or Sanders voter not to vote at all. They can’t say for certain what, if any, effect the campaign had on the 2018 midterms.

What it does show, in detail, is a critical piece of a deliberate effort by an adversarial government to exploit distrust among the American public—to manipulate Americans to see their fellow citizens as enemies, to overwhelm them with so much disinformation that they have trouble distinguishing fact from fiction, or they accept fiction as fact. It appears to have succeeded. Absent action by government or social media platforms, it likely will continue to.

“While there are obvious political implications to this, we still strongly feel that it’s not about the (2016) election. This is an ongoing issue,” Warren says. “The reason we won the Cold War is that we worked really well internally and with our NATO partners, and Russia didn’t have any partners. They want fighting. They want fighting inside the United States; they want fighting between the U.S. and other NATO partners. That’s what it’s about.”

***

I DRIVE TO MEET the two of them at Clemson on a sweltering morning in late August, the Monday of the week classes begin. The campus of more than 20,000 students buzzes with the hauling of boxes, the lugging of book bags, the drone of cicadas, and the fruitless circling of drivers looking for somewhere to park. I finally find a spot in the student union lot.

Linvill, a dark-haired, bespectacled 42-year-old, walks over from his office to rescue me. He’s wearing a checkered orange-and-white Polo shirt—the second-ranked football team’s season opener is only 12 days away. “I’m a second-generation Clemson professor,” he remarks. “That’s why the orange.”

It takes about five minutes to walk to Daniel Hall, which houses the university’s Social Media Listening Center. Linvill runs down how various elected officials and media outlets ranging from Wired to the Washington Post have broken off pieces of their research to, for instance, track how often the IRA trolls have tweeted about events in a particular state. “The thing about our data is, there’s a million little stories,” he says as we approach Daniel Hall, a hulking, 50-year-old brick structure that, appropriately enough, resembles the kind of boxy Stalinist monolith common throughout Russia, a gloomy remnant from Soviet days.

The hallways at Daniel are almost as utilitarian and dismal: beige brick walls, dim lighting, a conspicuous shortage of garbage cans. “That’s what you get when you work at a state school,” Linvill says. We reach the Listening Center, little more than a refurbished classroom with a handful of computers and monitors on tables against the walls, where Warren waits. 

Warren is 39 but looks like a graduate student, with his slight physique, wire-rimmed glasses, tousled thatch of hair, and cargo shorts. Linvill, an associate communications professor, is the larger and more voluble of the two; Warren, an associate professor of economics, is more reserved, and his soft voice betrays his upstate South Carolina roots.

They’re both local boys. Linvill’s father, Dale, an agricultural meteorologist, worked at Clemson for 24 years until his retirement in 2004 and still serves as a professor emeritus. Linvill earned bachelor’s and master’s degrees at Wake Forest before returning to Clemson for his Ph.D. in education. Warren grew up in Walhalla, about a half-hour’s drive northwest of Clemson, and earned an undergraduate degree at the University of South Carolina (“We don’t talk about that,” Linvill jokes) before earning a doctorate in economics from MIT. His specialty is the economics of organizations—how entities as diverse as corporations, nonprofits, and political parties affect how national economies operate, or don’t. He decided to concentrate on that field after the U.S. government’s botched response to Hurricane Katrina in 2005. “Why was it so terrible? That’s what I was trying to understand,” he tells me at the Listening Center.

The demeanor of the two academics belies the gravity of the research they’ve done. They’re friends as well as colleagues. They joke easily with each other as they recline at their work stations and snack on chips. Both men are married with young children and have limited opportunities for socializing. They make it a point to carve out time for a biweekly game night at the home of Dale Linvill, where Darren Linvill, Warren, and some other youngish Clemson profs lug beer down to the basement and play board games to unwind. It was during one of those game nights, over the semester break for the 2017 holiday season, that Linvill and Warren decided to dive into the lake of Russian troll accounts.

***

THEY WERE PLAYING Pandemic, in which players have to band together to find cures for diseases sweeping the globe. It’s an unusual game that emphasizes cooperation more than competition.

In November 2017, the Democratic minority of the U.S. House Intelligence Committee had introduced into the Congressional record a list of 2,752 accounts that Twitter identified as connected to the IRA. During breaks in battling one contagion or another, Linvill and Warren started chatting casually about the release. Linvill was drinking Pabst Blue Ribbon, Warren an undisclosed craft beer. The list of accounts, Warren says, “seemed intriguing. We started thinking about, ‘Well, what can we do with those?’ We had a sense of what (the Russians) were trying to do, but like really dig into what they were trying to do.”

The executives of social media platforms, such as Jack Dorsey, “were going before Congress to testify,” Linvill says. “It was the issue of our day.”

Their expectations were low, even after Twitter in January 2018 added another round of IRA-related accounts to the Congressional list, bringing the total to 3,814. They did have a potent tool in the Social Media Listening Center and its Social Studio software, an advanced program primarily used to track business clients’ social media activity. Clemson got its hands on the software through a personal connection—Jim Bottum, a longtime engineering professor, knew Dell Technologies founder and CEO Michael Dell. 

During a 2011 visit to Dell headquarters, Bottum started talking with Dell about the university’s new social media lab, which Bottum thought would benefit from a customer relationship management program Dell was using. The Listening Center opened in 2012 and began using the software, Social Studio’s predecessor. It allowed students and professors to track social media mentions across platforms and analyze keywords, peak times, and chain retweets with a depth and detail almost unheard of at a university. Its use wasn’t confined to business-client relationships. It could examine other kinds of social media activity just as thoroughly.

“We let the faculty and students go to town, so to speak,” says Joe Mazer, the communication department chair. “We were taking a big risk at the time, because we did not know the future of social media.”

LOGAN CYRUS

Clemson’s Social Media Listening Center is unusually advanced for a state university. Here, Amanda Moore, the center’s associate director, works on a Saturday afternoon in fall. 

Even with Social Studio, though, Linvill and Warren had a problem: Twitter had suspended almost all of the IRA accounts, which meant they couldn’t search by account name, or “handle,” and hope to collect anything approaching the full scale of the operation. One or two IRA handles did slip through the net, though, popping up in Social Studio when the professors performed a straight search. “The fact that some of the suspended accounts started showing up—and initially it wasn’t all of them—was honestly a surprise,” Linvill says. “And to get all of them, we had to do a lot of ...”

“Tricks,” Warren offers.

“Tricks,” Linvill confirms.

The “tricks” allowed them to cast a larger net. Instead of searching by the handles of suspended accounts, they discovered that a keyword search that used the term “tweet from,” followed by the handle, “yielded all tweets from a given account,” they wrote in their July paper. Other searches returned “clutter” from non-IRA accounts. For example, for a confirmed troll who used the handle @Brooooke, they began querying for tweets that contained “Brooooke,” without the @ symbol. That brought up all the tweets from @Brooooke but also from the non-IRA accounts @_Brooooke_, @_Brooooke_X_, and eight other variations. (In light moments, Linvill and Warren playfully pronounce the handle name as if they’re cattle mooing in a barnyard.) The search method allowed Linvill and Warren to weed out irrelevant accounts and keep what evolved into a treasure chest of Russian troll tweets, in numbers they never could have foreseen.
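To make the method concrete, here is a minimal sketch, in Python, of the kind of post-filtering the professors describe: a broad keyword query sweeps in look-alike handles, and the results are then trimmed to the one confirmed IRA account. The file name and the “author” column are hypothetical stand-ins, not Social Studio’s actual export format.

```python
# A minimal sketch (not the professors' actual code) of trimming a broad
# keyword export down to a single confirmed IRA handle. A query for
# "Brooooke" without the @ symbol also returns tweets from look-alike
# accounts such as @_Brooooke_; this keeps only exact matches.
# The file name and the "author" column are hypothetical.
import csv

CONFIRMED_HANDLE = "Brooooke"  # the confirmed troll account, minus the @

def keep_confirmed_only(rows, handle):
    """Keep rows whose author matches the confirmed handle exactly."""
    target = handle.lstrip("@").lower()
    return [row for row in rows if row["author"].lstrip("@").lower() == target]

with open("broad_keyword_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

confirmed = keep_confirmed_only(rows, CONFIRMED_HANDLE)
print(f"kept {len(confirmed)} of {len(rows)} tweets for @{CONFIRMED_HANDLE}")
```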

The disadvantage: They had to toss a lot of junk over the side to get to the treasure. It took about two months to get the list in shape, eliminating 172,812 tweets in the process, plus another 828,219 from non-English-language accounts. That left 1,875,029 troll tweets from 1,311 handles confirmed as IRA accounts. Then they began coding each tweet, one by one, into one of five categories: Right Troll, Left Troll, News Feed, Hashtag Gamer, and Fearmonger. The last of the five spread baseless conspiracy theories, such as a hoax about tainted turkeys from Walmart at Thanksgiving 2015. Hashtag Gamers tweeted harmless, non-political hashtags, such as “#ThingsILearnedFromCartoons,” that served as engagement bait for other users. News Feed tweeters generally posed as news aggregators, using handles such as the fictional @OnlineMemphis, and linked to legitimate local and regional news stories from actual news outlets, presumably in an attempt to build credibility.
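As a rough illustration of the bookkeeping that follows the cleanup, the sketch below tallies tweets and handles by category from a coded export. The “handle,” “language,” and “category” columns are assumptions for the example; only the five category labels come from the professors’ scheme.

```python
# Illustrative tally of a cleaned, hand-coded export: count tweets and
# distinct handles per category, skipping non-English rows. The file and
# column names are hypothetical.
import csv
from collections import Counter

CATEGORIES = {"Right Troll", "Left Troll", "News Feed", "Hashtag Gamer", "Fearmonger"}

tweets_per_category = Counter()
handles_per_category = {c: set() for c in CATEGORIES}

with open("ira_tweets_coded.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["language"] != "English":
            continue  # non-English accounts were set aside
        cat = row["category"]
        if cat in CATEGORIES:
            tweets_per_category[cat] += 1
            handles_per_category[cat].add(row["handle"])

for cat, n in tweets_per_category.most_common():
    print(f"{cat}: {n} tweets from {len(handles_per_category[cat])} handles")
```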

The most numerous were the Right and Left Trolls, especially those on the right, which mined popular right-wing conspiracy theories and tended to speak of Trump in glowing, messianic terms. (A pair of typical post-election entries from one @10_GOP: “All of the good guys I know wanted us out of Iran deal and all of the globalists wanted us in it. Look at whose (sic) mad today - deep state.” “@realDonaldTrump We need to end ObamaCare as soon as possible before it kills millions.”) The less numerous Left Troll accounts—230 handles and 405,549 tweets, compared to 617 handles and 663,740 tweets from the Right Trolls—mainly posed as Black Lives Matter or LGBTQ activists. “When you have been handcuffed for no good reason, all you can think about is how not to get shot,” tweeted someone using the handle @blacktivists in May 2016. “Never trust a cop.” The Left Trolls invariably castigated Hillary Clinton for alleged crimes and inconsistent support of minorities—reflecting themes from opponents of her 2016 presidential campaign.

Linvill and Warren use the categories only as an organizing framework. The primary goal of the IRA trolls wasn’t necessarily to get Trump elected president—although Putin himself has admitted he prefers Trump to Clinton, whom he detests. The real goal was to induce crippling dysfunction in the American political system and among the American people.

I catch up with Linvill by phone in early October, in the midst of the national furor over Brett Kavanaugh’s confirmation to the U.S. Supreme Court, which IRA Twitter accounts have been using as trolling fuel for the previous two weeks. You could argue that Twitter would be on fire anyway over Kavanaugh, even if it was Russian troll-free. “But with the Russians’ help, we’re fighting just a little bit harder about Kavanaugh, and that is absolutely in Putin’s best interest,” Linvill says. “Because what have we as a country accomplished in the last two weeks? Absolutely nothing. Other issues? Completely ignored. So it doesn’t matter whether they’re pushing it from the left or the right. They’re pushing the division.”

***

BEFORE AND AFTER Trump’s election, mainstream media outlets and American officials generally described Russian efforts to steer public opinion his way as “meddling” or “interfering” with the electoral process, as if the IRA’s operation were a large-scale fraternity prank. American intelligence agencies and the Clemson professors view those terms as euphemisms that obscure the operation’s significance. “When a country can come interfere in another country’s elections,” former South Carolina Governor and then-Ambassador to the United Nations Nikki Haley said in 2017, “that is warfare.” More specifically, it’s a new front, enabled by social media, in what foreign policy experts and historians have come to call “hybrid warfare.” Putin is its most ardent practitioner.

The term refers to a combination of tools and tactics meant to undermine an enemy, from conventional actions by armed troops to propaganda and disinformation campaigns. Disinformation aimed at the enemy is nothing new in war. But social media platforms such as Twitter allow warring states to spread it en masse instantaneously and at little cost compared to the investment required for troops and gear on the ground. Hashtagging and retweeting by users oblivious to the campaign amplify the messages. In 2014, when Russia annexed Crimea, it rolled out a calculated operation designed to confuse Russians and Ukrainians alike about whether Russia was even behind the invasion and, simultaneously, why the invasion was justified. Russian Twitter trolls and Kremlin-funded news outlets such as RT spread the fiction that Ukrainian leaders were uniformly fascist. They manufactured false stories, such as one widely shared account of the public crucifixion of a three-year-old boy by the Ukrainian military. No evidence has ever emerged that this happened.

“Sometimes,” Warren offers, “they’re just throwing spaghetti at the wall to see what sticks.” A term has emerged among academics and journalists for the accumulated layers of spaghetti: “reverse censorship.” Rather than silence or close legitimate sources of information, reverse censors try to drown them in a flood of noise that leaves consumers perplexed and apathetic. Demolishing the public distinction between truth and falsehood, fact and fiction, is a signature goal of totalitarian states. “The aim is to confuse rather than convince,” Soviet-born journalist Peter Pomerantsev, author of a 2014 account of Russia under Putin titled Nothing Is True and Everything Is Possible, has written, “to trash the information space so the audience gives up looking for any truth amid the chaos.”

***

LOGAN CYRUS

Patrick Warren (left) and Darren Linvill used business software to unlock the Russian troll tweets from accounts that Twitter had identified as such, then suspended. 

THROUGHOUT 2016, social media in the United States erupted with #BlackLivesMatter and related content. It spiked several times that year when, in separate incidents, police shot and killed black people. The Keith Lamont Scott shooting in Charlotte on September 20 was another case ripe for exploitation, and IRA trolls pounced on it as soon as the news broke. “BREAKING,” reads one tweet from that night. “12 police officers injured as ‘peaceful’ protests erupt in Charlotte!!!” The federal indictment in early 2018 identifies the user as a Russian troll.

Another federal indictment in July makes plain that the IRA wasn’t the only Russian organization playing on the #BlackLivesMatter field. That indictment accused 12 officers of the GRU, Russia’s military intelligence agency, of conspiring in 2016 to hack the email accounts and computer networks of the Clinton campaign and Democratic Party, steal documents, and release them through WikiLeaks. To hide their identities, the grand jury alleges, the GRU officers used false online identities, including the now-famous “Guccifer 2.0” and the Twitter handle @dcleaks_, which had a matching website. A lesser-known handle—which the GRU allegedly used on the same computer from which it tweeted as @dcleaks_—was @BaltimoreIsWhr, a fake Black Lives Matter account whose user posed as a “woke” black activist in Baltimore.

The GRU revelations compelled Linvill and Warren to open a new wing in their research project. “OK,” Warren says at the Social Media Listening Center in August, four weeks after the indictment. He, Linvill, and I gather around a monitor in a semicircle. “Here’s a good search that other people can’t do.” He opens Twitter and enters “@BaltimoreIsWhr” into the search field. The search yields the “Account Suspended” page. He opens Social Studio. “So we’re going to add a keyword—this is gonna be clear, what I’m doing—‘tweet from @BaltimoreIsWhr.’ Save. When was it, Darren, do you remember?”

“It was like, uh, two months before the election, wasn’t it?”

“Oh, it was 2016?”

“Yeah,” Linvill says, “’cause it was tweeting at the same time as @dcleaks_.”

Warren turns back to the screen and revises his search. Reporters from media outlets throughout the nation have gladly tapped this resource-by-proxy. “So this, what you’re about to see,” Linvill says, “is why we’re on the short list of a number of journalists’ email accounts.”

Through Social Studio, Warren has pulled up the 14 tweets that “@BaltimoreIsWhr” produced between September 1 and November 8, 2016, Election Day. “So we can see what this guy was talking about,” Warren says, scrolling down. “He was talking about police being violent toward black people.” The account mimics others that commented in volume after Freddie Gray died in Baltimore Police custody in 2015. The next summer, before the Scott shooting in Charlotte, police shot and killed a pair of unarmed black men, Philando Castile in Minnesota and Alton Sterling in Louisiana, touching off widely publicized protests.

The cases were fissures in American public life that could widen with the application of a well-placed, well-timed wedge. The alleged GRU agent behind @BaltimoreIsWhr tweeted a link to a Huffington Post story about the September 16 police killing of Terence Crutcher in Tulsa—four days before the Scott shooting. The user, whoever he or she was, had an apparent fondness for HuffPo as a source of news. The account linked to another 2016 story, this one about Colin Powell objecting to Clinton’s statement to the FBI that he had advised her to use a private email server. This comment accompanies the link: “Are u still thinking she cares about us?? Mr.Powell (sic) just said the truth while Clinton attempting to scapegoat him.”

In the upper-right corner of the screen, another function of the software generates a word cloud of the handle’s most commonly used keywords, sized by frequency. The @BaltimoreIsWhr cloud is a swirl of the words “Clinton,” “mistreatment,” “Ferguson,” “criminal,” and “truth,” along with the hashtag #ftp, which stands for “Fuck tha Police.” Like a swarm of hornets, they surround one giant word in the center: “black.” Another search for tweets from the same account earlier in the year brings up invitations to rallies and the handles @BlacksForBernie and @AAsForBernie—in line with the presumed Russian goal to subvert Clinton’s electoral standing among black voters by, among other methods, posing as black activists and championing the candidacy of Bernie Sanders.
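The word cloud is, at bottom, a keyword frequency count. Here is a small illustrative sketch of that idea; the sample tweets are stand-ins (one borrowed from the quote above), not real @BaltimoreIsWhr data, and the stop-word list is deliberately tiny.

```python
# Illustrative only: count the most common words and hashtags across a set of
# tweets, the raw material behind a word cloud. The sample tweets are
# stand-ins, not real @BaltimoreIsWhr data.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are", "u", "about"}

def top_keywords(tweets, n=10):
    words = []
    for text in tweets:
        for token in re.findall(r"[#@]?\w+", text.lower()):
            if token not in STOP_WORDS:
                words.append(token)
    return Counter(words).most_common(n)

sample_tweets = [
    "Are u still thinking she cares about us?? #ftp",
    "Clinton ignores the mistreatment of black communities. #ftp",
]
print(top_keywords(sample_tweets))
```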

“We didn’t know any GRU accounts,” Linvill remarks, eyeing the screen. “This is just, like, a month and a half ago when we started looking at this guy.”

And fresh IRA accounts sprout from the soil all the time, disguising themselves. Linvill refers me to the now-suspended account @JessForevrYoung, whose user was posing as a youngish black woman who tweeted often about her family and admiration for prominent black women. On October 6, 2018, she sent a tweet with an innocuous-looking triptych of Michelle Obama’s high school, college, and law school graduation photos. “Daily reminder that Michelle Obama is one of the most academically accomplished First Ladies,” the tweet begins. Two days later, it had 16 retweets and 41 likes. “That’s got the fingerprints of the IRA all over it,” Linvill says. “A version of that appears dozens of times in our data set.”

What? Why in the world would a Russian troll tweet anodyne praise of Michelle Obama? Or is it just bait—a lure for users in a vulnerable demographic to follow into the cave of Russian disinformation?

“Yeah, yeah. That’s exactly what it is,” Linvill tells me. “If you go through (@JessForevrYoung’s archive), the majority of her tweets are ... nice. I mean, there’s pictures of babies and uplifting moments in black history.” It’s a demonstration of what makes the operation so hard to combat. As it grows, and as time passes, the IRA gets better at what it does. Its disinformation streams multiply and flow into more channels, reaching more and more people—the vast majority of whom can’t tell at a glance what’s real and what isn’t, what’s the authentic voice of the people and what’s a cadre of impersonators working for the government of a foreign state that wants theirs to collapse. The Trojans at first thought the wooden horse was a gift.

***

ANYONE HOPING for an aggressive response to the Russian troll campaign, by either social media platforms or the U.S. government, has encountered only disappointment. A year ago, Twitter’s public policy arm notified approximately 1.4 million American users who engaged—through responses, likes, retweets, or quotes—with confirmed IRA accounts, or were following IRA accounts at the time Twitter suspended them. In its release, as an example of an IRA tweet that “received significant engagement,” Twitter included a tweet by “@Crystal1Johnson” that read, “Cops have killed 68 people in 22 days since #Kaepernick started protesting. 68 in 22 days… have no words #KeithLamontScott.” The tweet was posted September 21, 2016, the day after the Scott shooting.

But the suspensions haven’t stopped the disinformation campaign. “It is clear,” company officials wrote in an October blog post, “that information operations and coordinated inauthentic behavior will not cease.” (Twitter’s press office did not respond to emailed interview requests for this article.)

“There is no doubt that Russia undertook an unprecedented effort to interfere with our 2016 elections,” U.S. Senator Richard Burr of North Carolina, who chairs the Senate Intelligence Committee, stated publicly in May. But the U.S. government has yet to mount a coordinated response, despite the occasional statement, such as Nikki Haley’s, that at least acknowledges the threat. Special Counsel Robert Mueller’s office took the IRA and GRU cases to grand juries for indictment, and the Treasury Department this year sanctioned the IRA, four other organizations, and 19 individuals for multiple cyberattacks and for interfering with the 2016 election. Still, there’s little sign of any action by the government to prevent further interference, and it remains unclear what it could do without violating the First Amendment. A spokesman for Burr declined to make the senator available for an interview, citing the Intelligence Committee’s ongoing investigation of possible collusion between Russia and the Trump campaign. The spokesman did say the committee will, as part of a report from its investigation, address “the use of social media by foreign influence campaigns.”

Linvill and Warren continue their work, in between their usual class loads. One of the difficulties of tracking social media in academia is the hare-and-tortoise relationship between the two: Social media move instantaneously; academia moves glacially. Each new domestic crisis—and at least one seems to land on the American public every day—requires a new dive into the troll accounts, just to see what they’re saying and how. They’ve analyzed only that first batch of 1.9 million tweets, a robust sample. But they’ve at least glanced at far more, as many as 9 million, with more coming, as they pursue grant money for further research and try to get academic journals to publish their work.

They, like everyone else, can’t pinpoint how governments, companies, and individuals can fight off the risk that trolls pose. Twitter and Facebook, after all, have been around for only a little more than a decade. The professors sense that people and institutions may have to rethink their relationships to the Internet and decide whether those relationships are worth this kind of risk. Warren wonders about the wisdom of Twitter and other platforms allowing users to set up accounts in a few minutes with nothing more than active email addresses. Any comprehensive response to the Russian trolling would have to unite private and public institutions in the United States, which might not be possible. “It’s very much a multilateral attack. I think it needs a multilateral response,” he says. “Is there a NATO group on combating fake news? I don’t think there is. I think there probably should be.”

Academics have a role to play, too, Linvill observes. He teaches a class in political communication, and he happily escorts his and Warren’s research into the classroom. “My students were getting sick of me talking about Russians last semester,” he says in the Listening Center.

“So are your friends,” says Warren.

“So is my wife,” Linvill shoots back.

They still have as much fun as they can with the project they kicked off over beers on game night. But they find themselves checking their email and Twitter accounts more closely, worried about hackers; wouldn’t they be a heck of a prize? Linvill laments that he’s always wanted to visit Saint Petersburg, but he’ll never get to go now, “not while Putin’s alive.” He wants his students, all of them more technically proficient with the tools of social media than he, to grasp what their misuse might mean—for the world, for their country, for themselves. The outlook inspires more dread than hope.

“It’s obvious: This is the nature of political communication now,” Linvill says. “It’s not just happening with the Russians engaging in our country. We’ve heard from several journalists who wanted our input on concerns of it happening in regional elections. We heard from a guy in Arizona who sent us what he suspected was a bot network trying to influence the (U.S.) Senatorial race there.

“So this isn’t just external in the future. It’s going to be an internal problem. How do you deal with it when campaigns and PACs are starting to use it domestically? It is the nature and future of American politics, whether we like it or not.”

Greg Lacour is the senior editor for this magazine. 
