Where Does Fake News Come From?

Kate Starbird’s research on social media in the wake of disasters uncovers a web of disinformation.

September 2017


Former Cardinal basketball star Kate Starbird’s research shows how websites amplify fake news. (Photo: Tom Reese)

@veteranstoday: Orlando nightclub shooting: Yet another false flag? — http://www.veteranstoday.com/2016/06/12/orlando/ looks like another PR extravaganza

This tweet links to a story claiming that law enforcement fabricated the report of a Muslim security guard who killed 49 people in June 2016 at a gay nightclub in Orlando, Fla. The slayings were more likely carried out by professional assassins hired by the FBI, suggested the story, which was published on the website Veterans Today. The supposed motive: Muhammad Ali had just died, and authorities needed to counteract the positive PR his death had engendered for Islam.

“It made my hair stand up,” says Kate Starbird, ’97, an assistant professor of human-centered design and engineering at the University of Washington.

As part of her research on how people use social media in the wake of disasters, Starbird is studying tweets like this one in which rumormongers claim an event is a “false flag” — a covert operation by authorities that frames someone else — as well as tweets that deny the event ever happened.

Starbird’s research is unraveling how insurgents, hustlers and foreign agents come together online to upend the nation’s political discourse. Using the trappings of traditional journalism, they advance “alternative narratives” to mainstream news reports once almost universally accepted as credible. Starbird’s studies reveal an online network that nurtures fake news: an array of websites that repurpose the same content and amplify one another’s messaging, creating a misimpression among readers that multiple sources are reporting a story. It is, essentially, a hothouse for false claims.

“The tweets provide a window into this quirky conspiratorial ecosystem,” Starbird says, rattling off some of the perceived threats she has cataloged from websites across the political spectrum, including the Rothschild family, vaccines, chemtrails and the Illuminati. Two ideological threads stand out: The sites are, she says, “anticorporate media and, of course, antiglobalist.” Essentially, they promulgate beliefs that a small coterie of powerful people “are manipulating the media and world events to their benefit.”

Starbird’s work shows how these ideas are spread by grassroots true believers and by profiteers; by gangs of Twitter bots, cloaked in usernames, programmed to disinform; and by foreign operatives exploiting this echo chamber to advance their own clandestine political agendas.

“We have evidence that some Russian sites are doing it, and I’ve got some evidence that there are some Iranian sites in this,” she says. “One big question is, how much of it is just emergent — people doing their own thing, interacting with the information — and how much is intentional orchestrated disinformation?”

By discovering the sources of fake news and charting how it spreads, Starbird hopes to enable the development of tools that will help people know where their information is coming from.

When I asked Starbird via email if she could meet in person to go over a particularly complicated multidimensional chart from her research that lays out the relationships among the conspiracy theory websites, she replied with enthusiasm. “I’d love to show you some of the data,” she wrote. “That’s one of my favorite things to do! (Seriously).”

She comes to the meeting in a black Star Wars T-shirt. “I think my nephew gave it to me,” she says a little sheepishly. She adds that she doesn’t think a lot about clothes, or Star Wars for that matter. The former Cardinal basketball standout is as tall and lean as when she won the Naismith College Player of the Year award as a Stanford senior, and still looks like she could glide past any opponent and score. She carries herself with a quiet confidence — an “inner peace” that coach Tara VanDerveer once identified as Starbird’s most remarkable characteristic.

While her work has been featured this year by the Washington Post, the BBC and the Seattle Times, Starbird does not relish the limelight. The daughter of a colonel, she grew up on and around Army bases in five states. She started programming computers at age 9 and was a diehard computer science major in college.

After Stanford, she played basketball professionally for nine years. Toward the end of her basketball career, she fell in love with a graduate student in social work, Melissa Marsh, whom she married in 2008. The two of them moved to Boulder, where Starbird enrolled in the Technology, Media and Society Program at the University of Colorado.

Her PhD adviser was Leysia Palen, now the chair of the department of information science, who was pioneering the new field of crisis informatics. Starbird began using her computer science skills to analyze social media during disasters.

“She wasn’t afraid to jump in,” says Palen. “She has that right combination of self-questioning — she didn’t assume that she was right — and boldness in that she didn’t fear to be wrong.”

By the time Starbird joined the UW faculty in 2012, she had studied such disasters as the Red River floods in North Dakota, the earthquake in Haiti and the Deepwater Horizon oil spill in the Gulf of Mexico, examining how those within and outside the affected regions shared information and created Twitter support networks. She crunched data — the number of tweets and retweets, the geographic origin of posts, activity over time — but her methods went beyond the quantitative, says Palen, citing Starbird’s study of a band of volunteers around the world who connected through Twitter to provide help to Haiti.

“You have hundreds of millions of people shouting to the world about the earthquake, and she saw digital volunteers,” says Palen. Starbird identified them in part through the massive data set she compiled but also by reading thousands of tweets. She then interviewed select individuals to understand how they had voluntarily come together to offer assistance. She was “treating data ethnographically,” says Palen.

Starbird showed how “voluntweeters” emerged after other events, and she considered what measures could enhance their impact. She created “Tweak the Tweet,” which allows critical disaster-related information to be conveyed in standardized form, with hashtags, keywords and other syntax. The tool makes it easier to turn tweets into spreadsheets and maps that help determine what aid is needed where. The system has been deployed for more than 40 events, and Starbird is investigating how to encourage its adoption without her involvement.
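To make the idea concrete, here is a minimal sketch, in Python, of the kind of hashtag-based parsing that a system like “Tweak the Tweet” relies on. The specific tag names (#need, #loc, #contact) and the field layout are illustrative assumptions for this example, not the project’s exact syntax.

```python
# Sketch: pull tagged fields out of a structured disaster tweet and emit CSV.
# Tag names below are assumed for illustration, not the actual TtT specification.
import csv
import io
import re

TAGS = ("need", "have", "loc", "contact")

def parse_structured_tweet(text):
    """Extract each known #tag's value from a tweet into a dict."""
    record = {}
    for tag in TAGS:
        # Capture text after the tag up to the next known tag or the end of the tweet.
        pattern = rf"#{tag}\s+(.*?)(?=\s+#(?:{'|'.join(TAGS)})\b|$)"
        match = re.search(pattern, text, flags=re.IGNORECASE)
        record[tag] = match.group(1).strip() if match else ""
    return record

def to_csv(records):
    """Turn parsed records into spreadsheet-ready CSV text."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=TAGS)
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

tweet = "#need water and blankets #loc Port-au-Prince, near the airport #contact @reliefgroup"
print(to_csv([parse_structured_tweet(tweet)]))
```

Because the fields arrive already labeled, downstream tools can sort, map and aggregate them without guessing at free-form language, which is what makes the tweets-to-spreadsheets step tractable.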

Her current focus on conspiracy theories and fake news arose from her ongoing work on improving disaster response. “The thread has always been human behavior and information sharing, online, during crisis events,” she says. “So trust in information has always been there.”

By Starbird’s account, the faucet of fake news has been steadily dripping for at least seven years. The cumulative effect is now impossible to miss, with tweets about fake news emerging from the upper echelons of the U.S. government.

Starbird says she should have zeroed in on these practices sooner. She recalls how in 2013, while studying misinformation tweeted about the Boston Marathon bombing, she and her colleagues recorded several thousand tweets claiming that Navy SEALs or Blackwater operatives were behind it, or even that it was an elaborate ruse staged by government-paid actors. These were only a tiny fraction of the total rumor count, so she dismissed them. But then her team noted similar messages after the terrorist attacks in Paris and the shooting at Umpqua Community College in the fall of 2015.

To uncover what was behind these recurring themes, in January 2016 Starbird and her colleagues began gathering tweets from mass shootings. They had not been thinking about the presidential election. But by the time they completed their data collection nine months later, the influence of “fake news” was evident. Starbird realized that the subset of tweets that referenced alternative narratives could offer insights into a fundamental shift in the ideological landscape of the United States.

Starbird and her colleagues sifted through the 58 million tweets they had collected on shootings to identify 77,641 that referred to false flags or other narratives counter to official law enforcement accounts. Starbird’s team then followed URLs in these tweets to the underlying websites, identified their publishers and coded the sites’ perspectives.
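The filtering and domain-extraction step described above can be sketched in a few lines of Python. The keyword list, the input format and the domain counting here are assumptions for illustration; the study’s actual coding of sites was done by the research team, in part by hand.

```python
# Rough sketch: flag tweets that use alternative-narrative language and count
# the web domains they link to. Keywords and data format are illustrative only.
import re
from collections import Counter
from urllib.parse import urlparse

ALT_NARRATIVE_TERMS = re.compile(r"false\s*flag|hoax|crisis\s*actor|staged", re.IGNORECASE)
URL_PATTERN = re.compile(r"https?://\S+")

def alternative_narrative_domains(tweets):
    """Count how often each domain appears in tweets using alternative-narrative terms.

    `tweets` is assumed to be an iterable of dicts with a 'text' field.
    """
    counts = Counter()
    for tweet in tweets:
        text = tweet["text"]
        if not ALT_NARRATIVE_TERMS.search(text):
            continue
        for url in URL_PATTERN.findall(text):
            domain = urlparse(url).netloc.lower().removeprefix("www.")
            if domain:
                counts[domain] += 1
    return counts

sample = [{"text": "Yet another false flag? http://example-site.com/story"}]
print(alternative_narrative_domains(sample))
```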

Starbird presented her findings to acclaim in May at a scholarly conference. At the heart of the study is a “domain network graph,” which looks like an airline’s map of the routes among all its cities. Websites are represented by dozens of circles, or nodes, that vary in size according to the number of mentions in the tweets. Whenever two websites are cited by the same Twitter account, the circles are connected by a line. The chart color-codes websites that affirm alternative narratives for the shootings, those that deny them and those that are used as evidence to support alternative narratives, even though they don’t directly refer to those narratives. The latter two categories include many mainstream media outlets. The first category, on the other hand, is a tangle of lines crisscrossing 80 tightly clustered circles, each representing an outlet that produced stories espousing alternative narratives for shootings as well as other conspiracy theories.
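The co-citation logic behind that graph is simple to state: each domain becomes a node whose size reflects how often it was mentioned, and an edge links two domains whenever the same Twitter account tweeted links to both. The sketch below, using the networkx library, assumes a simplified input of (account, domain) pairs rather than the study’s actual data.

```python
# Sketch of a domain co-citation graph: nodes carry mention counts, and an edge
# connects two domains cited by the same account. Input format is assumed.
from collections import defaultdict
from itertools import combinations

import networkx as nx

def build_domain_graph(mentions):
    """Build a co-citation graph from (account, domain) pairs."""
    graph = nx.Graph()
    domains_by_account = defaultdict(set)

    for account, domain in mentions:
        domains_by_account[account].add(domain)
        # In the published figure, node size scales with the number of mentions.
        if graph.has_node(domain):
            graph.nodes[domain]["mentions"] += 1
        else:
            graph.add_node(domain, mentions=1)

    # Connect every pair of domains cited by the same account.
    for domains in domains_by_account.values():
        for a, b in combinations(sorted(domains), 2):
            graph.add_edge(a, b)

    return graph

mentions = [
    ("user1", "siteA.com"), ("user1", "siteB.com"),
    ("user2", "siteB.com"), ("user2", "siteC.com"),
]
g = build_domain_graph(mentions)
print(g.nodes(data=True))
print(list(g.edges()))
```

Tightly clustered nodes in such a graph indicate sets of sites that the same users repeatedly cite together, which is what produces the tangle of 80 interconnected circles Starbird describes.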

The analysis does not dwell on the two websites that had by far the most retweets, because their Twitter-related activity was fueled by automated accounts, also known as bots. One of them, Infowars.com, is the largest node on the graph, while the other, therealstrategy.com, would have been so large that it could not be included without distorting the picture. Tweets from these sites, however, were seldom retweeted beyond their network of bots. “We were primarily looking at the actions of real people,” says Starbird, noting that a previous study led by one of her former graduate students, A. Conrad Nied, documented how to identify bot-driven accounts.

Starbird identified three types of sites that traffic in alternative narratives. One set, she says, is “true believers.” The second promotes conspiracy theories to attract eyeballs to its sites for financial profit. (Many sell nutritional supplements, she says.) The third, and smallest, group is spreading disinformation to advance its political interest: to undermine the credibility of U.S. authorities and mainstream media.

Starbird has found that false-flag claims are whipped up by the interplay of these different groups. For example, she says, Veterans Today, which produced the Orlando shooting false-flag tweet, has strong ties to New Eastern Outlook, a geopolitical journal published by the government-chartered Russian Academy of Sciences. While there are only two openly Russian government-run websites on Starbird’s graph, the study found that most of the sites contain content supporting Russian government interests.

“It was particularly troubling to me because I’m an Army brat,” she says. “I don’t want to focus too much on the Russian element, but that was part of the findings that blew my mind.” She is now trying to document the extent of that influence in greater detail.

Starbird also emails and talks with some of the individuals who believe alternative narratives, and says they are trying to do careful research so they can make sense of confusing stories during turbulent times. “I’ve had some great conversations,” she says. “I don’t agree with their conclusions, but I can understand how people engaging with information can come up with different explanations.”

But she adds that many people are being deceived. “They’ll go to three or four different sites and think that they’re seeing different people talk. In reality, they’re getting the same content in these different places.”

I am sitting with Starbird in an empty lab at UW, and I ask how the conspiracy work has affected her. “I used to be much more positive,” she says, referring to her early research on voluntweeters. “I was looking at the worst events possible but what people were doing was so prosocial.”

That changed when she studied the spread of rumors at events such as the Boston Marathon. Reading several thousand tweets that blamed the U.S. military for staging the bombing was disheartening. “Accusing soldiers of perpetrating this [triggers] my disgust reflex,” she says, with just a hint of anger.

For a moment, Starbird’s inner peace is disturbed. Then she mentions she has several papers that require her attention, including one on journalists’ responses to rumors and another on how Twitter conversations about Black Lives Matter were hijacked. She does not have time to be upset. There are more tweets to analyze, more questions to answer.


Jonathan Rabinovitz is a writer and an editor who lives in the Seattle area.

