FOR IMMEDIATE RELEASE: July 26, 2019
FOR MORE INFORMATION: Ted Knight, University of Maryland Division of Research, email@example.com, +1 410 703 4685
The Department of Defense has awarded $1.5 million to a cross-cutting team of University of Maryland (UMD) researchers seeking to understand the spread of information campaigns by examining how emotion affects whether someone will re-share content online.
One of just 12 teams across the country selected for the award from the prestigious Minerva Research Initiative this year – and one of only three such projects led by a woman – the University of Maryland team will collect real-world Facebook and YouTube data in multiple languages to examine the emotional content and viral reach of the posts, as well as the emotional reactions of those annotating the posts.
Based out of UMD’s Applied Research Laboratory for Intelligence and Security (ARLIS), the team will collect and annotate a sample of 1,000 public Facebook posts and 300 YouTube videos from both Poland and Lithuania that were shared by social and political influencers from those countries. Both countries have often been targeted by Russian information warfare and have strategic relevance to NATO and Europe. Once complete, these annotations will be used to explore the relationship between emotion and the sharing of narratives.
“Whether using outright disinformation or manipulating public opinion with accurate stories, information warfare involves stories shared on social media platforms with specific embedded narratives designed to provoke, enrage, excite and change behavior,” said Susannah Paletz, the project’s principal investigator, a research professor at the UMD College of Information Studies and an ARLIS affiliate.
A social psychologist, Paletz has been studying social media for five years for the Office of Naval Research and, in a project last summer, developed an innovative coding scheme to annotate emotions that inspired this new effort.
Reflecting the current, nuanced approach to the psychology of emotions, these annotations extend beyond the so-called six basic emotions – anger, disgust, fear, happiness, sadness and surprise. Paletz and her colleagues’ annotation scheme includes longer-lasting emotions critical for everyday life and online interaction, including humor, wonder, nostalgia, relief, love and hate. This scheme includes over 20 distinct emotions particularly relevant to the challenge at hand, and the team continues to refine the list.
For their Minerva project, Paletz and her team will work with native speakers at universities in Poland and Lithuania to complete the annotation. Small groups of annotators will first independently use the annotation scheme to rate each collected social media post on every emotion, on a scale of 0 to 100, both for the content of the post and for their own reaction to it. The annotators will then meet to reach consensus on the content ratings for any posts where they disagreed.
The results are expected to show how eliciting specific emotions (e.g., anger, contempt, humor) can help a narrative – truthful or not – go viral. The researchers will investigate whether messages evoking emotions that encourage action are more likely to be re-shared. The team hypothesizes that if disinformation provokes the right emotions, people will be more likely to spread it regardless of its accuracy.
Drawn from several different disciplines across the university, Paletz’s team includes co-PI Anton Rytting, a computational linguist; Cody Buntain, a computer scientist (who will soon join the faculty at the New Jersey Institute of Technology); Devin Ellis, a policy expert; Ewa Golonka, a Russian linguist and social scientist; and Egle Murauskaite, an expert on unconventional security threats.
“These kinds of difficult issues – adversarial disinformation campaigns, hacking, Russian interference – can’t really be solved without us working collaboratively across disciplines,” Paletz said. “Psychologists, computer scientists and information scientists have all been examining online communities for decades and social media for years – but within the confines of their own disciplines. It’s been rare for them to work together, as we are, to really integrate methods and theories.”
The project will also address critical gaps in research on how information travels through populations and across national boundaries and languages. The researchers will develop methods for detecting and tracking how narratives and other memes spread within and across languages. Buntain also expressed interest in the individuals behind disinformation campaigns.
“I think one thing people get wrong about these kinds of efforts is the characterization that agents responsible for disinformation are highly coordinated, skilled men in dark suits who are pulling strings behind the curtain,” Buntain said. “Instead, I think reality is something closer to a bunch of young people who are getting paid to post online and support a few high-level messages, but are otherwise given a lot of latitude in how they do it. Rather than being experts at propaganda or disinformation, I think many of these individuals are using marketing tools exactly as they were meant to be used, but with an unanticipated intent – are the tools Coca-Cola or Exxon-Mobil uses to market its product all that different? – and these people find what works to get engagement, followers and clicks.”
The research team has already begun identifying Polish and Lithuanian politicians and social influencers. This summer and into the fall, they will collect social media data. Over the next academic year, research assistants in Poland and Lithuania, supervised by Golonka and Murauskaite respectively, will begin the emotion annotation. The team will also examine different types of narratives and will conduct multilevel statistical analyses to understand which emotions and narratives predict social media sharing. The three-year project is expected to conclude in the summer of 2022.
# # #
The Applied Research Laboratory for Intelligence and Security (ARLIS), based at the University of Maryland, College Park, was established in 2018 under the auspices of the Office of the Under Secretary of Defense for Intelligence (OUSD(I)) and the US Air Force Office of Concepts, Development, and Management as a long-term strategic asset for research and development in artificial intelligence, information engineering, and human systems. One of only 14 designated Department of Defense University Affiliated Research Centers (UARCs) in the nation, ARLIS conducts both classified and unclassified research, spanning basic research to applied system development, and serves the US Government as an independent, objective trusted agent.