Can a Strategy Game Help AI Learn to Spot Scammers?

Maria Herd - April 27, 2022

iSchool Associate Professor Dr. Jordan Boyd-Graber joins a team of researchers awarded nearly $1M to support the U.S. military's cybersecurity fight.

A map of Europe with game pieces representing military equipment and ships scattered across it.

The online version of the classic board game Diplomacy, reportedly favored by John F. Kennedy, Henry Kissinger and Walter Cronkite, is serving as a fun testbed for University of Maryland computational linguists developing a new way to fight a serious and costly cybersecurity threat.

Funded by a $985,000 award from the military’s Defense Advanced Research Projects Agency (DARPA), they’re seeking to use artificial intelligence (AI) systems to help internet users fend off social engineering gambits—like the “Nigerian prince” scam—that psychologically manipulate the unwary into sending money or divulging information. Cybercriminals stole an estimated $7 billion this way last year, and such attacks can compromise national security.

Computer Science Associate Professor Jordan Boyd-Graber, who’s leading the project, said one of the biggest challenges of his work is gathering a valid data set of lies to train algorithms to recognize deception. The board game Boyd-Graber played nearly weekly as an undergrad at Caltech offers a solution, however: it encourages deliberate, precisely identified falsehoods and a clear delineation between truth and trickery.

“This is what brought me back to the game of Diplomacy, a community of people who reveled in being effective liars—they could lie and tell us and be proud of it,” said Boyd-Graber, who has a dual appointment in the University of Maryland Institute for Advanced Computer Studies and the College of Information Studies. “It has already proved to be a very useful tool [in] building data sets for machine learning researchers to train algorithms on when people are being deceitful.”

Diplomacy is played by obtaining and defending European territories. There are no dice, playing cards or other elements that produce random effects. Instead, the game requires players to collaborate, collude and betray to win by conquering Europe during World War I.

Boyd-Graber’s current project builds on his previous work for DARPA using data from the online version of the game to study deception and teach computer agents about negotiation. His earlier research on the computational linguistics of betrayal was covered by CNN, The Wall Street Journal and others.

The researchers’ immediate goal is to develop an AI that can win at Diplomacy, and in this realm, computers have some advantages, such as catching signals in the data that humans might overlook or find difficult to decipher. For example, a time stamp is a big clue in Diplomacy: perhaps the player is lying about their time zone. And if they’re sending messages too quickly, they’re probably a bot rather than a human.
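Those two timing cues lend themselves to simple heuristics. The following Python sketch is purely illustrative—it is not the project’s actual code—and the message schema, function names and thresholds are all assumptions made for the example:

```python
from datetime import datetime, timezone, timedelta

def too_fast_for_human(send_times: list[datetime],
                       min_gap: timedelta = timedelta(seconds=2)) -> bool:
    """Flag a sender whose consecutive messages arrive faster than a
    human could plausibly compose them (a crude bot indicator).
    The 2-second threshold is an arbitrary assumption."""
    gaps = (later - earlier for earlier, later in zip(send_times, send_times[1:]))
    return any(gap < min_gap for gap in gaps)

def timezone_claim_consistent(sent_at_utc: datetime,
                              claimed_utc_offset: int,
                              claimed_local_hour: int,
                              tolerance_hours: int = 1) -> bool:
    """Check whether the hour a player claims it is locally matches the
    message's UTC timestamp, given the time zone they say they're in."""
    implied_hour = (sent_at_utc.hour + claimed_utc_offset) % 24
    diff = abs(implied_hour - claimed_local_hour)
    return min(diff, 24 - diff) <= tolerance_hours  # circular hour distance

# Toy data: three messages sent one second apart, from a player who claims
# it's 3 a.m. in UTC-5 while the server logs 14:00 UTC (9 a.m. in UTC-5).
sends = [datetime(2022, 4, 26, 14, 0, s, tzinfo=timezone.utc) for s in range(3)]
print("likely bot (too fast):", too_fast_for_human(sends))        # True
print("time-zone story checks out:",
      timezone_claim_consistent(sends[0], -5, 3))                  # False
```

A deployed agent would of course weigh many such signals statistically rather than relying on hard thresholds, but the sketch shows why metadata that humans rarely scrutinize is easy pickings for a machine.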

The broader goal is to get computers to cooperate and negotiate with people. Eventually, these advances will lead to better online bots that people regularly use for customer support, scheduling appointments and more.

“Existing programs are insufficient because they focus on single interactions—ignoring the more complex dynamics of extended interactions and the richness and complexities of human communication,” said Boyd-Graber.

Leading the research with Boyd-Graber is his former doctoral student Denis Peskoff M.S. ’18, now a postdoctoral researcher at Princeton University; Peskoff completed his doctoral studies last year and will formally graduate this year.

“He is actually the one who pitched the idea to DARPA and got the ball rolling on the program; I’ve never seen anything like it,” said Boyd-Graber.

Peskoff, a polyglot with a background in international affairs, wasn’t familiar with Diplomacy until he started working with Boyd-Graber. Now it’s his favorite game other than basketball—and he faces off against some of the world’s top players.

“Denis is much better than me now,” Boyd-Graber admitted.


The original article was published by Maryland Today on April 26, 2022.