INFO doctoral candidate gives a presentation about multimodal human-AI interaction and its applications to music education
Snehesh Shrestha, a doctoral candidate in the College of Information Studies (INFO), gave a presentation on Oct. 27 about multimodal human-AI interaction as part of the OTTRS Speaker Series.
Throughout the presentation, Shrestha discussed the significance of combining verbal and nonverbal cues in human communication, AI-mediated student-teacher interaction systems, and a novel haptic band designed for remote feedback, prompts, and metronome functions to enhance online music education.
Shrestha explained how his team used a Wizard-of-Oz setup, in which participants believed they were interacting with a human-level AI robot that was actually operated by a person, to elicit natural interactions. The team studied participants' verbal and nonverbal strategies in order to train machine learning algorithms on multimodal commands.
This research has practical applications, Shrestha said. Remote music lessons became common during the COVID-19 pandemic, but they can pose significant challenges for instrument learners.
One way to address this disconnect is through AI. Shrestha said that AI-mediated student-teacher interaction systems for violin education can combine precise motion capture with audio to improve pose estimation algorithms for 3D visualization of the player.
Shrestha also introduced a novel haptic band that can give remote feedback to students learning virtually, using prompts and metronome functions to enhance the online music education experience.
Watch the full video below.