
Purdue professor uses Envision Center motion capture technology for AI music project


A Purdue professor is using the Envision Center’s motion capture technology for a project focusing on AI applications with musical instruments.

Kristen Yeon-Ji Yun, a clinical associate professor of music, is using motion capture technology while playing the cello to collect movement data and transfer it to a robot. The end goal is a robot that can play the cello as well as a human cellist.

“Our goal is to have the robot play music it has never encountered,” says Yun. “Our first step was the motion capture from the Envision Center. We wanted to have the data about cello movement because it’s quite complicated. All the bow angles for the strings are different and there is left-hand muscle memory. This data will be helpful for us to mimic human movement. We will transfer that data into a robot simulation program.”

This project used the OptiTrack motion capture system to track body joints, equipment, and left-hand fingers to generate a 3D reconstruction of the cellist's motion. The optical tracking technology provides just under 1 mm (about 0.04 in) of accuracy for each individual point, which is enough to describe body and instrument motion. The performer wears a specialized suit with retroreflective markers, whose positions can then be extrapolated to a human skeleton; the cello body and bow are tracked alongside.
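To make the idea concrete, the sketch below shows how such marker data could be used to derive a bow-to-string angle from a single captured frame. The marker names, coordinates, and the angle calculation are illustrative assumptions, not the project's actual pipeline.

    import numpy as np

    # Hypothetical 3D marker positions (in meters) from one captured frame.
    # In practice these would come from the motion capture export; the
    # names and values here are purely illustrative.
    bow_frog = np.array([0.10, 1.05, 0.30])       # marker near the bow's frog
    bow_tip = np.array([0.65, 1.12, 0.28])        # marker at the bow's tip
    string_nut = np.array([0.35, 1.40, 0.25])     # string end at the nut
    string_bridge = np.array([0.38, 0.80, 0.27])  # string end at the bridge

    def angle_between(u, v):
        """Return the angle in degrees between two 3D vectors."""
        cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    bow_vector = bow_tip - bow_frog
    string_vector = string_bridge - string_nut
    print(f"Bow-to-string angle: {angle_between(bow_vector, string_vector):.1f} degrees")

Tracking an angle like this over time, for each string, is one way the captured data could encode the bowing technique Yun describes.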


The data collected from the motion capture will be used to train the robot, with the end goal of teaching it to play a musical instrument. Students working with Yun are currently building the robot's right arm and the controller it needs to play the cello.

Yun is also working on a second project using this data, focusing on creating an app, Evaluator, to help string instrument players with their solo and ensemble practice.

“The Evaluator aims to improve individual practice and performance,” says Yun. “What it does is analyze a musician’s sound. It compares the sound with the digitized score and detects deviations of intonation, rhythm, and dynamics.”
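As a rough sketch of how such intonation checking might work, the example below estimates the pitch of a recording and compares it against an expected note from a digitized score. It assumes the librosa audio library; the file name, expected note, and deviation threshold are hypothetical, and the app's actual implementation may differ.

    import numpy as np
    import librosa

    # Load a practice recording (file name is hypothetical).
    y, sr = librosa.load("practice_take.wav")

    # Estimate the fundamental frequency over time with the pYIN algorithm.
    f0, voiced, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )

    # Convert detected pitches to MIDI numbers and compare with the score.
    detected_midi = librosa.hz_to_midi(f0[voiced])
    expected_midi = 48.0  # suppose the score calls for C3 here (hypothetical)
    deviation_cents = (detected_midi - expected_midi) * 100

    # Flag frames that deviate by more than a quarter tone (illustrative threshold).
    flagged = np.abs(deviation_cents) > 50
    print(f"{flagged.mean():.0%} of voiced frames deviate by more than 50 cents")

The same frame-by-frame comparison against the digitized score generalizes to rhythm (note onsets) and dynamics (loudness), the other two deviations Yun mentions.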

Yun notes that other tools with similar features already exist, but that her team's tool will have one major difference. “Our tool will detect incorrect postures. I’m using computer vision to prevent injuries because injuries sometimes come from incorrect posture.”
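One plausible way to implement such a posture check is with an off-the-shelf pose estimator. The sketch below uses MediaPipe's pose model to flag a tilted shoulder line in a single video frame; the choice of library, the landmark heuristic, and the threshold are all assumptions for illustration, not the project's confirmed approach.

    import cv2
    import mediapipe as mp

    mp_pose = mp.solutions.pose
    image = cv2.imread("practice_frame.jpg")  # hypothetical frame from a practice video

    # Run MediaPipe's pose estimator on the frame.
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.pose_landmarks:
        lm = results.pose_landmarks.landmark
        left = lm[mp_pose.PoseLandmark.LEFT_SHOULDER]
        right = lm[mp_pose.PoseLandmark.RIGHT_SHOULDER]
        # A large vertical offset between the shoulders can indicate a
        # tilted playing posture (the threshold here is illustrative).
        tilt = abs(left.y - right.y)
        if tilt > 0.05:
            print(f"Possible shoulder tilt detected (normalized offset {tilt:.2f})")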

Yun thanks the Envision Center for being kind and accommodating during her team's visits. She looks forward to collaborating with the center again to continue collecting data for this project using other string instruments.

“I really appreciate the Envision Center for being supportive and very prompt to respond to our questions. That was wonderful.”

To learn more about the Envision Center, contact envision@purdue.edu.
