A Forest of Robots Builds Trust Through Dance

Gesture and Sound Make Machines Seem Emotional and Humanlike


Twelve white arm-like robots stand on pedestals in a dark room. A pair of human dancers approach one in the middle. The dancers and the robot stare at one another apprehensively. The dancers revolve around the robot as it rotates. Live music starts. More dancers approach, crawling, as the other robots come alive. People and machines move to the rhythm of drums, guitars, a horn. A slow piano melody replaces the other instruments as a lone dancer intertwines with one robot and embraces another. The music picks up and an MC arrives, singing and rapping, as 17 dancers, only five of them human, writhe around him.

This is not just experimental theater. It’s the result of a grant from the National Science Foundation (NSF) awarded to the Center for Music Technology at Georgia Tech. Humans and robots each have their strengths. Where humans are creative and adaptable, robots are consistent and precise. In the best of all worlds, humans and robots will work together, cooperating at home, work, and school. For that to happen, people will need to trust their mechanical partners. And because we’ve evolved to form trust based on certain social cues, robots will need to exhibit those cues.

Performance of “FOREST,” a collaborative project between Georgia Tech Center for Music Technology and the Kennesaw State University Dance Department. (Video credit: Georgia Tech)

Gil Weinberg, the founding director of the Center for Music Technology, has spent years studying those cues. For this project, he proposed to the NSF a way to share some of his findings beyond the lab. “The best way I came up with to bring it to the public,” he says, “was to have a performance with musicians who make sound and with dancers who use gestures.”

According to Amit Rogel, a graduate student in Weinberg’s group, “People, in general, have a lot of fears about robots. There are many movies about killer robots or evil AIs. Sharing our work, where you can see positive interactions with robots in an artistic way, is important.”

Dance Dance Revolution

The video described above grew out of earlier work in which Weinberg’s group developed a system that generates emotional sounds for robots, in an effort to improve human-robot trust. To create the data set for the system, they invited musicians to convey a set of emotions vocally or through instruments, then trained a neural network—a piece of software inspired by the brain’s wiring—to generate new audio examples for each emotion. They found that when people collaborated with robots on tasks such as placing objects into containers, observers trusted the robots more if the machines emitted emotional sounds during the interaction. The next logical step was teaching the robots to convey emotions through gestures.

Translating full-body human posture and gesture to a handless robot arm is not straightforward. Rogel dug into psychology research on how humans express themselves through movement. “For example, when people are happy,” he says, “their heads will be pointed up, their arms will go up, they’ll make repetitive up and down movements.” Weinberg’s group used a Franka Emika Panda, a robot arm whose seven joints give it seven degrees of freedom. “Head up means joint six is x degrees. A tall position indicates joint four would be y degrees, and then up and down movement would look like this,” he says, moving his body up and down. In one study, they found that people could easily decipher the intended emotions. Participants also rated these animated robots high on likability and perceived intelligence.
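What such a mapping might look like in MATLAB is sketched below. The joint indices, angles, and oscillation parameters are illustrative assumptions drawn from Rogel’s description, not the team’s actual values.

```matlab
% Illustrative emotion-to-pose mapping for a seven-joint arm (degrees).
% Joint indices and angle values are assumptions, not the team's numbers.
poses = struct();
poses.happy = [0 -30 0 60 0 45 0];    % "head" joint up, tall posture
poses.sad   = [0  40 0 -20 0 -50 0];  % drooping posture

% Layer a repetitive up-and-down movement on top of the happy pose:
t = linspace(0, 4, 200);              % 4 seconds of motion
bounce = 10 * sin(2*pi*1.5*t);        % +/-10 degree oscillation at 1.5 Hz
trajectory = repmat(poses.happy', 1, numel(t));
trajectory(4,:) = trajectory(4,:) + bounce;   % bounce the "torso" joint

plot(t, trajectory');
xlabel('Time (s)'); ylabel('Joint angle (deg)');
legend(compose('Joint %d', (1:7)'));
```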

Dancers from Kennesaw State University perform with robots from the Georgia Tech Center for Music Technology. (Image credit: Gioconda Barral-Secchi)

Beyond the sleek video, the group recorded a student performance comprising six works. Dancers intermingled with a dozen UFACTORY xArms, robots similar to the Panda. For one piece, the robots executed preprogrammed moves in response to music. For another, they reacted to people who manipulated their joints or showed moves to their cameras. In a third, they played off a performer’s brain waves. In yet another, they improvised using data from a dancer’s motion-capture (mocap) suit. Lastly, they followed rules that dictated their behavior based on what neighboring robots were doing.
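As a rough illustration of that last mode, here is a minimal MATLAB sketch in which each robot drifts toward the average pose of its two neighbors. The ring topology, coupling gain, and pose representation are assumptions for illustration, not the rules used in the show.

```matlab
% Minimal neighbor rule: each robot nudges its joints toward the mean
% pose of its two neighbors in a ring. All values here are assumptions.
numRobots = 12; numJoints = 7;
q = 30 * (rand(numRobots, numJoints) - 0.5);   % random initial poses (deg)
alpha = 0.1;                                   % coupling gain per step

for step = 1:100
    left  = circshift(q,  1, 1);               % neighbor on one side
    right = circshift(q, -1, 1);               % neighbor on the other side
    q = q + alpha * ((left + right)/2 - q);    % drift toward neighbor mean
end
```

Run long enough, a rule like this pulls the ensemble toward a shared pose; adding noise or music-driven targets would keep it from freezing.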

Rogel enjoys interacting with the robots while wearing the mocap suit. “The mocap suit picks up on really small movements that you don’t even know you’re doing,” he says. “If I have the suit on, and I’m talking to another person, you just see the robot doing subtle motions. For instance, it bobs its head when I say something more impactful. It’s very surprising and really cool.”

Smooth Operator

Rogel programmed the rules for the robots’ movements in MATLAB®. The code tells the robots how to respond to any combination of mocap, cameras, EEG, music, nudges from humans, and the movements of other robots. Rogel controls how the different elements interact using Simulink®, where blocks represent control functions. He can open the blocks and look at the functions. “I’m a mechanical engineer,” he says. “I like seeing the equations and focusing on the math as opposed to lines of software code. In having all the toolboxes readily available, it’s so easy to get all of this done in MATLAB.”
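The production logic lives in Simulink blocks, but the core idea of mixing several input streams into one joint command can be sketched as a small MATLAB function. The function name, inputs, and joint-limit clamping below are hypothetical, not the team’s code.

```matlab
function qCmd = blendInputs(sources, weights, qMin, qMax)
% Hypothetical mixer: 'sources' is an N-by-7 matrix of joint-angle
% suggestions (e.g., rows from mocap, camera, EEG, and music analysis);
% 'weights' is 1-by-N. Angles are clamped to assumed joint limits
% qMin/qMax (1-by-7, in degrees).
    weights = weights / sum(weights);    % normalize weights to sum to 1
    qCmd = weights * sources;            % weighted blend, 1-by-7
    qCmd = min(max(qCmd, qMin), qMax);   % respect joint limits
end
```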

In Simulink, he visualizes the robots’ movement in two different ways. One shows graphs of various points’ acceleration. The other shows each robot as an animated stick figure. “I can test different parameters and look at how it’s responding without ruining any of the robots,” he says. “Or, if I’m doing cool things, I’ll also be able to see it before testing it on the robot. It’s an easy tool for iterative processes.”
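A stick-figure view like the one Rogel describes can be approximated in a few lines. This sketch flattens the arm to a planar chain with assumed link lengths; the real arms are spatial seven-joint mechanisms.

```matlab
% Stick-figure sketch: draw a planar chain from joint angles.
% Link lengths and the 2-D simplification are illustrative assumptions.
theta = deg2rad([20 30 -40 25]);    % example joint angles
len   = [0.3 0.25 0.2 0.15];        % assumed link lengths (m)

ang = cumsum(theta);                % absolute angle of each link
x = [0, cumsum(len .* cos(ang))];   % joint x positions
y = [0, cumsum(len .* sin(ang))];   % joint y positions

plot(x, y, '-o', 'LineWidth', 2);
axis equal; grid on;
xlabel('x (m)'); ylabel('y (m)');
title('Stick-figure view of one robot pose');
```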

The raw data from the mocap suit is messy. One of the most useful software features turns movement curves into equations that the team can use to generate new robot movements. Usually, they fit fifth-order polynomials—equations with variables raised to the fifth power. That way, when they differentiate twice to convert position into velocity and then acceleration, they still have a third-order polynomial—a smooth curve. Otherwise, the motors’ movement might be herky-jerky, damaging the hardware and looking unnatural.
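The idea is easy to see in a short MATLAB sketch using polyfit and polyder. The data here is synthetic; the team fits real mocap curves.

```matlab
% Fit a fifth-order polynomial to noisy position data, then differentiate
% twice. The acceleration is a third-order polynomial: still smooth.
t = linspace(0, 2, 100);
pos = sin(pi*t) + 0.02*randn(size(t));   % synthetic noisy position samples

p = polyfit(t, pos, 5);    % fifth-order fit
v = polyder(p);            % velocity: fourth-order polynomial
a = polyder(v);            % acceleration: third-order polynomial

subplot(3,1,1); plot(t, pos, '.', t, polyval(p,t)); ylabel('Position');
subplot(3,1,2); plot(t, polyval(v,t)); ylabel('Velocity');
subplot(3,1,3); plot(t, polyval(a,t)); ylabel('Acceleration'); xlabel('Time (s)');
```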

Another trick involves follow-through. When a person moves one body part, the rest of the body tends to move too. When you wave your hand, for instance, your shoulder adjusts. “We wanted to model this with the robots,” Rogel says, “so that when they dance, it looks smooth, like a fluid motion, and elegant.” The team uses tools that can locate the peak of a curve—one point’s maximum acceleration—and then alter another point’s curve to be offset by a certain amount of time.
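A sketch of that offset trick: locate the leading joint’s acceleration peak, then delay a second joint’s trajectory by a fixed lag so it appears to follow through. The trajectories and the lag value are assumptions for illustration.

```matlab
% Follow-through sketch: a "shoulder" trails a "hand" by a fixed lag.
t = linspace(0, 2, 200);
handPos = sin(pi*t);                        % leading joint's trajectory

pHand = polyfit(t, handPos, 5);             % smooth fit, as above
aHand = polyval(polyder(polyder(pHand)), t);
[~, idx] = max(abs(aHand));                 % index of peak acceleration
tPeak = t(idx);

lag = 0.15;                                 % assumed follow-through delay (s)
shoulderPos = interp1(t, 0.3*handPos, t - lag, 'linear', 0);

plot(t, handPos, t, shoulderPos, '--');
xline(tPeak);                               % mark the acceleration peak
legend('Leading joint', 'Following joint', 'Peak acceleration');
xlabel('Time (s)'); ylabel('Position');
```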

Ivan Pulinkala, a choreographer at nearby Kennesaw State University who choreographed the sleek video with the MC, sometimes looked over Rogel’s shoulder, offering prompts. He would describe the undulation of the human spine or propose that the robots appear to be breathing. “The part that I found exciting is that it completely changed my approach to choreography and the approach of the dancers”—also affiliated with Kennesaw—“to movement,” Pulinkala says.

In preparation for one section of the videotaped piece, the robots improvised and then repeated that improvisation so it could be filmed from multiple angles. To do this, Rogel trained a neural network in MATLAB on one dancer’s performance to generate new robotic movements in that dancer’s style.
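The article doesn’t specify the network, but one common pattern for this kind of style learning is next-frame prediction with a recurrent network, sketched below with MATLAB’s Deep Learning Toolbox. The architecture, sizes, training settings, and data shapes are all assumptions, not the team’s actual model.

```matlab
% Hypothetical sketch: learn a dancer's style as next-frame prediction,
% then generate new motion by feeding predictions back in.
% 'mocapFrames' stands in for a 7-by-T matrix of joint angles over time.
mocapFrames = rand(7, 1000);            % placeholder for real mocap data
X = {mocapFrames(:, 1:end-1)};          % input sequence
Y = {mocapFrames(:, 2:end)};            % target: the next frame

layers = [
    sequenceInputLayer(7)
    lstmLayer(128, 'OutputMode', 'sequence')
    fullyConnectedLayer(7)
    regressionLayer];

options = trainingOptions('adam', 'MaxEpochs', 50, 'Verbose', false);
net = trainNetwork(X, Y, layers, options);

% Generate 100 new frames in the learned style:
frame = mocapFrames(:, end);
generated = zeros(7, 100);
for k = 1:100
    [net, frame] = predictAndUpdateState(net, frame);
    generated(:, k) = frame;
end
```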

“One of the biggest challenges was how to make the movement smooth and flowing like human movements,” Weinberg says. “I thought it would be tricky because these robots are not designed for dance. Amit did wonders with them. Look at the student video from the end of the semester. You’ll see how the robots are really grooving with the humans. Otherwise, we might have been falling into the uncanny valley, where the robots look eerie.”

Amit Rogel (foreground, right), a Georgia Tech Center for Music Technology graduate student, works with other “FOREST” researchers (left to right) Mohammad Jafar, Michael Verma, and Rose Sun. (Image credit: Allison Carter, Georgia Tech)

Symbiosis

“I was happily surprised by the affection the Kennesaw dancers formed toward our robots,” Weinberg says. “It was a long shot because they came from a completely different world. Before the workshop, they probably perceived robots as just mechanical tools.”

Describing the shift in her thinking, one dancer, Christina Massad, told the Atlanta NPR station, “Instead of dancing around them, I’m actually dancing with them. It definitely felt like they were dancing alongside with us.”

Rogel used MATLAB to tune how sensitive the robots’ joints were to contact. “We did want the dancers to be able to get up close and touch the robots and really establish connections,” he says. “At the beginning, they were very timid and shy around them. But the first time they would bump into a robot while it was moving toward them, the human would be safe, the robot would be safe, and that felt a lot more real to them.”
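One plausible form that sensitivity tuning could take is a per-joint torque threshold: flag contact whenever a measured torque deviates too far from what the motion model predicts. The function and threshold values below are hypothetical.

```matlab
function contact = detectContact(tauMeasured, tauExpected, threshold)
% Hypothetical contact check. Flags contact when any measured joint
% torque deviates from the model-predicted torque by more than a
% per-joint threshold. Lower thresholds mean a more sensitive joint.
%   tauMeasured, tauExpected: 1-by-7 joint torques (N*m)
%   threshold:                1-by-7 per-joint limits (N*m)
    contact = any(abs(tauMeasured - tauExpected) > threshold);
end
```

A call like detectContact(tau, tauModel, [3 3 3 2 2 1 1]) would then trip on an unexpected push, and lowering the thresholds makes the arm more responsive to a dancer’s touch.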

Dancers get up close and establish a connection with the robots. (Image credit: Georgia Tech)

Even without contact, a robot would occasionally break down. “The dancers really like that,” Rogel says, “because it also felt more human, like they’re getting tired.” The team also used some tricks to make the robots seem more alive, such as giving them eyes, names, and backstories.

The team has additional experiments in mind. They’d like to further probe people’s trust in response to robotic gestures. They’re also looking at emotional contagion—how people want machines to respond to their own emotions—and at the role of personality, since different people might prefer different kinds of reactions. To communicate their insights more widely, they plan to take “FOREST” on tour. Pulinkala hopes to develop a longer show.

One thing that sets “FOREST” apart from other investigations of human-computer interaction is its collection of multiple robots. Weinberg calls it both a technical and a social challenge. “The idea was to use forests as a metaphor. Inspired by the biodiversity in forests, we attempted to bring together a wide set of robots and humans that can play, dance, and influence each other. The music was diverse as well, featuring a mashup of many different genres such as Middle Eastern music, electronic dance, hip hop, classical, reggae, and Carnatic music, among others.”

The humans behind the project were also diverse. “The team comprised students, professors and administrators, musicians, engineers, dancers, and choreographers. The team also had a wide representation of gender, race, nationality, and backgrounds. I think that we were able to create something unique. Everyone learned something.” Weinberg continues, describing the larger mission behind the Center for Music Technology. “I’m a big believer in diversity, which is challenging to achieve at technical universities like Georgia Tech.” Georgia Tech doesn’t have a dance program, for instance. “You may have dancers who are also good at computer science but don’t find a way to combine both passions,” he says. “And this kind of project allows them to combine the best of both worlds.”

