
Robots Can Build Better Teams

by Jackie Swift

A few years ago, Malte F. Jung, Information Science, ran across an article by a father who had observed that his young daughter was modeling her behavior on her parents’ interactions with Amazon’s Alexa. The writer pointed out that his daughter was learning to be rude and bad-mannered by interacting with the voice assistant. Jung was fascinated but not surprised.

“As adults we know that we need to use simple language with these speech recognition systems,” he says. “We say, ‘Alexa, play this. Alexa, order that.’ You don’t say ‘please’ or ‘thank you.’ But kids just pick up on the language their parents use as normative, and they start saying things like, ‘Papa, get the milk.’ Through the design of the machine, even if unintentionally, you set these norms of conduct that have implications beyond the interaction with the machine.”

When Jung started his academic career about a decade ago, most research in human-computer interaction focused narrowly on interactions between one human and one machine. Jung has become something of a trailblazer by stepping outside that paradigm to examine moment-to-moment interactions between humans and machines in group settings.

“In behavioral studies, the emphasis has been on building an understanding of how an individual person perceives an individual robot—for instance, how they relate to a machine that looks more or less human,” he says. “And in interactive design, the focus has been on designing robots that work well one-on-one with a human. But that doesn’t reflect the real world. There are always other people around, and our interactions with other people matter. So then the question becomes, how do these machines shape not just how we interact with them, but how we interact with other humans?”


Robot as Social Catalyst

Jung set out on a course of research that aims to expand how we think about the impact of human-computer interaction on interpersonal relationships. Could robots be used strategically to help groups function better? In one study, he joined with Sarah Strohkorb Sebo, then a PhD student at Yale University, now at the University of Chicago, to test whether a robot’s behavior can improve the interactions of group members.

The study required three people to work with a robot to figure out a puzzle task. The robot would sometimes make statements expressing its vulnerability, such as admitting to making a mistake. “We found that a robot expressing vulnerability in a group catches on and leads to more pro-social interaction among other members of the group,” Jung says. “It changes people’s behavior toward the robot but, also, how they treat each other.”

In this study, the robot was what Jung calls a social catalyst—a robot that helps teams build social interactions so that the teams can perform better and, thus, do better work. Over the years, he has looked at this phenomenon from the perspective of robot design. “When people think of human-robot collaboration in a team, the dominant approach is to figure out how to build robots that accomplish part of the team’s work by taking on a task or providing information,” he says. “My approach is a difference in paradigm.”

In one project, Jung and his collaborators built a robot that sat on a table and looked like a typical microphone in a recording studio. While it did not speak, it did have the ability to back-channel—to show by body language that it was listening to others in the group. In this case, it could turn and tilt toward a teammate. “We wanted to know, if the robot seems to engage, will that engagement carry over into the group?” Jung says.

While the team carried out a problem-solving task, the microphone robot tracked the conversation and noticed when someone hadn’t spoken for a while. It would then turn toward the person and tilt forward, implicitly encouraging the person to speak. “When it moved like that, people developed much more equal participation in the team,” Jung says. “It worked toward building more balanced conversation participation. And it did it all nonverbally. That was part of the study: We wanted to move away from the paradigm of a human facilitator who says ‘What do you think?’ and, instead, utilize the unique affordances of a machine.”
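The back-channeling behavior Jung describes can be pictured as a simple sensing-and-response loop: track who has spoken recently, notice who has gone quiet, and direct a nonverbal cue toward that person. The Python sketch below is purely illustrative and is not the study’s actual software; the 30-second silence threshold, the participant tracking, and the turn_toward/tilt_forward calls are hypothetical assumptions standing in for the real system.

# Hypothetical sketch (not the study's implementation): notice when a
# participant has been silent for a while and orient the robot toward
# them as an implicit invitation to speak. Names and thresholds are
# illustrative assumptions.
import time

SILENCE_THRESHOLD = 30.0  # assumed: seconds of silence before the robot responds

class TurnTakingMonitor:
    def __init__(self, participants):
        now = time.time()
        # Record the last time each participant was heard speaking.
        self.last_spoke = {p: now for p in participants}

    def on_speech(self, speaker):
        # Called whenever the speech tracker attributes an utterance to someone.
        self.last_spoke[speaker] = time.time()

    def quietest_participant(self):
        # Return the participant who has been silent past the threshold, if any.
        now = time.time()
        person, last = min(self.last_spoke.items(), key=lambda kv: kv[1])
        return person if now - last > SILENCE_THRESHOLD else None

class RobotStub:
    # Placeholder for the robot's motion interface (assumed, not a real API).
    def turn_toward(self, person):
        print(f"turning toward {person}")

    def tilt_forward(self):
        print("tilting forward")

def encourage_participation(monitor, robot):
    # If someone has been quiet too long, turn and tilt toward them
    # as a nonverbal cue, rather than prompting them verbally.
    person = monitor.quietest_participant()
    if person is not None:
        robot.turn_toward(person)
        robot.tilt_forward()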

Robots in the Operating Room

To understand more fully how the design of a robot affects teamwork in a real-world setting, Jung joined with Steven Jackson, Information Science; Amy Cheatle, PhD ’23, Information Science; and Visiting Researcher Hannah Pelikan to study robotic surgery teams. These teams use a robot with multiple arms that wield surgical instruments as well as a laparoscopic camera. While the robot and the rest of the team stand over the patient, the surgeon sits at a console with a screen in a far corner of the room, viewing the tissues inside the patient’s body through the laparoscopic camera and carrying out the surgery by manipulating the robotic arms.

“The design of this robot implies that surgery is about what the surgeon does, that it’s just about cutting and joining tissue,” Jung says. “But when you bring in the team perspective, surgery is not just done by the surgeon. There are nurses, technicians, an anesthesiologist, sometimes also surgeons in training. They all have to work effectively together for the surgery to happen. And here you have a system that shuts the leader off from the team.”

Through field observations, interviews, and video analysis of the interpersonal interactions of the surgery teams, the researchers found that the robot affected the teams in many ways, beyond the physical distancing of the surgeon. For instance, communication patterns changed fundamentally, Jung says. “In open surgery without a robot, a surgeon typically requests an instrument and the assistant gives it to them, often without verbally responding. In robotic surgery, communication is inefficient. It takes quite a bit of verbal back-and-forth before it’s clear what everyone wants.”

Surgery teams also found their sensory communication impaired to some degree, Jung says. “Our senses are not separate,” he explains. “How I see is connected to how I touch, for instance. And in a surgical team, there’s codependence and entanglement not just across modalities but also across people. For the surgeon to see, one of the team members has to touch tissue and move it. The robot messes up those entanglements and that forces the team to develop new practices around sensing and around communicating sensory information.”

Good Outcomes Require Teamwork

Jung came to human-computer interaction and team dynamics through an early interest in design. “When I was little, I wondered why so many things are so terribly designed,” he says. “They’re ugly, and they don’t work particularly well.

“In graduate school and as a postdoc, I realized how much good design solutions depend on how well the designers and engineers interact with each other while they are designing something,” he continues. “The interactions we have with others are so fundamental for our ability to accomplish anything. I believe for robots to be successful we need to design them with these interactions in mind.”
