Henny Admoni Is Building Better Robots By Studying Humans

A colorized photograph of Henny Admoni, head of the Human and Robot Partners Lab at Carnegie Mellon University, pictured alongside a robotic arm

When Henny Admoni tells people she’s a roboticist who primarily studies humans, she gets strange looks. “It might seem a little weird,” Admoni says, “but the goal of my research is to understand how to make robots good for people.”

Admoni leads the Human and Robot Partners Lab at Carnegie Mellon University, where, along with a team of researchers, she studies the ways robots and AI can improve people’s lives.

The lab has worked on robotic arms that assist people in eating food and preparing meals; they’ve observed robots that encourage artists to think outside the box, and they’ve looked at how robots can help search and rescue teams collaborate more effectively.

With a long list of published papers, awards, and international speaking engagements to her name, one might assume that, for Admoni, robotics has been a lifelong passion. It has not.

“I thought I was going to be a journalist,” she says. “I wanted to be an international correspondent, travel the world, go to war zones, wear a flak helmet, and uncover truth in different places.”

But after arriving at college and failing to get into the introductory diplomacy class she had her sights set on freshman year, Admoni turned toward science research. “Journalism is a study of people, and I wanted to continue thinking about people,” Admoni says. “So I pursued psychology.”

At the same time, she enrolled in an entry-level programming class, and quickly developed an interest in computer science. “I became fascinated by what we can learn about people through psychology, and how that information can help us design better robotic systems,” she says.

In 2016, Admoni completed a PhD thesis at Yale University titled “Nonverbal Communication in Socially Assistive Human-Robot Interaction.” Her work in her lab now is an extension of that thesis—examining how people communicate nonverbally, then using those findings to design robots that can respond to nonverbal signals. Where we cast our gaze is a key form of nonverbal body language Admoni studies in her lab.

“Eye gaze is one of the most useful modalities we have for predicting people’s intent,” Admoni says. “We know people look at objects before they reach for them and before they reference them verbally. Eye gaze is tightly linked to what people are going to do, which is very helpful for a collaborative robot who is trying to figure out what someone wants to do.”

In one study, Admoni positioned a human subject in front of three dishes of candy. A robotic arm sat across the table, and when the subject looked towards a specific dish, the robotic arm extended, taking a piece of candy from the dish and placing it in their hand. This is the essence of Admoni’s work: “We use principles of human behavior and how they reveal internal states to design robot systems that are then responsive to those human beings,” she says.

Some of Admoni’s latest research centers on autonomous technology that aids human driving. “We are looking at eye gaze as people are driving,” Admoni says. Along with Carnegie Mellon’s David Held—a faculty member of the CMU Argo AI Center for Autonomous Vehicle Research—Matthew O’Toole, and Srinivasa Narasimhan, Admoni is interested in engineering AI that can step in and correct the driving when a person misses something on the road. “Basically it’s the idea of making your blind spots disappear,” Admoni says. “And this is by having the AI understand what you’re looking at, let you take control of whatever it is that you see, but also monitor the parts of the world that you don’t see.”

Admoni explains that the team’s long-term goal is to use novel sensing technology called light curtains to do low-cost obstacle detection. “The human aspect is that we can learn where those sensors need to be deployed by looking at what expert drivers attend to when they’re at the wheel,” Admoni says. “We can then deploy sensors in areas that the expert drivers are likely not to attend to, to help them be safer drivers.” Admoni notes that down the line, she envisions a similar system being potentially useful for autonomous vehicles. “We can use expert driver examples from people to try to train an autonomous vehicle to attend to the most important locations, given certain situations,” she says.

This past October, Admoni made Robohub’s 2021 list of 50 Women in Robotics You Need to Know About. When it comes to gender in robotics, Admoni says the number of women in the field is increasing, but there’s much room for improvement in expanding opportunities to other underrepresented groups. “We have a gender problem, but we have an even bigger problem of Black, Pacific Islander, and Native American representation in robotics,” she points out. “So I think that’s going to be the next big hurdle for the field in terms of inclusivity.”

As for the future of robots and robotics, Admoni wants to assure people we are not headed in the direction of The Terminator, and the likelihood of having a humanoid like Rosie from The Jetsons in your house is slim.

“There’s an understandable concern about robots getting too smart and reaching a singularity,” Admoni says, referring to the hypothetical point at which artificial intelligence matches or surpasses human intelligence. “As a roboticist I am very not worried about robots taking over the world. If we can [currently] get our robots to work well for just an hour out in public, we do a little happy dance. I think that’s actually what the future is going to look like: robots that are hyper specialized for particular tasks and know how to do those tasks as well.”
