Cars, robots, and spacecraft that can reason and make intelligent decisions may be futuristic, but Cornell engineers are taking spine-tingling steps.
Jesse Winter

“The analogy is people. What people are very good at is dealing with new things, with uncertainty, or reasoning about things that aren’t clear,” Mark Campbell says, referring to intelligent systems.
Jesse Winter

Using data from multiple types of sensors, the Campbell lab has programmed a car to anticipate which way another car will turn at an intersection—100 meters before it arrives.
Jesse Winter

The Campbell lab develops sets of algorithms that can go inside any system. They can equip robots with all the same sensors as their cars, but they also want to give their robots language—a tougher task.
Beatrice Jin; Jesse Winter

Collaborating with Hadas Kress-Gazit, Campbell’s lab is working to get a human and a robot to build a pyramid together, including the incorporation of natural language.
Jesse Winter

Cars, Robots, Intelligent Systems

by Caitlin Hayes

Imagine you’re in the distant future with an age-old problem: you can’t find your keys. Mark Campbell, Mechanical and Aerospace Engineering, says you might ask a robot for help. The robot could communicate back, searching with you until the keys are recovered.

Campbell’s version of the future doesn’t end there. With keys in hand, you would go to your car but wouldn’t need to drive it. At the interstate, your car might join a train of other autonomous cars moving safely and efficiently to their destinations.

To make elements of this world possible, however, we need intelligent systems, says Campbell. “The analogy is people,” he explains. “What people are very good at is dealing with new things, with uncertainty, or reasoning about things that aren’t clear. That’s all very hard to program.”

Campbell continues, “We’re interested in developing a set of algorithms that sit inside whatever system—a robot, spacecraft, or car—and enable it to reason and make intelligent decisions.”

Intelligent Cars

With current technology, an autonomous car performs well on basic roads, but in a busy, complicated place like Manhattan, it won’t know what to do, especially if it loses GPS. “It would basically have to pull over, and we’d have to go get it,” says Campbell. “That’s the current state of the art.”

To address this, Campbell and his team are working on algorithms that would enable an autonomous car to anticipate possible outcomes and make decisions. “When people are driving, we have a mental model of the scene around us and how it’s going to change,” he says. At a four-way intersection, for example, we would expect another car to slow down, stop, and either turn or go straight. “This is your mental model, and if anything starts to go against that model, you make some different decisions. So the challenge for us is how to encode that mental model.”

Using data from multiple types of sensors, Campbell and his team have programmed a car to anticipate which way another car will turn at an intersection—100 meters before it arrives. “The sensors pick up on these subtle movements that we wouldn’t notice with our naked eye,” he says. “Effectively, it’s a kind of perception, a mental model, anticipating how a scene is going to change.”
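
To make that idea concrete, here is a minimal sketch, in Python, of one common way such a mental model can be encoded: a belief over the other driver’s possible intents (turn left, go straight, turn right) that is updated with each subtle, noisy sensor cue. This is not the Campbell lab’s actual code; the intent set, the expected heading changes, and the sensor noise level are all illustrative assumptions.

```python
# A minimal sketch (not the Campbell lab's actual code) of anticipating
# another driver's turn: keep a probabilistic "mental model" -- a belief
# over intents {left, straight, right} -- and update it with each noisy
# sensor cue (here, a single heading-change measurement).
import math

INTENTS = ("left", "straight", "right")

# Assumed expected heading change (radians) for each intent as the other
# car approaches the intersection; these values are illustrative only.
EXPECTED_HEADING_CHANGE = {"left": 0.15, "straight": 0.0, "right": -0.15}
SENSOR_NOISE_STD = 0.05  # assumed standard deviation of the heading cue


def gaussian_likelihood(measured, expected, std):
    """Unnormalized likelihood of a measurement under a Gaussian sensor model."""
    return math.exp(-0.5 * ((measured - expected) / std) ** 2)


def update_belief(belief, measured_heading_change):
    """One Bayes-filter step: reweight each intent by how well it explains
    the latest measurement, then renormalize."""
    weighted = {
        intent: belief[intent]
        * gaussian_likelihood(measured_heading_change,
                              EXPECTED_HEADING_CHANGE[intent],
                              SENSOR_NOISE_STD)
        for intent in INTENTS
    }
    total = sum(weighted.values())
    return {intent: w / total for intent, w in weighted.items()}


# Start with no preference, then fold in a stream of subtle cues that a
# human eye might miss: small, growing drifts toward a left turn.
belief = {intent: 1.0 / len(INTENTS) for intent in INTENTS}
for cue in (0.03, 0.07, 0.11, 0.14):
    belief = update_belief(belief, cue)

print(belief)  # probability mass shifts toward 'left' well before the turn
```

Each small cue barely registers on its own, but folded together the belief shifts toward one intent well before the turn itself, which is the essence of anticipating how a scene is going to change.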

The car’s ability to respond could make autonomous driving and the roadways in general safer, a major goal of the field. “If you look at the number of accidents that happen, so many of those could be prevented,” Campbell says. “And people are becoming more and more distracted.”

Increasing various efficiencies would be another benefit of a large-scale adoption of self-driving cars. “If you’re pulling onto the highway into a kind of train, your gas mileage goes way up because of the aerodynamics,” Campbell says. “Like bikers in the Tour de France.” Commuters would also gain back that time spent driving.

“So how far will it go?” Campbell says. “I could imagine city officials, looking at all the benefits, making it mandatory that you’re almost fully autonomous inside city limits. It’s a long way off, but I can imagine it.” The fact that many of Campbell’s recent PhD students now work at car companies speaks to the momentum behind this dream.

Collaborating to Build the Ideal Robot, the Human’s Assistant

Robots that interact fluidly with humans have long been a fantasy of the future, portrayed in movies such as Star Wars, but the reality is much harder to achieve. Campbell and his team can equip robots with all the same sensors that their cars have, but language poses particular challenges.

“Robots deal with numbers, so the algorithms have to transform those numbers into something that’s an intermediate to my spoken information,” Campbell says. “We’re figuring out ways of making algorithms that connect the two.”
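
As a rough illustration of that intermediate layer, the sketch below (hypothetical, not the lab’s software) grounds a spoken command in the numbers a robot actually has: each detected object is a bundle of coordinates and attributes, each word is a simple check on those numbers, and the command is resolved to whichever detection satisfies the most words.

```python
# A hypothetical sketch of the "intermediate" between spoken information and
# the numbers a robot deals with. None of these names or values come from the
# Campbell lab's software; they are illustrative assumptions.

# What a perception pipeline might hand over: numeric detections.
detections = [
    {"id": 1, "x": 0.4, "y": 1.2, "color": "red", "label": "block"},
    {"id": 2, "x": -0.8, "y": 0.9, "color": "blue", "label": "block"},
    {"id": 3, "x": 0.1, "y": 2.0, "color": "red", "label": "cup"},
]

# A toy lexicon mapping words to checks on those numbers.
LEXICON = {
    "red": lambda d: d["color"] == "red",
    "blue": lambda d: d["color"] == "blue",
    "block": lambda d: d["label"] == "block",
    "cup": lambda d: d["label"] == "cup",
    "left": lambda d: d["x"] < 0,
    "right": lambda d: d["x"] > 0,
}


def ground_command(command, detections):
    """Return the detection that satisfies the most words in the command."""
    words = command.lower().split()

    def score(d):
        return sum(1 for w in words if w in LEXICON and LEXICON[w](d))

    return max(detections, key=score)


target = ground_command("pick up the red block on the right", detections)
print(target)  # -> detection 1: the numeric pose a motion planner would need
```

A real system would replace the toy lexicon with learned models and handle ambiguity through dialogue, but the basic move, turning words into constraints on numeric perception, is the same.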

This includes programming a robot to understand not just human commands but also communicated information—enabling natural, two-way exchanges. “We want eventually to think about humans working very intimately with robots, rather than separately, where the robots do one set of things and people do another,” Campbell says.   

This collaboration might become invaluable in search-and-rescue missions, or communicating with an unmanned spacecraft, or in the home. “If you’re going to have a robot near your family, you want to be able to work with it naturally and feel safe,” Campbell says.

“My students got me to come out at two or three in the morning. They were testing in the Vet school parking lot, and it was the very first time the car could actually drive itself.”

In collaboration with Hadas Kress-Gazit, Sibley School of Mechanical and Aerospace Engineering, Campbell’s group is working to get a human and a robot to build a pyramid together. In a broader project funded by the National Science Foundation—with Kress-Gazit and collaborators at the Massachusetts Institute of Technology and the University of Rochester—the teams are addressing the major challenges together. They’ll work on creating natural language communication between humans and robots, building robot perception from sensory data, and teaching robots tasks and decision-making.

Kickstarting the Autonomous Driving Revolution, the Student Factor

Campbell hasn’t always been so interested in robotics. “I was all aerospace,” he says. As a graduate student, he’d experienced the exhilaration of having a project on the space shuttle and wanted to give that opportunity to his own students.

“We had two opportunities to launch satellites, and it took us four years to develop each project,” Campbell says. “The first time, the shuttle had an accident. The second time, we worked for four years to develop these little cubes that launched on a Dnepr rocket that exploded over Kazakhstan. The students appreciated the experience, but it was hard.”

Still, Campbell was not drawn away from aerospace until a student’s passion pulled him in a new direction. “I was on sabbatical in Australia, and a very strong PhD student took on a side project while I was gone,” Campbell says. That side project was the Defense Advanced Research Projects Agency (DARPA) Grand Challenge, sponsored by the Department of Defense. The task was to develop an autonomous car that could drive 150 miles through the desert.

Campbell’s PhD student, leading a team of undergraduates, didn’t fare too well in the first challenge, but for the next, which offered million-dollar development grants, they asked Campbell to serve as adviser. “We wrote the grant together while I was in Sydney. I would work all day, and my graduate student would work all night,” Campbell says. “And we were lucky enough to win one of those grants.”

Cornell’s team went on to be one of only six finishers in the challenge, out of more than 200 initial teams. The car they programmed drove autonomously on city streets for about 60 miles, navigating other competing vehicles as well as cars driven by people. This all happened before the Google Car. “These contests kind of kick-started the autonomous driving revolution,” Campbell says, “and it launched me and my research into this area.

“This is a wonderful thing about academia,” he adds, “that you can pretty much change your career, which makes our jobs really refreshing.”

Some of the frustrations of working in aerospace no longer apply. “The main thing I like about robotics is that I can see it operating and working right in front of me,” Campbell says.

Campbell recalls vividly the first time he and his students enabled a car to drive autonomously. “My students got me to come out at two or three in the morning. They were testing in the Vet school parking lot, and it was the very first time the car could actually drive itself,” he says. “The steering wheel was shaking, but it was driving this snake course at 15 miles per hour.”

His students had planned a surprise—they’d programmed the car to go in reverse. “I’ve never driven backwards at 15 miles per hour—your entire sense is different, but the algorithms didn’t care at all. They just planned a path backwards, and it’s exactly as safe as going forwards. I thought that was just awesome. It was spine-tingling. We had moments like that once a month during development.”

Campbell adds, “I always like to say that the reason I’m doing robotics now is because my students pulled me in that direction, and I’m very happy they did.”