A Multitasking Machine

Glenn Matsumura

Before he can show off his shiny new project, Andrew Ng has to find it. Some days, it’s down the hallway, fidgeting with doorknobs. Or it might be bent over the dishwasher in the artificial intelligence lab, unloading wine glasses.

“Students take it out to run experiments,” Ng says. “So I don’t always know where my robot is on any given day.”

Mounted on a modified Segway Human Transporter, the Stanford Artificial Intelligence Robot—STAIR, for short—has a praying-mantis likeability, as in, “My, what big eyes and swinging appendages you have.” Outfitted with multiple cameras, an array of microphones, a laser range-finder, a navigation sensor and a very, very compact computer and power supply unit, STAIR is designed to wheel through homes and offices with ease.

Building STAIR was relatively simple. Teaching it to perform multiple tasks is another matter. “The large research challenge is developing the software and giving it the intelligence to do things by itself,” Ng says. “I’ve spent most of my professional life working on the science of getting computers to act without being explicitly programmed.”

A specialist in machine learning, pattern recognition and statistical artificial intelligence, Ng studies how computers learn. In addition to STAIR, he has several other projects in the works—a robotic dog that can climb over obstacles, a robotic snake that can tunnel under earthquake rubble to look for survivors, a robotic helicopter that can perform flips and hover in place. Much of the funding comes from the National Science Foundation, the Defense Advanced Research Projects Agency, and a handful of technology companies, including Intel, Honda and Google.

Ng has a dream that dates from the inception of artificial intelligence in 1956. “Over the last 30 years, AI has fragmented into many different subfields, so that today there are completely disjointed research communities working on different subsets of the AI problem—computer vision, robot planning, machine learning, speech,” he says. “My colleagues and I thought it was time to bring the disparate threads of AI back together, and revisit the original AI dream of building an intelligent agent.”

Ng counts off the “special purpose” robots, designed to carry out just one chore, that we all use on a regular basis—dishwashers, washing machines and dryers. He wants to design one robot that will be able to perform many everyday tasks: “I think the revolution in robotics will come when we can have a single robot that can clean up your living room after a dinner party, fetch an item, and assemble an IKEA bookshelf.”

But first, STAIR had to learn how to pick up a coffee mug. That, says Ng, is a “very difficult” problem. “When you put your fingers around a cup to pick it up, you know there aren’t poisoned barbs on the rear side,” he explains. “But the robot has never seen the object before. How does it know what the rear face looks like?”

Ng and the nine other professors and 10 graduate students working on STAIR had to create an artificial three-dimensional world populated with 3-D models of objects. Then, through learning algorithms, they had to show the robot where to grasp a specific object. “The robot has to recognize that the midpoint of a handle-like shape is often a good place to pick up an object,” Ng says. “And over time, it actually learns to pick up other objects that may have a handle-like shape.”
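The idea Ng describes—scoring candidate grasp points by learned geometric cues such as closeness to a handle’s midpoint—can be sketched in a few lines. Everything below is an invented toy illustration, not the STAIR team’s actual features, data or code: a tiny logistic-regression classifier is trained on hand-labeled grasp points and then asked to rank candidates on an object it has never seen.

```python
import math

# Each candidate grasp point is described by two hypothetical features:
#   f1 = 1.0 if the point lies on a thin, handle-like part, else 0.0
#   f2 = normalized distance from that part's midpoint (0.0 = midpoint)
# Labels: 1 = good grasp point, 0 = poor grasp point.
training = [
    ([1.0, 0.0], 1),   # midpoint of a handle: good grasp
    ([1.0, 0.1], 1),
    ([1.0, 0.9], 0),   # near the end of a handle: poor grasp
    ([0.0, 0.0], 0),   # not on a handle at all
    ([0.0, 0.5], 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, epochs=2000):
    """Fit logistic-regression weights with plain gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def grasp_score(w, b, x):
    """Probability that a candidate point is a good grasp."""
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b)

w, b = train(training)

# A never-before-seen object with a handle-like part: the learned model
# should prefer the candidate nearest the handle's midpoint.
candidates = {
    "handle midpoint": [1.0, 0.05],
    "handle end":      [1.0, 0.95],
    "flat side":       [0.0, 0.30],
}
best = max(candidates, key=lambda k: grasp_score(w, b, candidates[k]))
print(best)
```

The point of the sketch is the generalization step Ng mentions: nothing about this particular object was in the training data, yet the learned preference for handle midpoints carries over to it.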

Ng and his colleagues were delighted when STAIR learned the “grasping task” in only eight months, instead of a projected three years. Within the next few months, he predicts, STAIR will be able to “understand a verbal command to fetch a small number of objects from an office.” He’ll be happy on the day when—a decade from now—he can turn to STAIR and say, with some confidence, “Please rummage through my drawer and find a piece of paper.”

Back at the dishwasher, STAIR is still learning how to unload, very carefully. “It’s not working perfectly,” Ng notes. “And before you’d want this in your home, it would have to be as reliable as you are.”