
Once More, With Feeling

Sensitive robots actually feel what they touch -- and they're smart enough to learn from the experience.

March/April 2000

Photo: Glenn Matsumura

After a ValuJet aircraft went down in the Everglades in May 1996, recovery workers waded through the swamp, five abreast, wearing biohazard coveralls under their wetsuits. They combed through steaming muck full of snakes and tangled roots, groping for the remains of the passengers and plane, while sharpshooters stood watch for alligators. The search -- largely unsuccessful -- was a horrific ordeal. It was no place for even the most dedicated rescue team. But, says mechanical engineer Mark Cutkosky, "it would have been a good place for a robot."

Cutkosky, the Charles Pigott Professor in the School of Engineering, is trying to develop the right robot for such jobs. His vision: a sensitive, adaptable robotic hand that not only grasps but feels whatever it touches. The device might distinguish, for instance, a bone from a root and then pull it out gently or forcefully as the situation demands.

Cutkosky was already hooked on the idea when he came to Stanford 15 years ago to launch the Dexterous Manipulation Lab. Sensitive mechanical hands, he proposed, could do many of the tasks our own hands accomplish, such as precisely manipulating objects of different shapes, textures and weights, and gathering tactile data by handling and stroking. We could use them to explore things that are too distant, too difficult or too dangerous for us to grasp. With their help, rescuers might retrieve pieces of a sunken ship from the safety of the beach, bomb experts might clear a minefield without risk, and scientists might "feel" the roughness of a Martian rock from the comfort of earthly labs.

Over the years, Cutkosky's lab has built a series of experimental robots. One of these, dubbed Marvin, holds objects between jointed mechanical fingertips and feels their ridges, pits and bumps. Though it doesn't look like a human hand, Marvin embodies perhaps the closest thing yet to the hand's ability to execute complex motions guided by a keen sense of touch. And Marvin's robotic cousin -- Dexter -- can be controlled through the new technology known as virtual reality.

Primitive touching robots, such as the grippers used to make disk drives, have existed for years, but Cutkosky's dexterous and adaptable devices promise to extend robotic reach far beyond assembly lines. Marvin and Dexter represent a new kind of robot, and Cutkosky, 43, is a new kind of engineer. "He is very comfortable designing high-precision machines in the service of human-machine interactions, which is not that common," says Mandayam Srinivasan, director of MIT's Touch Lab. "That whole field is beginning to emerge very rapidly. . . . We need more people like him."

As a kid growing up in Pittsburgh, Cutkosky loved tinkering with things. He and his grandfather, a woodworker, built and repaired all kinds of devices at home. Naturally, the young engineer sought out a hands-on project while pursuing his PhD at Carnegie Mellon University in the early 1980s. Cutkosky joined a team working on a 2,000-pound robot with a huge hydraulic arm, designed for use on assembly lines. Like most industrial robots today, this one was an obedient brute: it could do the very same job time after time, but was pathetically inept at tasks requiring flexibility, such as changing the stiffness of its arm depending on the weight of a load. So Cutkosky fitted the robot with a controllable wrist that could weigh objects and adjust the arm accordingly.

Then he decided to try fingers. "If robots are going to go into hostile or unstructured environments and pick up and explore most everything they find, they're going to need fingers," he says. But mechanical fingers turned out to be a much bigger challenge than wrists. Cutkosky had to learn about human fingers before he could attempt to reproduce them.

He began by observing his own hands in motion. Almost every time Cutkosky held a wrench, hammer or pliers, he would ask himself a series of questions. What kind of grasp am I using? How does it change when I shift to another task? Does the shift give me better control over the tool? He started photographing and videotaping the hands of expert machinists at work. Focusing on this tiny slice of human activity, he found dozens of maneuvers to observe and measure as fingers and palms pried, tapped, pushed and tugged. Next, he sketched a series of hands in basic grasp positions and connected the drawings with lines to show how the different positions related to one another. "I made a taxonomy of grasps," he says. "Each of the 16 to 30 positions I observed was a leaf on the [family] tree." Using this classification system and formulating a set of rules to go with it, Cutkosky was able to predict the kind of grasp most machinists would use for a given object and task.
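
In software, such a rule set boils down to a chain of if-then decisions. A toy sketch in Python (the categories and thresholds here are invented for illustration; they are not Cutkosky's published rules):

    # A toy sketch of rule-based grasp selection, loosely in the spirit of
    # Cutkosky's taxonomy. Categories and thresholds are assumptions.
    def select_grasp(object_width_cm, task):
        """Pick a grasp type from object size and task demands."""
        if task == "precision":                   # e.g., turning a small screw
            if object_width_cm < 2:
                return "precision pinch"          # fingertips only
            return "tripod grasp"                 # thumb plus two fingers
        if task == "power":                       # e.g., swinging a hammer
            if object_width_cm < 5:
                return "cylindrical power grasp"  # fingers wrap the handle
            return "spherical power grasp"        # whole hand envelops
        return "lateral pinch"                    # default for flat objects, like keys

    print(select_grasp(1.5, "precision"))  # -> precision pinch
    print(select_grasp(6.0, "power"))      # -> spherical power grasp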

It was an important step, but not enough to build an adaptable hand. Cutkosky realized that any humanlike hand would need a sense of touch. Without touch -- and the ability to respond to tactile feedback -- robotic fingers couldn't adjust their grasp to manipulate different objects. "When you try to button your coat and your fingers are cold, you're very clumsy. It's not because of the muscles; it's because you've lost your sense of touch," he observes.

Ready to apply this insight, Cutkosky moved to Stanford in 1985. What drew him here, he says, was "the respect for design and the role of creativity and aesthetics in design. And I thought the students were terrific."

The Dexterous Manipulation Lab, just up the sloping path from Memorial Church, consists of two long rooms that share a common entryway. Despite all the desks and tables that line the walls of the two rooms, it's hard to find a spot to rest your elbow: virtually every surface is covered with pliers, multicolored wires, circuit boards and oddly shaped pieces of wood, plastic and metal. Several 4-foot-tall mechanical arms, the kind used in assembly lines, stand in corners and behind desks. Along the walls, students huddle in front of computer screens, writing and running programs used in the design, control and analysis of robots.

There are half a dozen projects under way, but a common theme prevails: the creation of sensing machines. When Cutkosky first came to Stanford, he was one of a handful of researchers experimenting with ways to provide robots with tactile sensing capabilities. Some fitted their robots with pressure sensors; others built in optical sensors. Hoping to mimic human touch more closely, Cutkosky chose to focus on dynamic tactile sensors that would detect subtle vibrational changes and trigger the robot's control motors to readjust accordingly.

Holding an object is a dynamic process, requiring the human brain to continuously adjust the force applied by the hand, he explains. As you hold a glass of water, for example, your grip steadily eases until the glass begins to slip ever so slightly. Your fingertips detect the very first tiny vibrations, or microslips. The brain responds by signaling the hand to apply more force. As soon as your grip tightens, however, it slowly begins to relax again, and the cycle repeats. It's an elegant system for holding something while expending a minimum of energy.
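
In engineering terms, that cycle is a feedback loop, and it translates readily into code. A minimal sketch, assuming hypothetical sensor and motor interfaces rather than the lab's actual ones:

    import time

    # A minimal sketch of microslip-based grip control. The interfaces
    # (read_vibration, set_grip_force) are hypothetical stand-ins.
    SLIP_THRESHOLD = 0.5   # vibration level that signals a microslip (assumed units)
    FORCE_STEP = 0.1       # newtons added when a slip is detected
    RELAX_RATE = 0.01      # newtons shed per cycle while the grip holds steady

    def control_grip(read_vibration, set_grip_force, cycles=1000, force=1.0):
        """Hold an object with the least force that prevents slipping."""
        for _ in range(cycles):
            if read_vibration() > SLIP_THRESHOLD:
                force += FORCE_STEP                   # tighten: the object just slipped
            else:
                force = max(0.0, force - RELAX_RATE)  # relax toward minimum effort
            set_grip_force(force)
            time.sleep(0.001)                         # fast loop: microslips are brief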

In 1989, Cutkosky and graduate student Rob Howe, MS '85, PhD '91 (now a professor at Harvard), built the world's first robot to hold objects using a dynamic tactile sensing mechanism, albeit a primitive one. Known simply as "the planar manipulator," the two-fingered machine was fitted with piezoelectric transducers -- sensors that pick up the sudden, high-frequency vibrations of a microslip and transform the mechanical strain into electrical signals. Since then, Cutkosky and others in his lab (in particular, students Michael Turner, MS '96, Weston Griffin, MS '99, and Allison Okamura, MS '96, and former students Marc Tremblay, MS '92, PhD '95, and Jim Hyde, '89, PhD '95) have been working to improve on this original design. Marvin, built in 1994, is a direct descendant of that first planar manipulator.

Marvin sits on a table, looking like neither a hand nor a stereotypical robot. The device consists of two L-shaped metal towers, each about a foot high, positioned side by side. These hold the motors. Protruding from each tower is a "hand" with two jointed fingers that end in curved silicone wedges, or fingertips. The two hands work together, grasping and manipulating things by holding them between wedges, or between one wedge and a small platform directly below.

A yellow "skin" of rough-textured rubber covers the wedges that form Marvin's fingertips, giving them a stable, durable surface. The central region provides a firm grip; the edges are the slip-sensitive regions that signal the need for increased force. In addition to piezoelectric transducers, Marvin's fingertips have force sensors that detect pressure changes and perceive an object's softness or hardness. They can also be fitted with optical sensors that measure the area of contact between the object and the fingertip.
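
How might those readings combine into a judgment of softness? One plausible approach, sketched here as an assumption rather than the lab's published method: a soft object flattens against the fingertip, so its contact area grows quickly as grip force rises, while a hard object's contact patch barely changes.

    # A hypothetical sketch of judging softness from paired force and
    # contact-area readings, as Marvin's force and optical sensors might
    # supply them. The threshold is an invented placeholder.
    def estimate_softness(samples):
        """samples: (force_N, contact_area_mm2) pairs, force increasing."""
        (f0, a0), (f1, a1) = samples[0], samples[-1]
        growth = (a1 - a0) / (f1 - f0)   # mm^2 of new contact per added newton
        return "soft" if growth > 5.0 else "hard"

    print(estimate_softness([(0.5, 20.0), (2.0, 45.0)]))  # -> soft
    print(estimate_softness([(0.5, 20.0), (2.0, 22.0)]))  # -> hard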

Marvin's fingers are remarkably agile. Each finger has a repertoire of three distinct motions, each controlled precisely by its own motor. To give Marvin the ability to learn about objects through dexterous touch, Cutkosky returned once again to the study of human hands, this time with the help of two psychologists who had videotaped and classified the hand movements people use to explore textures and shapes. To make sense of the information coming through Marvin's sensors, he and his team also developed computer programs that translate the data into measurements of shapes and textures.
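
Those programs aren't reproduced here, but the underlying idea can be sketched. When a fingertip strokes a ridged surface at a known speed, the ridges excite vibrations whose dominant frequency betrays their spacing. A toy illustration, with an assumed sampling rate and a simulated signal:

    import numpy as np

    # A toy sketch of texture measurement from stroking, not the lab's
    # actual program: ridges spaced d apart, crossed at speed v, excite
    # vibrations near frequency v / d, which a Fourier transform picks out.
    SAMPLE_RATE = 2000.0   # Hz, assumed sensor sampling rate

    def ridge_spacing_mm(vibration, stroke_speed_mm_s):
        """Estimate ridge spacing from a fingertip vibration trace."""
        spectrum = np.abs(np.fft.rfft(vibration - np.mean(vibration)))
        freqs = np.fft.rfftfreq(len(vibration), d=1.0 / SAMPLE_RATE)
        peak_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
        return stroke_speed_mm_s / peak_hz

    # Simulated stroke: 1 mm ridges crossed at 50 mm/s -> 50 Hz vibration
    t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)
    print(round(ridge_spacing_mm(np.sin(2 * np.pi * 50.0 * t), 50.0), 2))  # -> 1.0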

The lab's latest creation, two-fingered Dexter, brings researchers closer to the goal of "telemanipulation," the use of remote-control robots to reach, grasp and feel things under the direction of a distant operator. Dexter, like Marvin, has sensitive fingertips capped with yellow wedges. But Dexter's distinctive red digits dangle from the end of a 4-foot, rotating mechanical arm -- the kind a dentist might swing into place before going to work on someone's mouth. The arm motion allows Dexter to reach around and pick things up, whereas Marvin can only grasp objects placed in the 5-inch space between its hands. But there's a trade-off: what Dexter gains in reach, it sacrifices in fingertip agility and precisely controlled sensory exploration.

Thus, each robot has its strength. While Marvin's fingers can deftly turn a knob -- provided it's put into the robot's limited work space -- Dexter can lift and stack blocks.

But Dexter has another distinction that suggests important future applications. The newer robot was conceived with the possibility of telemanipulation in mind, using virtual-reality technology that could someday allow robots to perform remote explorations and repair equipment in hazardous environments. From the start, Cutkosky's team wanted to be able to control Dexter's movements from afar with the CyberGlove -- an invention developed by Palo Alto-based Virtual Technologies Inc., which had its genesis in the Dexterous Manipulation Lab.

Last fall, for the first time, grad student Michael Turner hooked Dexter up to the CyberGlove and gave a demonstration. While Turner wore the lightweight, high-tech glove, Dexter echoed every movement of the handler's index finger and thumb. When another student placed building blocks in Dexter's range and Turner made the motions of picking them up, Dexter picked up the blocks. And because the glove transmits lifelike pressure sensations back to the user's hand, Turner was able to "feel" the block between his fingers -- even though it was actually between Dexter's. In fact, through such exercises, the CyberGlove has turned out to be a handy way of teaching Dexter new finger movements.
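
Under the hood, such a demonstration is a two-way loop: joint angles stream from the glove to the robot, and contact forces stream back. A schematic sketch, with all four device interfaces as hypothetical placeholders rather than the CyberGlove's real API:

    import time

    # A schematic teleoperation loop in the spirit of the Dexter/CyberGlove
    # demo. All four device functions are hypothetical placeholders.
    JOINTS = ["thumb", "index"]   # Dexter mirrors two digits

    def teleoperate(read_glove_angles, command_robot_joints,
                    read_fingertip_forces, drive_glove_feedback, cycles=10000):
        for _ in range(cycles):
            # Forward path: the operator's hand pose drives the robot's fingers.
            angles = read_glove_angles()            # e.g., {"thumb": 30.0, "index": 45.0}
            command_robot_joints({j: angles[j] for j in JOINTS})

            # Return path: the robot's contact forces are replayed on the glove,
            # so the operator "feels" the block between Dexter's fingertips.
            drive_glove_feedback(read_fingertip_forces())

            time.sleep(0.01)                        # ~100 Hz, a plausible feedback rate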

For many of us, there may be something almost eerie about Dexter's reflexive mimicry of the glove, or the way Marvin fondles a piece of plastic. We've grown comfortable with machines that wash our dishes, toss balls at us and hand out tickets at parking lots, and we've come to accept machines that perform complex calculations or beat us at chess. Are we ready now for robots to venture into that most intimate realm of human experience -- touch?

To a few technophobes, it may sound frighteningly futuristic. But for anyone who's ever had to clear a minefield or search a swamp, the answer will likely be a hands-down yes.


Marina Chicurel is a Santa Cruz-based science writer with a PhD in neurobiology from Harvard.
