Getting in Touch with Touch
Next time you open a door and close it behind you, try this quick test. While the door is open, put your hand on the doorknob. Close your eyes. Now, gently push the door shut.
Were you able to do it? If so, good job!
You might have found this test easy, but it actually posed quite a few challenges. Without looking, you had to keep track of how far the door was open. You had to know how hard to push to close it without accidentally slamming it. And you also had to know when to stop moving your hand once you’d finished closing the door.
How does it feel?
“Imagine how hard it would be to interact with the world without a sense of touch,” says mechanical engineer Allison Okamura, director of the Haptic Exploration Laboratory at Johns Hopkins University in Baltimore.
Haptics is the study of the sense of touch. And touch, it turns out, is incredibly complicated.
The way something feels depends on its weight, texture, and temperature. When you pick something up, it feels different than it does if you simply push it or rub your fingers along its surface. Sensory cells throughout your body send messages relating to touch to your brain, and your brain knows how to interpret those messages.
A better understanding of the sense of touch could lead to major advances in medicine, space exploration, robotics, and even video games.
Someday, for example, people on Earth might be able to control space probes that are millions of miles away by using robotic tools that allow them to actually feel what they're doing. That could help them conduct delicate experiments or perform repairs.
And while game systems such as Nintendo’s Wii already use some touch technology (for example, to make you feel as if you’re playing baseball or fighting with a sword), future systems might be even more realistic.
The healing touch
Visual systems that allow people to work remotely already exist. Without ever putting their hands inside a patient, surgeons can use robotic devices to remove tumors or repair wounds deep in the body. Cameras mounted on the robotic tools let the doctors watch on video screens as they maneuver inside a patient's body.
But these visual systems have flaws, Okamura says. For one thing, doctors can’t always tell if they’re pulling too hard or not pushing hard enough. In other words, they lack the sense of touch.
To fill that gap, Okamura and her colleagues are adding a sense of touch to a robotic surgical device called the da Vinci system. They've developed sensors that measure how much pressure the robot is applying to the body. Red circles and other visual cues then appear on a computer monitor to tell the user whether that level of pressure is too little or too much.
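The logic behind those on-screen cues can be sketched in a few lines of Python. Everything here is illustrative: the function name and the pressure limits are invented for this example, not taken from the actual da Vinci system.

```python
def pressure_cue(pressure_newtons: float) -> str:
    """Map a measured tool pressure to a color cue on the surgeon's monitor."""
    TOO_LIGHT = 0.5  # below this, the tool may not be gripping (assumed value)
    TOO_HARD = 3.0   # above this, tissue could be damaged (assumed value)
    if pressure_newtons < TOO_LIGHT:
        return "yellow"  # not pushing or pulling hard enough
    elif pressure_newtons > TOO_HARD:
        return "red"     # pressing or pulling too hard
    return "green"       # pressure in the safe range
```

The real system must make this judgment continuously, many times per second, as the tool moves.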
The researchers have also developed a physical-feedback system for the device. It lets the user not only see but also feel how much pressure is being applied.
Haptics researchers are making other strides by simulating how much strength it would require to cut through various materials.
In the Johns Hopkins lab, for example, wires connect a pair of scissors to a computer. The computer screen displays a black line and a Pac-Man-like character. Opening the scissors makes the character's mouth open. Closing them makes it shut its mouth and snip the line.
As the user “cuts,” the computer sends resistance feedback to the scissors, making it look and feel like the scissors are actually cutting the line. The effect is surprisingly realistic. The thicker the line looks on the screen, the harder it feels to close the scissors.
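One simple way to model that feedback is to make the resisting force grow with the thickness of the virtual line, and to apply it only while the blades are closing. This is a minimal sketch under that assumption; the function, gain, and units are hypothetical, not the lab's actual model.

```python
def scissor_resistance(line_thickness_mm: float, closing_speed: float,
                       gain: float = 0.8) -> float:
    """Resistance force sent back to the scissor handles.

    A thicker virtual line means more force is needed to close the
    blades, mimicking the feel of cutting thicker material.
    """
    if closing_speed <= 0:
        return 0.0  # no resistance while the scissors are opening or still
    return gain * line_thickness_mm * closing_speed
```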
A surgeon who used such a device to operate a remote-controlled surgical tool would be able to feel the difference in thickness between different muscles and tendons. This might help her treat a patient or figure out where the person is injured.
Using the same idea, Robert Webster, a graduate student in mechanical engineering at Johns Hopkins, has developed a tool that challenges users to move a virtual piece of paper across a computer screen.
“If you do this on a desk, you’ll notice that if you push it too lightly, your finger slides across the paper,” Webster says. You have to push just hard enough to make the paper move.
Webster’s device includes a ball connected to two motors. You put your finger on the ball. As you move your finger, the computer senses the direction of your push and notes how hard you’re pushing.
If you push too lightly, the motors spin the ball. Your fingertip feels as if it’s sliding, just as it would if it glided across the surface of a real piece of paper.
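The decision the device makes can be sketched as a simple rule: if the downward push is too light for friction to carry the virtual paper along, spin the ball so the fingertip feels itself sliding. The threshold and gain below are invented for illustration; the real device's model is surely more detailed.

```python
def ball_slip_speed(normal_force: float, drag_threshold: float = 2.0,
                    gain: float = 5.0) -> float:
    """How fast the motors spin the ball under the fingertip.

    Below drag_threshold (an assumed value), friction can't move the
    virtual paper, so the ball spins to simulate the finger sliding.
    The lighter the push, the faster the simulated slip.
    """
    if normal_force >= drag_threshold:
        return 0.0  # pushing hard enough: the paper moves, no slip
    return gain * (drag_threshold - normal_force)
```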
By sensing small changes in how forcefully something is pushed, this type of device could eventually give robots the ability to handle delicate objects, Webster says.
For example, you know that you’ll break an egg if you hold it too tightly, but you’ll drop it if your grip is too light. Teaching a robot to use just enough pressure and not too much is a challenge.
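In control terms, holding the egg means keeping the grip force inside a narrow window: above the force needed to hold it, below the force that would crack it. A minimal sketch of that idea, with made-up force limits:

```python
def grip_force(requested: float, min_hold: float = 1.0,
               max_safe: float = 5.0) -> float:
    """Clamp a robot's grip on a fragile object into the window where
    it neither drops nor crushes it. Both limits are invented values;
    a real robot would have to sense them for each object it handles.
    """
    return max(min_hold, min(requested, max_safe))
```

The hard part, of course, is not the clamping but knowing where that window lies for an object the robot has never touched before.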
Touching the future
For now, one of the limits of touch technology is how quickly computers can process data. To accurately mimic how the world feels to us, computer programs would have to constantly update information based on our tiniest movements. Right now, they’re not fast enough to do this.
There’s also a lot to learn about touch itself. Researchers are still trying to get a better sense of how things actually feel to us. Touch, though very sensitive, is not quite perfect. To our fingertips, for example, two small pins will feel like one pin if they are spaced 1 millimeter apart or less.
As technologies improve, however, scientists envision prosthetic limbs that give users a sense of touch, robotic space probes that let people on Earth feel as if they are touching the surfaces of other planets, and video games that make players feel they're truly fighting enemies or racing cars.
Someday, you might be able to hug your grandparents from hundreds of miles away, and you'll be able to feel them hug you right back.