Touch the Future, Part II
By combining the human sense of touch with a computer interface, typically a controller similar to a joystick, haptic technology promises to open up virtual learning venues as never before.
A previous article in EdTech examined how a research professor at Carnegie Mellon University in Pittsburgh used haptic interfaces that allowed students to physically feel forces generated by a computer-modeled spring.
Below are profiles of innovative haptic applications at Rice University in Houston and Johns Hopkins University in Baltimore.
Rice University
Researchers at Rice are exploring how haptic devices can help with physical therapy and training to enhance human performance, says Marcia O’Malley, assistant professor of mechanical engineering.
Her research team is currently working with stroke patients who are using haptic devices to regain motor coordination in their upper body.
“They use a computer, which provides visual feedback, and a joystick, which provides helpful feedback as they try to reach for objects,” she says. “It acts as training wheels. It helps them feel the sensation of reaching or lifting before they actually do it.”
Researchers are also studying how haptic devices can speed up the acquisition of skills needed to perform tasks that require hand-eye coordination.
Until now, O’Malley says, most virtual environments have relied on visual and auditory feedback. Research shows that when a third layer — the sense of touch — is added, performance is enhanced. But will haptic technology have the same effect on training?
Consider people who are learning to drive a tractor-trailer. If they train for a significant period of time in a virtual environment that offers haptic feedback, does their performance improve?
That is the big question. Because the field of haptics is relatively new, a link with performance has yet to be proved. But O’Malley says there is already considerable evidence that hands-on interaction is beneficial, especially for students who are kinesthetic learners.
O’Malley is confident that haptics will ultimately change everyday life. Imagine feeling the fabric of clothes before purchasing them online or shaking hands with people while attending a virtual meeting. Manufacturers of cell phones are already incorporating haptics: While typing on cell phones, users will soon be able to feel the buttons press, click and push back.
Johns Hopkins University
Engineers at Johns Hopkins are working with the university’s medical school to develop haptic technology that helps doctors perform surgery and trains would-be surgeons.
Although surgical robotic devices are helpful, their downside is that surgeons lose the feel of interacting directly with human tissue, explains Allison Okamura, an associate professor of mechanical engineering at Hopkins. Her research team is developing robotic and haptic devices that provide doctors with the sensation of holding surgical instruments.
During virtual surgical training, Okamura says, the difference between a haptics-enabled exercise and one without haptics is similar to the difference between a flight simulator and a “How to Fly” video. While both offer visual and audio feedback, the flight simulator enhances learning because students feel as if they’re flying a real plane. They shake and move, feeling the effects of every button they push or decision they make. The same would hold true with haptic devices for training surgeons.
Okamura’s research team is also exploring how haptics can be used to teach elementary school students. She has brought haptic devices to local elementary schools to let students experience what it would feel like to bounce a virtual ball on the Earth, moon, Jupiter or Mars. She says haptics enables students to feel things they can’t see, such as the pull of gravity.