In a laboratory at the University of Utah, Doug’s virtual left hand reached out and touched the virtual door for just a second before he quickly pulled his hand back. A few moments later, he extended his virtual fingers again and ran them down the simulated wood grain of the door’s surface.
“I just felt that door,” Doug said with a gasp. “That is so cool.”
Doug is one of the first people ever to tangibly interact with a virtual world, using his mind to guide his avatar in a virtual space, and feeling that virtual embodiment’s simulated contact directly in his own brain. For Doug, it was as if his own flesh-and-blood hand had touched a real wooden door.
More amazing still, Doug is missing his left hand and part of his left arm because of a long-ago accident. A quarter-century after that accident, as a volunteer research participant in DARPA’s Hand Proprioception and Touch Interfaces program – nicknamed HAPTIX – Doug felt real touch sensations.
Doug is one of the first testers of an implanted peripheral nerve interface. When he imagines moving his missing arm and hand, signals from his brain travel down to the peripheral nerves and muscles of his residual limb. The interface reads those motion-planning signals and, through algorithms that “learn” the mapping, translates them into motion commands that a computer relays to a virtual hand.
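The article does not spell out those algorithms. Purely as an illustrative sketch, the kind of “learned” translation described above can be imagined as a regression model calibrated on trials where the intended motion is known. Everything in the snippet below (the ridge-regression decoder, the feature and joint counts, the synthetic calibration data) is an assumption made for illustration, not a description of the HAPTIX system.

```python
# Illustrative only: a minimal "learned" decoder in the spirit described above.
# It maps windowed features of recorded nerve/muscle signals to hand-joint
# velocity commands. The ridge-regression model, feature sizes, and calibration
# procedure are assumptions, not the HAPTIX implementation.
import numpy as np

class LinearMotionDecoder:
    def __init__(self, ridge=1.0):
        self.ridge = ridge      # regularization strength
        self.W = None           # learned mapping: features -> joint velocities

    def fit(self, X, Y):
        """Calibrate on paired data: X (trials x features) are signal features
        recorded while the user imagines movements; Y (trials x joints) are the
        intended joint velocities for those trials."""
        n_feat = X.shape[1]
        A = X.T @ X + self.ridge * np.eye(n_feat)
        self.W = np.linalg.solve(A, X.T @ Y)
        return self

    def decode(self, x):
        """Translate one feature vector into joint-velocity commands that a
        computer could relay to a virtual (or physical) hand."""
        return x @ self.W

# Toy usage with synthetic data standing in for recorded signals.
rng = np.random.default_rng(0)
X_cal = rng.standard_normal((200, 16))       # 16 signal features per window
true_map = rng.standard_normal((16, 5))      # 5 hand-joint velocities
Y_cal = X_cal @ true_map + 0.1 * rng.standard_normal((200, 5))

decoder = LinearMotionDecoder(ridge=0.5).fit(X_cal, Y_cal)
command = decoder.decode(rng.standard_normal(16))   # one decoded motion command
```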
This interface goes beyond previous one-way systems that enabled Doug to move a virtual or even a physical prosthetic limb by thought alone but offered no tactile feedback in return. The HAPTIX program closed the loop, making it possible for users of upper-limb prostheses to regain the tactile sensations of pressure and texture. Researchers in the HAPTIX program are now refining the user experience by providing nuanced sensations. The research could prove transformational for amputees, and, as the system improves, it might also provide opportunities for able-bodied people to engage with machines and virtual spaces in unprecedented ways.
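Closing the loop implies a return path as well: readings from sensors on the prosthesis have to be turned into stimulation that the nerves can interpret as pressure or texture. The sketch below is a hypothetical version of that encoding step; the channel layout, parameter ranges, and scaling are invented for illustration and are not drawn from the program.

```python
# Hypothetical return path for the closed loop: convert a fingertip pressure
# reading into nerve-stimulation pulse parameters. All constants and ranges
# here are illustrative assumptions, not HAPTIX specifications.
from dataclasses import dataclass

@dataclass
class StimulationCommand:
    channel: int            # which electrode contact to drive
    amplitude_ma: float     # pulse amplitude in milliamps
    frequency_hz: float     # pulse rate; a higher rate reads as stronger contact

def encode_pressure(channel: int, pressure_n: float,
                    max_pressure_n: float = 10.0) -> StimulationCommand:
    """Map a sensed pressure (newtons) onto a bounded stimulation command."""
    level = min(max(pressure_n / max_pressure_n, 0.0), 1.0)   # normalize to 0..1
    return StimulationCommand(
        channel=channel,
        amplitude_ma=0.2 + 1.8 * level,     # stay within an assumed safe range
        frequency_hz=20.0 + 180.0 * level,  # scale pulse rate with contact force
    )

# Example: a light touch on the index-finger sensor.
print(encode_pressure(channel=2, pressure_n=1.5))
```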
Just how important is touch? Imagine trying to use a prosthetic hand to handle a delicate lightbulb, applying enough pressure and finesse to screw it into a socket, but not so much as to shatter the bulb. Without touch feedback, a prosthetic limb user has to rely solely on visual information, which is a poor way of gauging force, to interact with an external object. With haptic feedback, someone like Doug can manipulate objects and move more confidently, and could theoretically complete tasks in the dark or with his eyes closed.
A man named Nathan proved that point. Nathan was paralyzed in an automobile accident in 2004. Since then, he has lived with impaired signaling between his brain and peripheral nervous system, a deficit that precludes candidacy for a peripheral nerve interface. Twelve years after his injury, Nathan volunteered to be implanted with a direct interface to his central nervous system. He now contributes to DARPA’s Revolutionizing Prosthetics program, helping researchers test technology for complex sensorimotor control of prosthetic limbs and other devices.
With electrodes on his motor and somatosensory cortices – the areas of the brain that control movement and touch sensation – Nathan can control a prosthetic arm using his thoughts alone, and he can feel what the arm touches via signals from sensors embedded on the prosthetic fingers. The technology is currently precise enough that Nathan can distinguish contacts with individual fingers. During experiments in 2016 at the University of Pittsburgh Medical Center, a blindfolded Nathan correctly identified which fingers on a prosthetic arm were being pressed by a researcher, and he could even tell when the researcher engaged two fingers simultaneously.
“Sometimes it feels electrical, and sometimes it’s pressure, but for the most part, I can tell most of the fingers with definite precision,” Nathan told a Washington Post reporter for an Oct. 13, 2016 article. “It feels like my fingers are getting touched or pushed.”
The concept of “man-computer interaction” began circulating around DARPA in 1960, when computer scientist J.C.R. Licklider published his vision of how computers could one day augment human abilities. As director of the agency’s Information Processing Techniques Office, Licklider shepherded development of a suite of technologies for visualizing, processing, sharing, and interacting with information. Based on the notion that humans and computers could have a symbiotic relationship that produced a whole greater than the sum of its parts, these technologies ultimately formed the foundations of today’s internet and personal computing experience.
By the 1970s, Licklider’s vision inspired DARPA’s first research into human-machine interactions facilitated by direct neural interfaces. An early set of experiments explored how well noninvasive sensors could measure responses to sensory stimuli experienced while performing tasks. At the time, the enabling technology for meaningfully interacting with the brain did not yet exist, and so the research results were marginal. But that situation began to change by the late 1990s with the accumulation of advances in information systems, materials science, and sensors for studying brain structure and function at a new level of detail.
By the early 2000s, DARPA began investing heavily in neurotechnology. The agency established the Brain-Machine Interfaces program to record patterns of neural activity in animal models and decode the neural states associated with memory formation, sensory perception, and motor intent.