For years now, brain-computer interfaces (BCIs) have incrementally advanced, giving people with spinal injuries or lost limbs the ability to control prosthetics and computer cursors using their brain signals. But even though the tech has made strides, replicating the subtle, nuanced sensations of touch has remained just out of reach. Now, however, a team of researchers from the Cortical Bionics Research Group believes it has made a major breakthrough. A pair of patients with implanted BCIs were able to control a bionic arm and “feel” tactile edges, shapes, and curvatures along its fingers. The researchers’ findings were published today in the journal Science.
Researchers spent years working with two patients who had suffered spinal cord injuries that left them unable to control their limbs. For the study, the participants had BCIs implanted in the sensory and motor regions of their brains responsible for hand and arm movement. The researchers were able to record and then decode the patterns of electrical activity associated with the patients’ hand movements. The patients were then asked to engage in a series of complex experiments controlling a nearby bionic arm to see if they could distinguish subtle changes occurring across its surface. They reported sensations like edges, curves, and motion moving across the hand. All of this was made possible by a new method the researchers developed for encoding natural touch sensations.
Giacomo Valle, an assistant professor at Chalmers University of Technology in Sweden and the study’s lead author, said this work “went beyond anything that has been done before in the field of brain-computer interfaces.”
“We are in another level of artificial touch now,” Valle said in a statement. “We think this richness is crucial for achieving the level of dexterity, manipulation, and a highly dimensional tactile experience typical of the human hand.”
How a BCI can measure subtle touch
First conceived in theory by UCLA computer scientist Jacques Vidal in 1973, brain-computer interfaces work by using electrodes to capture a person’s brain signals, analyze them, and translate them into a form a computer can use as input. In practice, this means having electrodes surgically implanted underneath a patient’s skull to directly measure the activity of brain neurons. Such implants have already been used successfully to help patients control prosthetic limbs and even play video games. Replicating the more subtle sensations of texture or directional motion through a BCI, however, has proved challenging and requires even more advanced engineering.
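The study itself only describes this decoding step at a high level, but the core idea of turning recorded neural activity into movement commands can be pictured as a simple linear decoder. The sketch below is purely illustrative, not the team’s actual method: the channel count, the synthetic “firing rate” data, and the ridge-regression fit are all assumptions standing in for a real recording session.

```python
# Illustrative sketch only -- NOT the study's actual decoder.
# Assumes neural activity arrives as per-channel spike counts ("firing rates")
# and that hand velocity is roughly a linear function of those rates,
# a common simplification in BCI tutorials.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 96   # assumed electrode count (e.g., one Utah-style array)
n_samples = 2000  # assumed number of 20 ms bins of training data

# Synthetic training data standing in for recorded sessions:
# X = binned firing rates, Y = measured (or intended) 2D hand velocity.
true_weights = rng.normal(size=(n_channels, 2))
X = rng.poisson(lam=5.0, size=(n_samples, n_channels)).astype(float)
Y = X @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

# Fit a ridge-regularized linear decoder: velocity ~ X @ W.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

# At run time, each new bin of firing rates becomes a velocity command.
new_rates = rng.poisson(lam=5.0, size=(1, n_channels)).astype(float)
velocity_command = new_rates @ W
print("decoded velocity:", velocity_command)
```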
This particular study went a step further. Though basic motor controls like moving an artificial limb can be achieved by decoding a patient’s brain signals, making them feel a sensation requires the opposite process. Valle told Popular Science his team had to encode signals associated with touch sensations and then send them to the patient’s brain. This is more complicated than it might sound. Sensations of touch are specific and require high levels of precision, and sending even slightly inaccurate signals to a patient’s brain would produce jumbled, confusing sensations. Making BCIs work is already difficult, but encoding sensations adds a whole new layer of complexity.
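The article doesn’t detail the team’s encoding scheme, but the reverse direction can be pictured as a mapping from touch events on the bionic hand to stimulation parameters for electrodes over the matching patches of the brain’s touch-processing region. Everything in the toy sketch below, from the sensor-patch names to the amplitude ranges, is an illustrative assumption rather than the researchers’ encoding model.

```python
# Toy encoder sketch -- NOT the researchers' actual encoding model.
# Assumption: each sensor patch on the bionic hand is paired with one
# stimulation electrode, and contact force scales stimulation amplitude
# within a clamped, assumed-safe range.
from dataclasses import dataclass

@dataclass
class StimCommand:
    electrode: int        # which electrode to drive
    amplitude_ua: float   # pulse amplitude in microamps (assumed range)
    frequency_hz: float   # pulse train frequency (assumed constant here)

# Hypothetical sensor-patch -> electrode map for a few fingertip regions.
PATCH_TO_ELECTRODE = {"thumb_tip": 3, "index_tip": 7, "index_edge": 8}

def encode_touch(patch: str, force_n: float) -> StimCommand:
    """Map one touch event on the bionic hand to a stimulation command."""
    electrode = PATCH_TO_ELECTRODE[patch]
    # Scale force (assumed 0-5 N) into an assumed 10-80 uA window,
    # clamped so a noisy sensor reading can't exceed the ceiling.
    amplitude = min(80.0, max(10.0, 10.0 + 14.0 * force_n))
    return StimCommand(electrode=electrode, amplitude_ua=amplitude,
                       frequency_hz=100.0)

# An "edge" percept might come from stimulating adjacent patches in sequence.
for patch in ("index_tip", "index_edge"):
    print(encode_touch(patch, force_n=2.0))
```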
“We had to send a message to the brain that speaks the language of the brain,” Valle said.
Experiments provide an early glimpse into the future
During the experiments, the two patients equipped with BCIs were sent a series of different touch-sensation messages and asked to describe what they felt. At first, the patients reported feeling edges, like the edge of a table. Once that was successful, they were presented with slightly more complex shapes and curved letters. Eventually, they worked their way up to full 3D shapes and could feel a sense of movement running across the surface of the robotic hand and fingers. Valle said he and his team were optimistic the test would work, but he was somewhat surprised by just how effective the process was given the relatively limited number of “channels” available to select and analyze brain signals.
Valle acknowledged there is still quite a ways to go before technology like this could be used outside of the lab to fully restore someone’s sense of touch. To do that, someone would likely need to create a type of digital skin that would attach to a robotic limb and rapidly collect and interpret data about the world. A highly advanced BCI would then need to take all of that data and send it in a readable message to a patient’s brain. All of this would need to happen essentially instantaneously.
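That “essentially instantaneously” requirement amounts to a closed sensing-to-stimulation loop with a strict timing budget. The sketch below illustrates only that constraint: the functions are hypothetical placeholders (no real device exposes this API), and the ~20 ms budget is an assumption about what would feel immediate.

```python
# Sketch of the closed-loop timing constraint -- purely illustrative.
# read_skin_sensors(), encode(), and stimulate() are hypothetical
# placeholders, not a real device API.
import time

LOOP_BUDGET_S = 0.020  # assumed ~20 ms budget for touch to feel immediate

def read_skin_sensors():
    return {"index_tip": 1.5}       # placeholder: patch -> force (N)

def encode(reading):
    return [("electrode_7", 31.0)]  # placeholder stimulation commands

def stimulate(commands):
    pass                            # placeholder: drive the implant

def sensory_loop(n_iterations=5):
    for _ in range(n_iterations):
        start = time.perf_counter()
        stimulate(encode(read_skin_sensors()))
        elapsed = time.perf_counter() - start
        if elapsed > LOOP_BUDGET_S:
            print(f"budget exceeded: {elapsed * 1e3:.1f} ms")
        # Sleep out the remainder of the cycle for a fixed update rate.
        time.sleep(max(0.0, LOOP_BUDGET_S - elapsed))

sensory_loop()
```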
Still, there’s cause for optimism. Researchers have made strides in so-called “E-skin” development and embodied prosthetics in recent years. Valle said well-funded commercial BCI ventures like Elon Musk’s Neuralink and Synchron are also bringing much-needed attention and investment to the area. Neuralink’s first human patient has reportedly been able to use his BCI to play online video games and chess, and the company successfully implanted a device in a second patient back in August. While most of the real cutting-edge advancements in the field are being pushed forward by research teams at universities, it’s well-funded companies like Neuralink that might ultimately be needed to make this tech usable outside a laboratory environment.
Whenever that happens, Valle said the achievement will be due first and foremost to the sacrifices made by the participants his team is working with. These early adopters, who Valle referred to as “pioneers,” reportedly worked up to three hours per session recording and analyzing sensory data. In all likelihood, they will never fully benefit from the technology’s ambitious potential. Instead, they are laying the groundwork for those who will pick up where they leave off.
“Without them, none of this is possible,” Valle said.