Scientists at Elon Musk-backed Neuralink gave a progress update during a conference streamed online from the company’s headquarters in Fremont, California. This came just over a year after Neuralink, which was founded in 2016 with the goal of creating brain-machine interfaces, revealed its vision, software, and implantable hardware platform. Little of what was discussed today was surprising, but it provided assurances the pandemic hasn’t prevented Neuralink from moving toward its goals.
Readings from a pig’s brain were shown onscreen in a live demo. When the pig touched an object with its snout, activity from neurons recorded by Neuralink’s implant (which had been embedded in the pig’s brain two months prior) appeared as firing spikes in a visualization on a television monitor. That isn’t novel in and of itself — Kernel and Paradromics are among the many outfits developing brain-reading neural chips — but Neuralink uniquely leverages flexible cellophane-like conductive wires inserted into tissue using a “sewing machine” surgical robot. Musk says Neuralink received a Breakthrough Device designation in July and is working with the U.S. Food and Drug Administration (FDA) on a future clinical trial for people with quadriplegia.
Neuralink founding team members Tim Hanson and Philip Sabes, who both hail from the University of California, San Francisco, pioneered the technology with University of California, Berkeley professor Michel Maharbiz. Musk calls the version demonstrated today “V2,” and it represents an improvement over what was shown last year. He’s confident it will someday be possible to embed it within a human brain in under an hour without using general anesthesia. He also says it will be easy to remove and leave no lasting damage, should a patient wish to upgrade or discard the interface.
Neuralink collaborated with Woke Studios, a creative design consultancy based in San Francisco, to design the plastic casing (but not the technical components) of the robot sewing machine. The machine employs optical coherence tomography for real-time brain tracking and five axes of motion to access implant sites around a patient’s head, as well as a 150-micron gripper for grasping and releasing threads using a 40-micron needle.
Woke began working with Neuralink over a year ago on a behind-the-ear concept Neuralink presented in 2019, and the two companies reengaged shortly afterward for the surgical robot.
“The design process was a close collaboration between our design team at Woke Studios, the technologists at Neuralink, and prestigious surgical consultants who could advise on the procedure itself,” Woke head designer Afshin Mehin told VentureBeat via email. “Our role specifically was to take the existing technology that can perform the procedure and hold that against the advice from our medical advisors, as well as medical standards for this type of equipment, in order to create a nonintimidating robot that could perform the brain implantation.”
The surgery consists of three parts: opening, inserting, and closing. A neurosurgeon takes care of opening, which involves creating an incision in the skin and removing a small piece of skull and any nearby dura membrane. Then the robot uses its cameras and sensors to insert the wires (or threads, as Neuralink calls them) into the brain while avoiding vasculature, to a depth of up to six millimeters. (Although the robot can physically insert deeper, doing so has not been tested, a Neuralink spokesperson told VentureBeat via email.) Finally, the surgeon secures the implant in place of the removed piece of skull and closes the incision.
The wires — which measure a quarter of the diameter of a human hair (4 to 6 μm) — link to a series of electrodes at different locations and depths. At maximum capacity, the machine can insert six threads (192 electrodes in total) per minute.
A single-use bag attaches with magnets around the machine’s head to maintain sterility and allow for cleaning, and a table attached to a mechanical fixture ensures a patient’s skull remains in place during insertion. The machine’s “body” attaches to a base, which provides weighted support for the entire structure, concealing the other technologies that enable the system to operate.
Mehin danced around the question of whether the prototype would ever make its way to clinics or hospitals, but he noted that the design was intended for “broadscale” use. “As engineers, we know what’s possible and how to communicate the design needs in an understandable way. And likewise, Neuralink’s team is able to send over highly complex schematics that we can run with,” he said. “We imagine this is a design that could live outside of a laboratory and into any number of clinical settings.”
As Neuralink detailed last year, its first in-brain interface designed for trials — the N1, alternatively referred to as the “Link 0.9” — contains an ASIC, a thin film, and a hermetic substrate that can interface with upwards of 1,024 electrodes. Up to 10 N1/Link interfaces can be placed in a single brain hemisphere, optimally at least four in the brain’s motor areas and one in a somatic sensory area.
Musk says the interface is dramatically simplified compared with the concept shown in 2019. It no longer has to sit behind the ear, it’s now the size of a large coin (23 millimeters wide and 8 millimeters thick), and all the wiring the electrodes need connects within a centimeter of the device itself.
During the demo, the pig with the implant — named Gertrude — playfully nuzzled her handlers in a pen adjacent to pens containing two other pigs, one of which had the chip installed and later removed. (The third pig served as a control and hadn’t had a chip implanted.) Pigs have a dura membrane and skull structure similar to that of humans, Musk explained, and they can be trained to walk on treadmills and perform other activities useful in experiments. This is why Neuralink chose them as the third species to receive its implants, after mice and monkeys.
Neuralink’s prototype can extract real-time information from many neurons at once, Musk reiterated during the stream. The electrodes relay detected neural pulses to a processor that is able to read information from up to thousands of channels, roughly 15 times better than current systems embedded in humans. It meets the baseline for scientific research and medical applications and is potentially superior to Belgian rival Imec’s Neuropixels technology, which can gather data from thousands of separate brain cells at once. Musk says Neuralink’s commercial system could include as many as 3,072 electrodes per array across 96 threads.
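Taken together with the insertion rate mentioned earlier (six threads, or 192 electrodes, per minute), these figures suggest thread insertion alone would fit well within Musk's stated one-hour surgery target. A quick sketch, treating the article's numbers as given:

```python
# Rough implant-time estimate from figures quoted in this article.
# Assumes the robot sustains its stated maximum insertion rate.
ELECTRODES_PER_MINUTE = 192            # six threads of 32 electrodes each
N1_ELECTRODES = 1_024                  # electrodes per N1/Link interface
ARRAY_ELECTRODES = 3_072               # commercial array across 96 threads

n1_minutes = N1_ELECTRODES / ELECTRODES_PER_MINUTE        # ~5.3 minutes
array_minutes = ARRAY_ELECTRODES / ELECTRODES_PER_MINUTE  # 16.0 minutes

print(f"One N1/Link (1,024 electrodes): {n1_minutes:.1f} minutes")
print(f"Full 3,072-electrode array:     {array_minutes:.0f} minutes")
```

This covers only the robotic thread insertion, not the opening and closing steps handled by the surgeon.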
The interface contains inertial measurement sensors, pressure and temperature sensors, and a battery that lasts “all day” and inductively charges, along with analog pixels that amplify and filter neural signals before they’re converted into digital bits. (Neuralink asserts the analog pixels are at least 5 times smaller than the known state of the art.) One analog pixel can capture neural signals at 20,000 samples per second with 10 bits of resolution, which works out to roughly 200 kilobits per second per channel, or about 200Mbps of neural data across all 1,024 recorded channels.
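The per-channel and aggregate rates follow directly from the sample rate, bit depth, and channel count. A back-of-the-envelope check, taking the article's figures as assumptions:

```python
# Sanity check of the quoted data rates, using the article's figures.
SAMPLE_RATE_HZ = 20_000   # samples per second per analog pixel
BITS_PER_SAMPLE = 10      # quoted resolution
CHANNELS = 1_024          # recorded channels per interface

per_channel_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE    # 200,000 bps (200 kbps)
total_mbps = per_channel_bps * CHANNELS / 1_000_000   # ~204.8 Mbps aggregate

print(f"Per channel: {per_channel_bps // 1000} kbps")
print(f"Aggregate over {CHANNELS} channels: {total_mbps:.1f} Mbps")
```

So the oft-quoted ~200 Mbps figure describes the interface's total raw output, not any single channel.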
Once the signals are amplified, they’re converted and digitized by on-chip analog-to-digital converters that directly characterize the shape of neuron pulses. According to Neuralink, it takes the N1/Link only 900 nanoseconds to compute incoming neural data.
The N1/Link will pair wirelessly through the skin via Bluetooth to a smartphone up to 10 meters away. Neuralink claims the implants will eventually be configurable through an app and that patients might be able to control buttons and redirect outputs from the phone to a computer keyboard or mouse. In a prerecorded video played at today’s conference, the N1/Link was shown feeding signals to an algorithm that predicted the positions of all of a pig’s limbs with “high accuracy.”
One of Neuralink’s loftier goals is to allow a tetraplegic to type at 40 words per minute. Eventually, Musk hopes Neuralink’s system will be used to create what he describes as a “digital super-intelligent [cognitive] layer” that enables humans to “merge” with artificially intelligent software. Millions of neurons could be influenced or written to with a single N1/Link sensor, he says.
High-resolution brain-machine interfaces (BMIs) are predictably complicated — they must be able to read neural activity to pick out which groups of neurons are performing which tasks. Implanted electrodes are well suited to this, but hardware limitations have historically caused them to come into contact with more than one region of the brain or produce interfering scar tissue.
That has changed with the advent of fine biocompatible electrodes, which limit scarring and can target cell clusters with precision (though questions around durability remain). What hasn’t changed is a lack of understanding about certain neural processes.
Neural activity is rarely confined to a single brain region, such as the prefrontal cortex or hippocampus. Instead, it takes place across multiple regions at once, making it difficult to pin down. Then there’s the matter of translating neural electrical impulses into machine-readable information — researchers have yet to crack the brain’s encoding. Pulses from the visual center aren’t like those produced when formulating speech, and it is sometimes difficult to identify signals’ origination points.
Neuralink will also need to convince regulators to approve its device for clinical trials. Brain-computer interfaces are considered medical devices requiring approval from the FDA, which can be time-consuming and costly to obtain.
Perhaps anticipating this, Neuralink has expressed interest in opening its own animal testing facility in San Francisco (though a Neuralink spokesperson says the reports “aren’t correct”), and the company last month published a job listing for candidates with experience in phones and wearables. In 2019, Neuralink claimed it performed 19 surgeries on animals and successfully placed wires about 87% of the time.
The road ahead
These hurdles haven’t discouraged Neuralink, which has over 90 employees and has received $158 million in funding, including at least $100 million from Musk. However, the challenges may have been exacerbated by what STAT News described as a “chaotic internal culture.” Responding to STAT, a Neuralink spokesperson said many of STAT’s findings were “either partially or completely false.”
While Neuralink expects that inserting the electrodes will initially require drilling holes through the skull, it hopes to soon use a laser to pierce bone with a series of small holes, which might lay the groundwork for research into alleviating conditions like Parkinson’s and epilepsy and help physically disabled patients hear, speak, move, and see.
That’s less far-fetched than it might sound. Columbia University neuroscientists have successfully translated brain waves into recognizable speech. A team at the University of California, San Francisco built a virtual vocal tract capable of simulating human verbalization by tapping into the brain. In 2016, a brain implant allowed an amputee to move the individual fingers of a prosthetic hand with their thoughts. And experimental interfaces have allowed monkeys to control wheelchairs and type at 12 words a minute using only their minds.
“I think at launch, the technology is probably going to be … quite expensive. But the price will very rapidly drop,” Musk said. “Inclusive of surgery … we want to get the price down to a few thousand dollars, something like that. It should be possible to get it similar to Lasik [eye surgery].”