Ctrl-labs, a New York-based startup developing a wristband that translates musculoneural signals into machine-interpretable commands, today announced that it’s acquired patents associated with Myo, a similar wearable created by North (formerly Thalmic Labs) that enables control of robotics and PCs via gestures and motion.

According to Ctrl-labs chief strategy officer Josh Duyan, the patents in question cover applications focused on electromyography (EMG) — that is, hardware that measures changes in electrical potential caused by brain-to-muscle impulses — in addition to software and human-computer interaction inventions. He says they’ll bolster Ctrl-labs’ developer tools and lay the groundwork for a surface EMG control industry standard ahead of the expanded availability of its developer kit.

“We’re excited to acquire assets that drive our mission and to integrate the Myo IP pool with our developer offering,” added Duyan. “We look forward to building on these inventions and supporting the developer community that got started with the Myo band. This is an important step toward the future of universal control and our unwavering commitment to novel neural interface technology.”

For North’s part, the company says it’ll continue to provide customer support to current Myo owners. “We’re incredibly proud of Myo, all the people that helped us bring it to life, and those who built amazing applications on top of it,” said North cofounder and CEO Stephen Lake. “I’m glad to see the IP go to a great new home with Ctrl-labs. We’re excited to see the work live on in a new form.”

Ontario-based North, which has raised nearly $200 million in venture capital to date from backers like Intel Capital and Amazon’s Alexa Fund, said in October that it would cease sales of the $200 Myo as it pivoted to its next product. Despite promising use cases like telesurgical operations and a celebrity endorsement from DJ Armin van Buuren, the market growth North anticipated never materialized.

As for Ctrl-labs’ wearable — Ctrl-kit — it currently comprises two parts: an enclosure roughly the size of a large watch that’s packed with wireless radios, and a tethered component with electrodes that sits further up the arm. The accompanying SDK ships with built-out JavaScript and TypeScript toolchains and prebuilt demos, and programming is largely done through WebSockets.
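Since programming against the SDK is largely done through WebSockets, the flow of device data can be sketched in TypeScript. The frame format, field names, and endpoint below are invented for illustration; Ctrl-labs’ actual wire protocol is not public.

```typescript
// Hypothetical EMG frame shape -- the real Ctrl-kit wire format is not public.
interface EmgFrame {
  timestamp: number;  // milliseconds since the stream started (assumed)
  channels: number[]; // one reading per electrode (Ctrl-kit has 16)
}

// Parse one WebSocket text message into a typed frame, rejecting malformed input.
function parseEmgFrame(message: string): EmgFrame {
  const data = JSON.parse(message);
  if (typeof data.timestamp !== "number" || !Array.isArray(data.channels)) {
    throw new Error("malformed EMG frame");
  }
  return { timestamp: data.timestamp, channels: data.channels.map(Number) };
}

// In a browser (or Node with the `ws` package), frames would arrive like:
// const socket = new WebSocket("ws://localhost:9999/stream"); // hypothetical endpoint
// socket.onmessage = (event) => console.log(parseEmgFrame(event.data).channels);
```

Keeping the parser a pure function separates wire handling from gesture logic, which makes it straightforward to unit-test without a device attached.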

The final version of Ctrl-kit will be a single piece, but it won’t be an entirely self-contained affair. The developer kit has to be wirelessly tethered to a PC for some processing, according to Ctrl-labs, but the goal is to reduce the computational overhead to the point where everything can run on wearable systems-on-chip.

Ctrl-kit leverages EMG to translate mental intent into action. Sixteen electrodes measure the motor neuron signals amplified by the muscle fibers of motor units, and AI algorithms trained using Google’s TensorFlow distinguish between the individual pulses of each nerve.
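The decoding step described above — per-electrode signals fed to a trained model that labels the activity — can be illustrated with a deliberately simple stand-in: root-mean-square energy per channel as the feature vector, and a nearest-centroid classifier. Ctrl-labs’ real models are TensorFlow-trained neural networks; everything below, including the gesture labels, is a toy illustration.

```typescript
// Compute one RMS value per channel from a window of samples.
// window is indexed [sample][channel].
function rmsFeatures(window: number[][]): number[] {
  const channels = window[0].length;
  const sums = new Array(channels).fill(0);
  for (const sample of window) {
    for (let c = 0; c < channels; c++) sums[c] += sample[c] * sample[c];
  }
  return sums.map((sum) => Math.sqrt(sum / window.length));
}

// Label a feature vector by its closest centroid (Euclidean distance).
function classify(feats: number[], centroids: Record<string, number[]>): string {
  let best = "";
  let bestDist = Infinity;
  for (const label of Object.keys(centroids)) {
    const centroid = centroids[label];
    const dist = Math.sqrt(
      feats.reduce((acc, v, i) => {
        const d = v - centroid[i];
        return acc + d * d;
      }, 0)
    );
    if (dist < bestDist) {
      bestDist = dist;
      best = label;
    }
  }
  return best;
}
```

A real pipeline would replace the centroid lookup with a learned model, but the shape is the same: window the raw channels, extract features, emit a label.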

The system works independently of muscle movement; generating a brain activity pattern that Ctrl-labs’ tech can detect requires merely the firing of a neuron down an axon, or what neuroscientists call an action potential. That puts it a class above wearables that use electroencephalography (EEG), a technique that measures electrical activity in the brain through contacts pressed against the scalp. EMG devices draw on the cleaner, clearer signals of motor neurons, and as a result are limited only by the accuracy of the software’s machine learning model and the snugness of the contacts against the skin.

Video games top the list of apps Ctrl-labs expects its early adopters to build, particularly virtual reality games, which the company believes are a natural fit for the sort of immersive experiences EMG can deliver. (Imagine swiping through an inventory screen with a hand gesture, or piloting a fighter jet just by thinking about the direction you want to fly.) And not too long ago, Ctrl-labs demonstrated a virtual keyboard that maps finger movements to PC inputs, allowing a wearer to type messages by tapping on a tabletop with their fingertips.
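The virtual-keyboard demo boils down to a mapping from detected finger taps to key events. The finger labels and layout below are invented for the sketch; the demo’s actual mapping has not been published.

```typescript
// Hypothetical finger-tap-to-key mapping, in the spirit of the virtual
// keyboard demo. The labels and layout here are illustrative only.
type Finger = "thumb" | "index" | "middle" | "ring" | "pinky";

const KEY_FOR_FINGER: Record<Finger, string> = {
  thumb: " ",
  index: "f",
  middle: "d",
  ring: "s",
  pinky: "a",
};

// Turn a stream of detected tap events into typed text.
function tapsToText(taps: Finger[]): string {
  return taps.map((finger) => KEY_FOR_FINGER[finger]).join("");
}
```

In practice each tap event would come out of the gesture classifier, and a richer scheme (chords, hand position) would be needed to cover a full keyboard.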

It’s perhaps unsurprising that Ctrl-labs this week joined the nonprofit consortium Khronos Group’s OpenXR working group, which seeks to create a royalty-free API and device layer for virtual reality and augmented reality apps. A provisional version (version 0.9) of the standard was released in March, with companies including AMD, Arm, Google, Microsoft, Nvidia, Mozilla, Qualcomm, Samsung, Valve, LG, Epic Games, HP, HTC, Intel, MediaTek, Razer, and Unity Technologies contributing to its development and implementation.

Ctrl-labs says that the partnership “represents [its] desire to support and collaborate with the [extended reality] developer community.”