Nvidia and Arm announced that they are partnering to bring deep learning inferencing to the billions of mobile, consumer electronics, and internet of things (IoT) devices anticipated in the future.

Under this partnership, the two chip companies will integrate the open source Nvidia Deep Learning Accelerator (NVDLA) architecture into Arm’s Project Trillium platform for machine learning. The collaboration will make it simple for IoT chip companies to integrate AI into their designs and help put intelligent, affordable products into the hands of billions of consumers worldwide. The internet of things aims to make everyday objects smart and connected through processors, sensors, and other electronics.

“This is a very exciting announcement,” said Deepu Talla, a vice president at Nvidia, in a press briefing. “We can add deep learning architectures to systems. What’s the next step? All of the internet of things devices coming in the future. Whether it’s a smartphone or camera, all of those devices in the future will do inferencing. ARM is the largest IoT company in the world.”

NVDLA, derived from Nvidia’s Xavier system on a chip for autonomous machines, is a free and open architecture that promotes a standard way of designing deep learning inference accelerators.

“Accelerating AI at the edge is critical in enabling Arm’s vision of connecting a trillion IoT devices,” said Rene Haas, executive vice president at Arm, in a statement. “Today we are one step closer to that vision by incorporating NVDLA into the Arm Project Trillium platform, as our entire ecosystem will immediately benefit from the expertise and capabilities our two companies bring in AI and IoT.”

NVDLA brings benefits that speed the adoption of deep learning inference. It is supported by Nvidia’s suite of developer tools, including upcoming versions of TensorRT, Nvidia’s deep learning inference optimizer and runtime. The open source design allows cutting-edge features to be added regularly, including contributions from the research community.

“This is one of the more interesting announcements, as many had counted Nvidia out of the ‘very small edge,’” said Patrick Moorhead, analyst at Moor Insights & Strategy. “Nvidia is engaged in AI in places like drones and robots, but this announcement could enable Nvidia ML tech to be in even smaller IoT devices like home automation and even smartphones. Partnering with Arm doesn’t guarantee Nvidia NVDLA success at the ‘very small edge,’ but it greatly increases its chances.”

The integration of NVDLA with Project Trillium will give deep learning developers high levels of performance while leveraging Arm’s flexibility and scalability across a wide range of IoT devices.

“This is a win/win for IoT, mobile, and embedded chip companies looking to design accelerated AI inferencing solutions,” said Karl Freund, lead analyst for deep learning at Moor Insights & Strategy, in a statement. “Nvidia is the clear leader in AI, and Arm is the leader in IoT, so it makes a lot of sense for them to partner on IP.”