Nvidia announced that its Drive AutoPilot is the first commercially available automated driving system for Level 2-plus autonomous cars.

That means the car can automatically handle steering, acceleration, and deceleration in certain driving environments, with features like adaptive cruise control and lane centering, while the driver remains engaged. Nvidia introduced the system at CES 2019, the big tech trade show in Las Vegas this week. It suggests we're not that far from self-driving cars hitting the market and, eventually, the roads.

The Nvidia Drive AutoPilot integrates Nvidia’s high-performance Drive AGX Xavier system-on-chip (SoC) processors and the Nvidia Drive software to process data from sensors outside the vehicle and inside the cabin.

German automotive supplier Continental will develop Level 2-plus systems on Nvidia Drive, with production starting in 2020. Others will be making announcements during the show as well. Jensen Huang, CEO of Nvidia, will get on stage with Mercedes-Benz on Tuesday for an announcement.

“The Level 2-plus is a very robust, complex, high-performance computer and computing stack,” said Danny Shapiro, senior director of automotive at Nvidia, in a press briefing. “It’s all based on AI, with many deep neural networks used for perception. We use complete surround cameras. It could be six, it could be four, it could be more, depending on the car maker. It’s a powerful AI supercomputer, Nvidia AGX Xavier as its base.”

It can do full self-driving autopilot, meaning it can handle lane changes, highway driving, lane splits, and personal mapping, Shapiro said. It has an integrated cockpit that can monitor the driver in the cabin and provide alerts for drowsy or distracted driving. It’s open for partners to build on top of the system, with full over-the-air software update capabilities.
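The cabin-monitoring idea Shapiro describes can be sketched in a few lines. This is an illustrative, simplified example (the class and parameter names are hypothetical, not Nvidia's actual software): an alert fires when the driver's eyes are closed for too large a fraction of recent camera frames, a PERCLOS-style drowsiness check.

```python
# Hypothetical sketch of a drowsiness check, not Nvidia's actual API:
# track whether the driver's eyes were closed in each recent frame and
# alert when the closed-eye fraction crosses a threshold.

from collections import deque

class DrowsinessMonitor:
    def __init__(self, window=90, threshold=0.4):
        self.window = deque(maxlen=window)  # recent per-frame observations
        self.threshold = threshold          # closed-eye fraction that triggers an alert

    def update(self, eyes_closed: bool) -> bool:
        """Record one camera frame; return True if an alert should fire."""
        self.window.append(eyes_closed)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        closed_ratio = sum(self.window) / len(self.window)
        return closed_ratio >= self.threshold

# Feed simulated per-frame observations from an in-cabin camera.
monitor = DrowsinessMonitor(window=10, threshold=0.5)
alerts = [monitor.update(frame % 2 == 0) for frame in range(20)]
```

A production system would derive the per-frame eye state from a neural network looking at the cabin camera; the windowed threshold is just one simple way to turn noisy per-frame signals into a stable alert.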

“It’s an important building block, and Level 2 is really the next step,” said Tim Bajarin, an analyst at Creative Strategies, in an interview. “2020 is the big target date for all the major automobile makers. The full impact of self-driving cars won’t be found until the next decade.”

Nvidia said its partners will start shipping production cars with the Drive AutoPilot in 2020. The car should be able to park itself, monitor the driver, and use AI for onboard speech processing: drivers can issue voice commands, and the car should understand them without having to go to the cloud for computing resources, Shapiro said.

Above: Nvidia’s autopilot can sense pedestrians and hazards. (Image Credit: Nvidia)

The car should be able to learn new routes that aren’t mapped. It learns how to drive from one point to another, whether or not the route is mapped, augmenting existing map data for public roads.

“The differentiator of the Drive AutoPilot is the processing performance,” Shapiro said. The Drive AGX Xavier can do 30 trillion operations per second, which exceeds other advanced driver assistance systems (ADAS) that are on the market, he said.
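The quoted 30 trillion operations per second can be put in perspective with a back-of-envelope calculation. The camera count and frame rate below are illustrative assumptions, not Nvidia's published spec:

```python
# Rough per-frame compute budget on a 30-TOPS processor.
# Assumptions (illustrative only): 8 surround cameras at 30 frames/second each.
total_ops_per_second = 30e12          # 30 trillion operations per second
cameras = 8
fps = 30
frames_per_second = cameras * fps     # 240 camera frames arrive each second
ops_per_frame = total_ops_per_second / frames_per_second
print(f"{ops_per_frame / 1e9:.0f} billion ops available per camera frame")
```

Even under these assumptions, each incoming camera frame gets a budget of over a hundred billion operations, which is why several deep neural networks can run on every frame.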

In a demo, the car combined data from inside and outside the vehicle to present information to the driver, such as whether the autopilot intends to make a lane change. Nvidia is integrating different kinds of software into the Drive AutoPilot, including a set of neural networks that run simultaneously to detect hazards, Shapiro said.
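The idea of several networks contributing to one hazard picture can be sketched as follows. The function names and detections are stand-ins invented for illustration (in the real system each would be a GPU-resident deep neural network running concurrently, not a Python function running sequentially):

```python
# Illustrative sketch, not Nvidia's software: several independent
# perception "networks" process the same camera frame, and their
# detections are merged into a single hazard list.

def detect_pedestrians(frame):   # stand-in for a pedestrian-detection DNN
    return [("pedestrian", 0.92)]

def detect_vehicles(frame):      # stand-in for a vehicle-detection DNN
    return [("car", 0.88), ("truck", 0.75)]

def detect_lane_departure(frame):  # stand-in for a lane/path DNN
    return [("lane_departure", 0.10)]

NETWORKS = [detect_pedestrians, detect_vehicles, detect_lane_departure]

def perceive(frame, min_confidence=0.5):
    """Run every network on the frame and keep confident detections."""
    hazards = []
    for network in NETWORKS:
        hazards.extend(d for d in network(frame) if d[1] >= min_confidence)
    return hazards

hazards = perceive(frame=None)  # pedestrian, car, and truck pass the threshold
```

Running specialized networks in parallel and fusing their outputs, rather than one monolithic model, is a common way to let each detector be tuned and validated independently.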

The car tech falls short of full Level 5, which refers to an autonomous system that can fully replace a human driver. In Level 2-plus, the driver is still ultimately in control of the car.