AImotive is introducing a suite of software for self-driving cars. The Budapest, Hungary-based company that was previously known as AdasWorks has now opened an office in Mountain View, Calif., to partner with Silicon Valley’s self-driving car companies.
AImotive has trained its artificial intelligence software to spot paths, pedestrians, cars, and moving objects. It has also trained the software on many more virtual miles by teaching it to recognize objects in the realistic racing video game Project Cars.
The company wants to enable an AI ecosystem for autonomous driving that works regardless of location, driving style, or driving conditions, said Laszlo Kishonti, CEO and founder of AImotive, in an interview with VentureBeat. I rode in one of Kishonti’s test cars, a Toyota Prius outfitted with multiple cameras, with a big desktop computer in the trunk.
The car had four cameras. It didn’t drive itself, but Kishonti’s team showed me how good the software was at recognizing objects. Looking at the screen of a laptop in the passenger seat, I could see the software identify a clear path as green, optional routes as blue, other cars as orange, and pedestrians in red. We drove slowly, but I didn’t notice any flaws.
The new company name reflects AImotive’s broader vision of making self-driving vehicles globally accessible. The AImotive product suite delivers the technology required to operate self-driving vehicles in all conditions, adapting in real time to different driving styles and cultures.
Kishonti spun AImotive out of his automotive testing company, Kishonti Ltd., after helping Nvidia with its bid to provide self-driving car technology to Tesla Motors. Tesla is using Nvidia’s supercomputing and graphics technology in its latest cars, which include an autopilot feature.
Since 2015, AImotive has grown from 15 engineers to more than 120 researchers, including 16 with doctorates, Kishonti said. Along the way, AImotive raised $10.5 million from a number of investors, including Robert Bosch Venture Capital, Nvidia, Inventure Oy, Draper Associates, Day One Capital Fund Management, and Tamares.
The company says it provides the only “Level 5 architecture” using cameras as the primary sensors for both affordability and accessibility. Right now, Kishonti said the company’s test car has about $2,000 worth of cameras and computing power, but that could be reduced to $500 over time.
AImotive’s full-stack software does not require a custom chip, and it uses AI to “see” fine detail and predict behavior, making it easier to manage common driving concerns, such as poor visibility and adverse conditions. AImotive’s training technique is also scalable, with a real-time simulator tool that trains the AI on a wide variety of traffic scenarios and weather conditions.
Car makers will be able to use the suite of software and tools to build and test their own self-driving cars. The AIDrive software includes a full technology stack that consists of a Recognition Engine, Location Engine, Motion Engine, and Control Engine.
The Recognition Engine learns continuously, combining and analyzing sensor data with AImotive’s pixel-precise segmentation tool to recognize up to 100 different object classes, such as pedestrians, bicycles, animals, buildings, and obstacles.
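To make the idea of pixel-precise segmentation concrete, here is a minimal sketch of how a per-pixel class map can be turned into the kind of color overlay I saw on the laptop screen (green for clear path, blue for optional routes, orange for cars, red for pedestrians). The class IDs and color choices are hypothetical illustrations, not AImotive’s actual implementation, which distinguishes far more classes.

```python
import numpy as np

# Hypothetical class IDs and display colors; the real system handles
# up to 100 classes, not four.
CLASS_COLORS = {
    0: (0, 255, 0),    # clear path     -> green
    1: (0, 0, 255),    # optional route -> blue
    2: (255, 165, 0),  # other car      -> orange
    3: (255, 0, 0),    # pedestrian     -> red
}

def colorize_segmentation(class_map: np.ndarray) -> np.ndarray:
    """Turn an (H, W) array of per-pixel class IDs into an (H, W, 3) RGB overlay."""
    overlay = np.zeros((*class_map.shape, 3), dtype=np.uint8)
    for class_id, rgb in CLASS_COLORS.items():
        overlay[class_map == class_id] = rgb  # boolean-mask assignment
    return overlay

# A tiny 2x2 "frame": path, route, car, pedestrian.
frame = np.array([[0, 1], [2, 3]])
print(colorize_segmentation(frame)[1, 1])  # pedestrian pixel -> [255   0   0]
```

In a real pipeline, the class map would come from a neural network running on each camera frame, and the overlay would be alpha-blended onto the video feed.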
The Location Engine provides a solution that allows the vehicle to know precisely where it is at all times, using 3D landmark point data on top of conventional GPS positioning.
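The core idea of layering landmark data on top of GPS is sensor fusion: a coarse but absolute GPS fix is combined with a much more precise landmark-based estimate. A minimal sketch of one standard approach, inverse-variance weighting, is below; the function name and the noise figures are illustrative assumptions, not details of AImotive’s Location Engine.

```python
def fuse_position(gps_xy, landmark_xy, gps_sigma=3.0, landmark_sigma=0.3):
    """Fuse a coarse GPS fix with a landmark-based position estimate
    (both in local metric coordinates) by inverse-variance weighting.
    Sigmas are assumed standard deviations in meters."""
    w_gps = 1.0 / gps_sigma ** 2
    w_lmk = 1.0 / landmark_sigma ** 2
    total = w_gps + w_lmk
    return tuple((w_gps * g + w_lmk * l) / total
                 for g, l in zip(gps_xy, landmark_xy))

# GPS says (10.0, 5.0); landmark matching says (10.4, 5.2).
fused = fuse_position((10.0, 5.0), (10.4, 5.2))
```

Because the landmark estimate is assumed roughly ten times more precise than GPS, the fused position lands very close to the landmark estimate while GPS keeps it anchored in absolute coordinates.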
The Motion Engine enables real-time tracking of moving objects, predicting future speed, location, and behavior, thereby allowing for optimal routing of the car, even in emergency situations. And the Control Engine is the execution component that manages acceleration, braking, steering, gear shifting, etc., as well as handling auxiliary functions such as turn signals, headlights, and the car horn.
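The simplest building block behind predicting a tracked object’s future location is motion extrapolation. The sketch below shows a constant-velocity model, the most basic such predictor; AImotive’s Motion Engine is described as also predicting behavior, which requires far richer models than this illustrative one.

```python
def predict_position(pos, vel, dt):
    """Constant-velocity extrapolation: estimate where a tracked object
    will be after dt seconds, given its current position (m) and
    velocity (m/s) in the ego vehicle's frame."""
    return tuple(p + v * dt for p, v in zip(pos, vel))

# A car 20 m ahead, closing at 5 m/s and drifting 1 m/s sideways:
future = predict_position((20.0, 0.0), (-5.0, 1.0), dt=2.0)
print(future)  # -> (10.0, 2.0): 10 m ahead, 2 m to the side in 2 s
```

A planner can run this for every tracked object over a short horizon and reject any candidate route that intersects a predicted position, which is how prediction feeds into routing, including emergency maneuvers.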
AImotive also has testing tools and hardware designs for low-latency neural network computation.