Watch out, Waymo — Mobileye has its sights set on you.
The Tel Aviv developer of autonomous car systems, which Intel paid $15.3 billion to acquire in March 2017, had an eventful few days at CES 2019 in Las Vegas. During a presentation headlined by cofounder and CEO Amnon Shashua, Mobileye announced that two Chinese partners — Beijing Public Transport Corporation and Beijing Beytai — will tap its technology to develop a commercial public transportation service in China. And it revealed a partnership with Ordnance Survey, Great Britain’s national mapping agency, to commercialize high-precision location data.
The significance of that second announcement can’t be overstated. Mobileye told VentureBeat that close to a million cars are funneling mapping data back to Mobileye’s cloud platform, in addition to 20,000 aftermarket units.
The workforce investigating the datasets has grown substantially since Intel’s acquisition. Mobileye had roughly 780 people in its employ two years ago, a number that now sits at 1,400 in Israel alone. (About 300 came from Intel.) In fact, Mobileye is currently building a new campus in Jerusalem that will house 2,500 employees, and is constructing offices for “several hundred” engineers and data scientists in Petah Tikva.
After Wednesday’s keynote, we caught up with Shashua to learn more about goings-on within Intel’s marquee automotive solutions team. In a wide-ranging interview, he spoke about Mobileye’s ongoing trials in Israel, the challenges driverless systems face today, and the steps companies must take to convince prospective riders of driverless cars’ safety.
Early next year, in partnership with Volkswagen and Israeli car importer Champion Motors, Mobileye will roll out Israel’s first autonomous ride-hailing service. Champion Motors will run the fleet operations and control center, while Volkswagen supplies the cars, the Israeli government shares infrastructure and traffic data, and Mobileye provides the autonomous driving systems.
Driverless Volkswagens will ferry passengers along preselected routes in Tel Aviv, within an area measuring roughly 11 square kilometers. That’s phase one. The next step, which Mobileye hopes to achieve by 2022, is deploying “a few dozen” vehicles on public roads that will travel unrestricted between destinations. In 2023, service will expand to all of Israel.
Concurrently, within the next four years (if all goes according to plan), driverless tests in the U.S. and China with other partners will kick off in earnest. Mobileye has inked deals with BMW, Volvo, Hyundai, and others to bring its tech to commercial vehicles.
These late-stage deployments won’t involve a safety driver, Shashua said. They will be “truly” Level 4, meaning they’ll operate with limited human input and oversight in specific conditions. (The U.S.-based Society of Automotive Engineers’ standard J3016 defines six levels of vehicle automation, from Level 0 to Level 5, with Level 5 being the most sophisticated.)
Level 5 vehicles — vehicles that can operate on any road and in any condition without a human driver — aren’t in the cards right now. The reason? Even the best systems on the market today sometimes struggle in severe weather like snowstorms and downpours, Shashua said, and Mobileye’s is no different.
“That’s why deployments are done in good weather, like in Phoenix,” he added.
It’s not that Level 5 can’t be achieved — Shashua believes it’s within the realm of possibility with current machine learning techniques like reinforcement learning. Rather, he said it’s a matter of engineering sensors that can reliably deal with snowflakes, rain droplets, fog, and other vision-obscuring precipitation.
“You need a two sensor-modality … [sensors] with resolutions that can work in snow, for example,” he explained. “One of the issues with current cameras is that in snow, you don’t see the edges of the road or landmarks.”
Shashua predicts that many of today’s autonomous driving challenges will be overcome within the next five to 10 years, with the advent of cheap radars and high-fidelity lidar. Already, companies like AEye are developing systems that merge lidar and camera data, while startups such as Luminar are engineering long-range lidar sensors that promise to significantly undercut the competition.
“Sensor technology will come to maturity,” Shashua said.
Improved perception alone won’t be the key that unlocks fully autonomous systems, of course. That’s why Mobileye is teaming up with companies like Ordnance Survey to build high-precision location databases of roads in the U.K., Israel, and elsewhere.
In Israel between 2019 and 2020, the plan is for Mobileye, Volkswagen, and Champion Motors to collect data from 33 kilometers of Tel Aviv’s roads, and in the following two years another 111 kilometers.
“Any commercial vehicle [can be] equipped with a front-facing camera for a few hundreds of dollars — one dollar per year per car — that continuously creates high-definition mapping data,” Shashua said. “[This] solves the big problem of scalability of mapping.”
Highly accurate maps could provide a revenue stream for vehicle and fleet operators. With data collected from both autonomous and human-driven cars, utility companies could more accurately track assets like manhole covers, telephone poles, and lamp posts, and telecommunications providers could plan the buildout of new wireless and below-ground networks.
“Using maps to improve operations between businesses and cities will help bring us closer to the realization of smart cities and safer roads,” Shashua said.
While vendors like Baidu, which open-sourced its V2X Apollo Intelligent Vehicle Infrastructure Cooperative System platform this week, are investing in road infrastructure embedded with sensors that assist in driverless navigation, Shashua doesn’t think it’s a viable path forward.
The problem isn’t that the efficacy of vehicle-to-everything, or V2X, hasn’t been demonstrated — a U.S. Department of Transportation study of crash data from 2004 to 2008 found that a fully implemented V2X system could address 4.5 million accidents. It’s that the hardware is prohibitively expensive. Vehicle-to-vehicle components alone are estimated to average between $341 and $350 in 2020, according to the National Highway Traffic Safety Administration.
“I remember 20 or 30 years ago, people were talking about having magnetometers on the lanes of the road,” he said. “I haven’t seen any magnetometers yet.”
Shashua thinks the only on-the-road components needed to support autonomous driving are traffic light transponders — small transmitters that wirelessly signal cars when it’s safe to proceed. Even then, they’ll merely serve as a backup; the cars themselves will be capable of navigating intersections.
In preparation for wider rollouts to come, Mobileye-equipped cars are becoming more adept at completing challenging road maneuvers. They’re now capable of handling unprotected left turns — a notorious trip-up for driverless cars — and lane changes in heavy congestion, as well as side passes, narrow lanes, and speed bumps.
“They’re able to do all of this in a very aggressive setting — in Jerusalem,” Shashua said.
That’s with cameras alone, mind you. Some autonomous car systems, including those from Waymo and Uber, tap lidar, sensors that measure the distance to objects by illuminating them with light and measuring the reflected pulses. And that’s in addition to radar, inertial measurement units, and other data-collecting sensors.
Then there are suppliers like Oregon-based Flir, which propose that automakers add thermal vision cameras embedded with machine learning algorithms to the mix. WaveSense, meanwhile — a Boston startup that has its roots in the Massachusetts Institute of Technology’s Lincoln Laboratory for the United States Department of Defense — argues that ground-penetrating radars (GPR) are the next logical addition to the sensor stack.
But Mobileye is firmly committed to cameras. Toward that end, the latest custom accelerator processor chip in its EyeQ lineup — EyeQ5, which was sampled a few weeks ago and which Shashua expects to be “production-ready” in Q1 2019 — runs proprietary image processing algorithms focused on perception.
The 7-nanometer EyeQ5 will be capable of performing sensor fusion for fully autonomous vehicles, Shashua said, thanks to dedicated processors for specific sensors and central processors for fusion and decision-making. And it will offer 360-degree coverage, courtesy of a combination of cameras and ultrasonic sensors.
Mobileye detailed the backend system at the 2016 Consumer Electronics Show. Dubbed Road Experience Management, or REM, it creates crowd-sourced, real-time data for localization and high-definition lane data by extracting landmarks and roadway information at low bandwidths — 10KB per kilometer of driving. The segments are integrated into a global map in Mobileye’s cloud.
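Mobileye hasn’t published REM’s wire format, but a 10KB-per-kilometer budget implies each landmark must be packed into a handful of bytes. The sketch below uses a hypothetical 10-byte record (offsets from a map-tile origin, a landmark type, and a confidence score) purely to illustrate how tight that budget is:

```python
import struct

# Hypothetical compact landmark record -- NOT Mobileye's actual format.
# dx_cm/dy_cm: position offsets from a tile origin in centimeters (int32);
# kind: landmark type code (lane edge, sign, pole, ...); confidence: 0-255.
RECORD = struct.Struct("<iiBB")  # 4 + 4 + 1 + 1 = 10 bytes per landmark


def pack_landmark(dx_cm, dy_cm, kind, confidence):
    """Serialize one landmark into the 10-byte record."""
    return RECORD.pack(dx_cm, dy_cm, kind, confidence)


# At 10 bytes each, a 10KB-per-km budget leaves room for roughly a
# thousand landmark records per kilometer of driving.
```

The point is not the particular fields, which are assumptions, but that a per-kilometer payload this small is plausible only if the car uploads sparse, pre-digested landmarks rather than raw imagery.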
There are plenty of cars from which to source. As of the end of 2017, EyeQ was used in over 15 million vehicles. That’s up from 10 million in mid-2016.
It’s not that Mobileye is opposed to integrating additional sensors — quite the contrary; EyeQ5 supports both radar and lidar. Instead, Shashua said that while the company’s focus is on vision, it’s committed to building redundant systems with radar and lidar in the first half of this year.
“We’re pushing the camera processing to its extreme,” he said.
By mid-2020, Mobileye plans to begin delivering to partners white box “subsystems” like its Surround Computer Vision Kit, a 360-degree, 12-camera vision system with a range of 300 yards; and a multi-chip turnkey solution, AV Kit, that incorporates fusion with other sensors, decision-making driving policies, and mapping.
Self-driving cars and safety
Technological leaps forward in self-driving systems won’t do much good if the public doesn’t trust them.
Three separate studies last summer — by the Brookings Institution, think tank HNTB, and the Advocates for Highway and Auto Safety (AHAS) — found that a majority of people aren’t convinced of driverless cars’ safety. More than 60 percent said they were “not inclined” to ride in self-driving cars, almost 70 percent expressed “concerns” about sharing the road with them, and 59 percent expected that self-driving cars will be “no safer” than human-controlled cars.
They have their reasons. In March 2018, Uber suspended testing of its autonomous Volvo XC90 fleet after one of its cars struck and killed a pedestrian in Tempe, Arizona. Separately, Tesla’s Autopilot driver-assistance system has been blamed for a number of fender benders, including one in which a Tesla Model S collided with a parked Culver City fire truck. (Tesla stopped offering “full self-driving capability” on select new models in early October 2018.)
So what will it take to convince a skeptical public? The answer lies in a mathematical model, Shashua said: Responsibility-Sensitive Safety (RSS).
Mobileye proposed RSS in October 2017 at the World Knowledge Forum in Seoul, South Korea. A whitepaper describes it as a “deterministic … formula” with “logically provable” rules of the road intended to prevent self-driving vehicles from causing accidents. Less abstractly, it’s a “common sense” approach to on-the-road decision-making that codifies good habits, like maintaining a safe following distance and giving other cars the right of way.
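The whitepaper’s “safe following distance” rule is a concrete example of that formula: the rear car must leave enough room to stop even if the car ahead brakes as hard as physically possible. A minimal sketch of that longitudinal check, with illustrative parameter values rather than Mobileye’s own, looks like:

```python
def rss_safe_distance(v_rear, v_front, rho=0.5, a_max=3.0,
                      b_min=4.0, b_max=8.0):
    """Minimum safe following distance (meters) per the RSS longitudinal rule.

    v_rear, v_front: speeds of the rear and front cars in m/s.
    rho: the rear car's response time in seconds.
    a_max: worst-case acceleration of the rear car during the response time.
    b_min: minimum braking the rear car is guaranteed to apply afterward.
    b_max: maximum braking the front car might suddenly apply.
    Parameter defaults here are illustrative assumptions.
    """
    # Worst case: the rear car keeps accelerating for the full response time.
    v_rear_after = v_rear + rho * a_max
    d = (v_rear * rho
         + 0.5 * a_max * rho ** 2
         + v_rear_after ** 2 / (2 * b_min)   # rear car's stopping distance
         - v_front ** 2 / (2 * b_max))       # front car's stopping distance
    return max(0.0, d)
```

If the actual gap stays above this value, the rear car can always stop in time, which is what makes the rule “logically provable” rather than statistical.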
Self-driving car passengers aren’t the only ones who stand to benefit from RSS. In a blog post published this week, Shashua detailed an augmented form of automatic emergency braking — automatic preventative braking (APB) — that uses formulas to determine when cars are entering a dangerous situation. The idea is to prevent collisions by bringing vehicles to a slow, gradual stop when a potential hazard comes into view.
Shashua believes that, if APB were installed with a single forward-facing camera in every car, it would eliminate a “substantial” portion of front-to-rear crashes resulting from careless driving. And he said an APB system with surround camera sensing and location awareness could eliminate “nearly all” rear-end fender benders.
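Mobileye hasn’t published APB’s internals, but the core idea — brake gently as soon as the gap dips below a formula-defined safe distance, instead of slamming on the brakes at the last second — can be sketched as follows. All parameter values and the braking-ramp logic are illustrative assumptions:

```python
def apb_decel(gap_m, v_rear, v_front, rho=0.5, a_max=3.0,
              b_min=4.0, b_max=8.0, comfort_decel=2.0):
    """Gentle deceleration command (m/s^2) for a preventative-braking sketch.

    Returns 0.0 while the gap to the car ahead is safe; otherwise a braking
    value that grows with how far inside the unsafe zone the car is, capped
    at a comfortable deceleration rather than an emergency stop.
    Illustrative logic only -- Mobileye has not published APB's internals.
    """
    # RSS-style minimum safe gap (see the whitepaper's longitudinal rule).
    v_rear_after = v_rear + rho * a_max
    safe = max(0.0, v_rear * rho + 0.5 * a_max * rho ** 2
               + v_rear_after ** 2 / (2 * b_min)
               - v_front ** 2 / (2 * b_max))
    if safe == 0.0 or gap_m >= safe:
        return 0.0
    # Brake harder the deeper into the unsafe zone, capped at comfort level.
    return min(comfort_decel, comfort_decel * 2.0 * (safe - gap_m) / safe)
```

Because braking starts early and scales smoothly, the vehicle sheds speed before the situation ever becomes an emergency — which is the mechanism behind Shashua’s claim that APB could eliminate most front-to-rear crashes.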
Assuming they work as promised, Mobileye’s machine learning-informed accident prevention tech could be a godsend for the millions of people who risk their lives every time they step into a car. About 94 percent of car crashes are caused by human error, and in 2016, the top three causes of traffic fatalities were distracted driving, drunk driving, and speeding.
To incentivize auto OEMs to adopt decision models like RSS and APB, Shashua proposes that regulatory bodies such as the NHTSA adopt a new rating designating vehicles with enhanced safety systems.
“The goal is to get cars to behave in a way that complies with human maneuvering,” Shashua said. “We need to build a coalition around it.”