Google has been a leader in the nascent field of self-driving cars. The technology for those cars will not only push the limits of artificial intelligence and machine vision software; it will also push semiconductor chip technology and hardware systems.
Daniel Rosenband, hardware engineer in the self-driving cars division at Google, spelled that out in a keynote speech this week at the Hot Chips chip conference in Cupertino, Calif. The talk was one of many showing that the frontier for sophisticated semiconductors — the building blocks of all things electronic — is no longer being driven by PCs and smartphones. Instead, it’s being led by the chips that can help cars execute the artificial intelligence algorithms that govern driving and the machine vision software that helps cars distinguish pedestrians or bikers.
Google is just one of many companies that could drive the $330 billion chip industry in a new technical direction. Others working on self-driving cars include Tesla, Honda, BMW, Volvo, Mercedes, and Ford. Uber said it would start testing 100 self-driving on-demand cars in Pittsburgh. General Motors and ride-sharing firm Lyft said they would start testing self-driving taxis by the end of the year.
Kevin Krewell, analyst at Tirias Research, believes self-driving car technology is the “leading driver” for chips. “Deep-learning-based car navigation is a new workload that is not the same as other high-performance computing workloads. It’s driving new architectures that require new approaches.” He also said that’s why Intel bought A.I. company Nervana for more than $350 million.
If chips and other systems deliver, the results could be significant. If a self-driving car can give an hour per day back to someone, that’s 5 percent of their waking time. On top of that, self-driving cars may eventually be safer. Rosenband said that 1.2 million people die in car accidents every year.
“That’s a good size city that disappears because of car accidents,” he said. “In the U.S. alone, roughly 35,000 people die of car accidents every year. That’s the equivalent of a passenger aircraft crashing every day. That puts it in perspective.”
Self-driving cars could also serve people who can’t drive, such as those who are blind or have other disabilities. But the challenge of creating an outstanding self-driving car is high, and the technology is still being tested: a driver using Tesla’s Autopilot feature failed to see a truck in his path and died in the resulting wreck.
But the companies investing in the technology see possibilities for a brighter and safer future.
“We could likely transform the lives of many people,” Rosenband said.
To do that, companies have to make big breakthroughs in artificial intelligence and computer vision. The systems have to be adaptable to ever-changing problems, such as new traffic conditions, pedestrian congestion, and bikers. That takes an awful lot of processing power, and the solution can’t be a partial one.
“We realized from driving on highways that it’s important to solve the problem from end to end,” Rosenband said. “How can you design something that is totally self-driving?”
You have to be able to pick up a person, drive them to a destination, and drop them off. Google designed a prototype car that drives at 25 miles per hour or less through neighborhoods, and it doesn’t even have a steering wheel. It drives defensively, waiting a second and a half at an intersection before venturing in. And it has to calculate the probability of what happens next in any given scene.
The car needs to know where it is, what is around it, what those surrounding objects are doing, and how it should move. That requires detailed maps and a lot of sensors. To see 360 degrees around the vehicle, Google uses technology such as rotating lidar systems, which scan in every direction. Those sensors help generate a 3D model of what is happening around the car and deliver the range and speed of different objects on the road.
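The geometry behind that 3D model can be sketched in a few lines: a rotating lidar reports angles and a range, which convert to a 3D point, and an object’s position in two successive scans yields its speed. This is an illustrative toy, not Google’s pipeline; the scan format, timing, and function names here are assumptions.

```python
import math

def lidar_to_xyz(azimuth_deg, elevation_deg, range_m):
    """Convert one rotating-lidar return (two angles plus a range) to a 3D point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

def object_speed(p0, p1, dt):
    """Estimate an object's speed (m/s) from its position in two scans dt apart."""
    return math.dist(p0, p1) / dt

# An object 20 m dead ahead at sensor height...
p_a = lidar_to_xyz(0.0, 0.0, 20.0)
# ...is 19 m away one 0.1 s scan later:
p_b = lidar_to_xyz(0.0, 0.0, 19.0)
print(object_speed(p_a, p_b, 0.1))  # 10.0 m/s closing speed
```

Real systems track many thousands of such points per scan and must associate them into objects before any speed estimate is possible; that association step is where much of the processing cost lies.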
For its next-generation prototypes, Google needs roughly four times the processing power of its 2015 prototype. It can get there either with general-purpose chips or with custom chips tailored to the particular problems a self-driving car encounters. On a 100-square-millimeter chip, Google needs something like 50 teraflops of performance, Rosenband said.
“That’s a pretty compelling number,” Rosenband said. “If you look at what chips were available ten years ago, I don’t think we could do what we are doing today. Many thanks to the community for that.”
To address the challenge, Nvidia revealed that it had created a new chip — dubbed Parker — for self-driving cars, as part of its Drive PX 2 supercomputer for automobile artificial intelligence systems. Intel says its Xeon Phi family of chips is being adapted to handle A.I. challenges.
Rosenband said there are still some very difficult problems to solve. Sometimes, even humans can’t see the color of a traffic light, given sunset or sunrise conditions. An overexposed or underexposed image is very hard for a computer to decipher, particularly when a car is moving.
“Where do chips fit in?” Rosenband asked. “We need lots of gigahertz, radio frequency channels. We have to do a lot of digital signal processing to reduce noise and improve the fidelity from the radar system. We try to use the very best silicon available. The primary objective is to do it for maximum performance.”
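One of the simplest examples of the digital signal processing Rosenband mentions is a moving-average (FIR) filter that smooths a noisy sample stream. Automotive radar pipelines are vastly more sophisticated; this minimal sketch, with invented sample data, only illustrates the principle of trading a little latency for lower noise.

```python
def moving_average(samples, window):
    """Smooth a sample stream by averaging each point with up to
    `window - 1` preceding samples (a simple causal FIR filter)."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

# A steady return near 1.0 with one noise spike at 5.0:
noisy = [1.0, 1.2, 0.8, 1.1, 5.0, 1.0, 0.9, 1.1]
smoothed = moving_average(noisy, 3)
print(smoothed)  # the 5.0 spike is attenuated to 2.3
```

The trade-off is fidelity versus responsiveness: a wider window suppresses more noise but also blurs fast changes, which is exactly why Rosenband emphasizes doing this processing on the best silicon available.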
At some point, Google might very well need the equivalent of a data center’s computing power in a mobile device. It has to deliver the maximum amount of compute performance without consuming too much power, Rosenband said. And it has to enable a car to tap computing resources in data centers.
Why? Google has driven 2 million miles with its self-driving cars, but it still can’t anticipate everything that happens in the world. Chris Rowen, a chip expert and an executive at Cadence Design Systems, suggested that Google might need to drive a billion test miles to encounter all of the rare events that could happen.