Kathy Winter is going to spend a lot of time in the garage. But she’s not a mechanic. Winter is the vice president and general manager of automated driving solutions at Intel. She helped dedicate the new Intel Advanced Vehicle Lab in San Jose, Calif., this week.

Intel is engaged with 33 “tier 1” carmakers, and it has alliances with car suppliers such as Delphi. While the company is testing self-driving cars now, it still has a long road ahead. Self-driving cars will have dozens of Lidar, radar, and camera sensors that can generate 45 terabits of data per hour. They will need 5G connectivity, meaning wireless broadband at multiple gigabits per second, to communicate with servers that can help the cars make split-second decisions. The cars that finally hit the road are going to need around 15 to 20 teraflops of computing power, or 10 to 20 times what is in cars today.
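To put those figures in perspective, here is a minimal back-of-envelope sketch in Python. The 45-terabits-per-hour number is the one quoted above; the 4G uplink speed is an illustrative assumption, not an Intel figure.

```python
# Rough arithmetic on the sensor-data figures quoted above.
SENSOR_OUTPUT_TBIT_PER_HOUR = 45
SECONDS_PER_HOUR = 3600

# Average bit rate coming off the sensors, in gigabits per second.
avg_gbps = SENSOR_OUTPUT_TBIT_PER_HOUR * 1000 / SECONDS_PER_HOUR
print(f"Average sensor output: {avg_gbps:.1f} Gbit/s")  # ~12.5 Gbit/s

# An optimistic 4G uplink (assumed value) could stream only a sliver of that,
# which is why most raw data stays in the car and only summaries go up.
UPLINK_4G_MBPS = 50
print(f"Share a {UPLINK_4G_MBPS} Mbit/s link could carry: "
      f"{(UPLINK_4G_MBPS / 1000) / avg_gbps:.2%}")  # ~0.40%
```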

Winter’s job is to help make that happen. Before joining Intel in August 2016, she was an engineering executive at Delphi. This week, I visited Intel’s new Advanced Vehicle Lab in San Jose, Calif., and went for a ride in a self-driving Audi powered by Delphi and Intel. Then I spoke with Winter about the technology.

Here’s an edited transcript of our interview.

Above: Yep, that’s a server in the trunk.

Image Credit: Dean Takahashi

VB: I felt like a normal driver. It shifted lanes more quickly than I might have done if I were gradually moving over into a right-turn lane, but I’m sure it looked behind and didn’t see anything. I’m never 100 percent confident that there’s nothing behind me.

Kathy Winter: It’s interesting you say that. Over time — these cars are very conservative. There’s the whole trust factor. Does it really know? But one thing that’s improved over the last 12 months is the more human-like driving style. It’s programmable. When you think of different OEMs — some vehicles let you put them in sport mode, versus economy mode. It’s sort of like that. They can tune it to be more aggressive or more conservative.
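As a rough illustration of what a “programmable” driving style could mean in practice, here is a hypothetical sketch. The parameter names and values are invented for illustration and are not Delphi’s or Intel’s actual tuning knobs.

```python
from dataclasses import dataclass

# Hypothetical sketch of a tunable driving-style profile, in the spirit of the
# sport/economy modes described above. All names and values are illustrative.
@dataclass
class DrivingStyle:
    min_lane_change_gap_s: float  # smallest rear gap (seconds) accepted before merging
    max_accel_mps2: float         # comfort limit on acceleration
    following_distance_s: float   # time headway to the car ahead

CONSERVATIVE = DrivingStyle(min_lane_change_gap_s=4.0, max_accel_mps2=1.5,
                            following_distance_s=2.5)
SPORT = DrivingStyle(min_lane_change_gap_s=2.0, max_accel_mps2=3.0,
                     following_distance_s=1.2)

def can_change_lanes(style: DrivingStyle, rear_gap_s: float) -> bool:
    """The same planner code becomes more or less aggressive per profile."""
    return rear_gap_s >= style.min_lane_change_gap_s

print(can_change_lanes(CONSERVATIVE, rear_gap_s=3.0))  # False
print(can_change_lanes(SPORT, rear_gap_s=3.0))         # True
```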

VB: On the one hand, having gone for the ride, it seems like self-driving cars are very close. But when I hear about how much data has to be processed, it sounds farther away.

Winter: You hit it right. The technology is there. Even on the data center side, when you think about how we think of it — there’s the compute in the car itself. There’s the networking, the connectivity to the cloud, and on up to the data center. Each of those parts of the autonomous driving picture is there today and being worked on.

Bringing them together is something unique for Intel. We can go to the teams that are experts in 5G, because they’re working on it anyway for things like industrial applications and smart cities. Intel also has a huge data center business. We spend time working with those teams now on the automotive-grade applications. What do you layer on there from a functional-safety and temperature perspective — all the things that go along with being in an OEM vehicle for a long time?

The things that are pacing it now — the technology is moving along very quickly — are things like regulation, insurance, liability. And especially user acceptance. Is it comfortable? Do you trust it? Getting people, in a much broader way, to experience it so they can trust and believe in the vehicles is going to take real time and effort compared to just the technology itself.

VB: It seems like 5G would gate it in some way as well, since you have so much data to transfer.

Winter: Today’s vehicles that are out there are built on existing platforms. The amount of data coming out is growing, and it can still be managed with a 4G connection. Maybe you download some of that data when you’re within reach of Wi-Fi, or you’re even swapping drives today. But when you start to think about the amount of data as you increase to fully autonomous vehicles, that’s a lot of data coming off. Thinking about the resolution of the vision systems, of radar and Lidar, continually increasing — that’s data-intensive. And then you bring in HD mapping.

You don’t have to have 5G, clearly, to have fully autonomous vehicles, but it brings a lot to the overall solution, to your ability to have big bandwidth going to and from the vehicle and very little latency or delay. If it’s mission-critical, you want that to happen in a split second. You always have a lot of processing going on in the vehicle, but there are things we’d like to do — if we see an anomaly or a problem, we send it up to the cloud and alert every other car in the area in a split second. There are applications and ways that 5G will enhance the autonomous vehicle experience.
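A hedged sketch of that anomaly-alert flow follows. The names, data model, and alert radius are invented for illustration; this is not Intel’s actual API, just the pattern of a car reporting a problem and the cloud fanning the alert out to nearby vehicles.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class AnomalyReport:
    vehicle_id: str
    lat: float
    lon: float
    kind: str            # e.g. "debris" or "lane_closed"
    timestamp: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fan_out(report: AnomalyReport, fleet_positions: dict, radius_km: float = 2.0):
    """Return the vehicles close enough to be alerted about this report."""
    return [vid for vid, (lat, lon) in fleet_positions.items()
            if vid != report.vehicle_id
            and haversine_km(report.lat, report.lon, lat, lon) <= radius_km]

report = AnomalyReport("car-17", 37.33, -121.89, "debris", time.time())
fleet = {"car-17": (37.33, -121.89),
         "car-22": (37.34, -121.90),   # about 1.4 km away
         "car-90": (37.77, -122.42)}   # tens of km away
print(fan_out(report, fleet))  # ['car-22'] -- only the nearby car is alerted
```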

Think about being in there and just riding along. Maybe you want to watch a movie or download something else. You can think of lots of things using the 5G connection that you don’t have to have, but the experience would be so much enhanced. And you’ll have that connection anyway to operate the fleet, to monitor the health of the vehicle, to update HD maps. Once you have that big pipeline in there, you can dream up all kinds of things to use it for.

Above: Intel and Delphi show off a self-driving car.

Image Credit: Dean Takahashi

VB: How much of the expertise has to be inside Intel, versus a Delphi or the car companies?

Winter: Today there are the OEMs, as well as the disruptor car companies — Google, Uber, the new players. You have that big group of folks building cars. You still have tier one. The tier one role is getting more and more involved in software as time goes on. It used to be more about mechanically integrating the vehicle and their sensors. In the case of Delphi, they now have a fully automated driving software stack. They’re still a tier one, but it’s morphed past a tier one in a way.

At Intel, what we’re trying to do is enable both. We’re providing platforms, whether it’s in the vehicle, highly optimized and running on Intel architecture, bringing in the modem piece. We don’t run 5G networks, but we can enable that with the modems and the integration with carriers globally. And then there’s the whole data center piece. The tier one wouldn’t really operate the data center piece. We pull through the system-on-chip, the CPU, the FPGA, the pure hardware play. Then we have the 5G technology and networking play, and then the data center, with deep learning at the data center as we try to improve the algorithms running down in the vehicle.

We do work a lot with the OEMs. You see us with BMW, because between us and Mobileye we’re trying to define the state-of-the-art platform that a tier one could help integrate or another OEM could potentially pick up and drop into their systems.

Above: Intel executives dedicate their new Advanced Vehicle Lab in San Jose, Calif.

Image Credit: Dean Takahashi

VB: The Delphi car didn’t have the big Lidar setup on top, right?

Winter: No. I can speak to this only because I worked there for five years before I came to Intel in August. There are a couple of different approaches out there. In one approach, you just don’t care what it looks like. Some companies have the huge Lidar on top and sensors stuck all over because they’re most concerned about progressing the software and the data piece.

There are other approaches, like what Delphi has taken and BMW is very much taking, where they’re asking what a production-intent vehicle would really look like. It’s highly integrated. You’ve done the work to custom-fit the radars and design ways to tuck in the Lidar. You put smaller Lidar in multiple places, instead of just putting this huge Lidar on top. When I think of what a real production vehicle might look like, especially for consumers, that’s where you see BMW and Delphi taking that approach.

When I look at some of these projects where they don’t care what it looks like, if you’re offering a ride service, potentially, does the customer care what the car looks like? They just want it to show up. It’s a different business case. But they’re two different approaches to the same problem.

VB: Is there a way to figure out how much hardware is going in there from Intel or from all the vendors put together? Just to get an idea of what the cost could be?

Winter: Lots of estimates are out there. You have many different sensor configurations. Some rely more on Lidar. Some rely more on radar. Some rely more on vision systems. But most will agree that it’s a combination and redundancy that gives you all-weather performance. You have that fail-safe mode. If one system isn’t sure, you have others voting. We’ll add HD mapping, as well. You have input coming from multiple places fused together, and you have your software deciding what to do with that information. What’s the next move?
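That combination-and-voting idea can be sketched in a few lines. The modalities come from the discussion above, while the vote count and confidence threshold are assumptions for illustration, not any vendor’s actual fusion logic.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    modality: str        # "camera", "radar", or "lidar"
    obstacle: bool       # does this sensor think something is in the lane?
    confidence: float    # 0.0 .. 1.0

def fused_obstacle_decision(readings, min_votes=2, min_confidence=0.5):
    """Declare an obstacle only if enough confident sensors agree.

    Low-confidence readings abstain, so a single degraded sensor (say, a
    camera blinded by glare) cannot trigger or suppress a decision alone.
    """
    votes = [r for r in readings if r.confidence >= min_confidence]
    positive = sum(1 for r in votes if r.obstacle)
    return positive >= min_votes

readings = [
    SensorReading("camera", obstacle=False, confidence=0.2),  # glare: abstains
    SensorReading("radar",  obstacle=True,  confidence=0.9),
    SensorReading("lidar",  obstacle=True,  confidence=0.8),
]
print(fused_obstacle_decision(readings))  # True: radar and lidar agree
```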

It varies from vehicle to vehicle. A lot of the reason you see all these pilots looking a bit different is that everyone is trying to find the optimal balance between different types of sensors. It depends. Today radar has come way down the cost curve. Radar units used to be huge and cost a couple of thousand dollars; now they’re very small and inexpensive — a little bigger than a deck of cards, a couple of hundred bucks.

Lidar today is very large. They’re starting to come down, with the Pucks, but I think they’re still in the thousands. You’re paying $70,000 to $80,000 for the really big ones. We need to see Lidar come down that same cost curve and potentially move to solid state. Today they’re mechanical. That’s a failure mode, when you think about putting that kind of mechanical contraption in the car and then it hits a huge pothole. Even without that, they eventually have failure points.

Vision systems are another one where there’s a difference of approaches, between experimenting with lots of small, cheap cameras or a few big high-end cameras with tons of software. You see different opinions. It’s tough to zero in on cost because there are different ways to approach the problem.

Above: Intel’s self-driving car lab.

Image Credit: Dean Takahashi

VB: All these acquisitions you guys have done, has that come together to help you?

Winter: Absolutely. The biggest one, obviously, is Mobileye. It’s very complementary. Part of why we chose them is because we were working with them for more than a year, side by side on BMW, and also with Delphi. We’re both working on a similar automated driving platform for them as well. We’ve gotten to know them well, their strengths and their expertise.

Mobileye brings in the computer vision piece. They’re arguably the best in the industry. They also bring in their REM technology — Road Experience Management. They’re taking all their camera data that they collect for vision systems and saying, “How can we use that to create an overlay on the HD maps? How can we improve that by crowdsourcing from all these cars driving around with vision systems?” “Can you use that real-time data to improve maps?” “If all the cars in the area see that something’s changed, can you fix that?”
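As a rough sketch of that crowdsourcing idea (not Mobileye’s actual REM pipeline), a map change might only be published once enough distinct vehicles have reported the same observation. The threshold and data model here are illustrative assumptions.

```python
from collections import defaultdict

class MapChangeAggregator:
    """Commit an HD-map change only after enough independent vehicles see it."""

    def __init__(self, min_reports=5):
        self.min_reports = min_reports
        self.reports = defaultdict(set)   # (tile_id, change) -> reporting vehicles

    def report(self, vehicle_id, tile_id, change):
        """Record one vehicle's observation of a change in a map tile."""
        self.reports[(tile_id, change)].add(vehicle_id)

    def confirmed_changes(self):
        """Changes seen by enough distinct vehicles to trust and publish."""
        return [key for key, vehicles in self.reports.items()
                if len(vehicles) >= self.min_reports]

agg = MapChangeAggregator(min_reports=3)
for vid in ("car-1", "car-2", "car-2", "car-3"):   # car-2 reports twice
    agg.report(vid, tile_id="tile-88", change="new_construction_barrier")
print(agg.confirmed_changes())  # [('tile-88', 'new_construction_barrier')]
```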

From an Intel perspective — I keep using the word “data.” Camera data, vision data, HD maps. What are you doing with that, moving that information back and forth? It’s very complementary. It plays to our strengths. Those are things that help broaden our overall offering in autonomous driving solutions.
