Intel is flooding the skies with drones, doing everything from sending them out to inspect massive solar arrays in the Mojave Desert to lighting the night sky above Disney World with 500 drones.

It’s all part of a virtuous cycle, according to Anil Nanduri, vice president in the New Technology Group and general manager of Intel’s unmanned aerial vehicle (UAV) business. The world’s biggest chip maker has moved beyond chips to focus on great outdoor experiences for users.

And drones are a good market because they use a lot of technology, including Intel’s RealSense depth cameras, and they produce an enormous amount of data. That data keeps the servers in data centers humming, and that creates more demand for Intel’s processors.

Nanduri is in charge of keeping that virtuous cycle going, and I talked with him about that, and more.

Here’s an edited transcript of our conversation.


Above: Anil Nanduri, vice president in the New Technology Group and general manager of the UAV group.

Image Credit: Intel

VentureBeat: How did you get into the drone field?

Anil Nanduri: When we started with RealSense, we were working on PCs and tablets. We were trying to see how we could change the computing interface with gestures and other use cases. As we were exploring, we had two versions of RealSense cameras: one, the front-facing, was primarily for PC interaction, and then we had the world-facing, which we designed for tablets. That can see further away. The world-facing camera, which was depth-sensing, also had the ability to work outdoors.

Then we said, “Wait a minute. This depth-sensing camera could be used in other fields besides PCs and tablets. How do we apply that?” The use cases around scanning for printing, or robotics for collision avoidance, or drones for collision avoidance — that’s what the ideation triggered. Where could we apply this beyond PCs?

It was part of that exercise, looking at RealSense capabilities beyond PCs and tablets, that got me into drones. How do we apply this in a flying platform? If you recall, at CES 2015 we had two demos on stage. One was a media conferencing robot navigating using RealSense. The other was a drone with six RealSense cameras.

VB: To step back a bit, what was the thinking around going into drones at all? Intel is historically an ingredient maker, a chip maker. This is making the whole dish. You don’t make that move in every case where your chips are used.

Nanduri: We’re part of Intel’s New Technology Group. This was formed to scope out new opportunities and new markets and different ways we could operate within that domain. We’re not held to any kind of engagement model where we only have to do ingredients. We have that ability to innovate.

Nanduri: When it came to drones, we started with the RealSense technology. We worked with Yuneec in the consumer industry and got RealSense collision avoidance into the Typhoon H drone, which is more of a prosumer product. We made some acquisitions as well: Ascending Technologies, and later MAVinci. What we realized is that we now own some amazing technology, including full commercial-grade systems. These had very high redundancies built in; very accurate, high-precision systems. For commercial applications, like construction inspection, where you need high-accuracy reconstruction of your point of interest, these systems were deployed and widely used, not in the U.S., but in Europe and other markets. For enterprise and commercial use cases, the required capabilities are much more stringent.

We said, “Hey, we have a good value proposition here. There’s great demand in this space. Why should we be encumbered by history?” This was a completely greenfield opportunity.

VB: If you have a mastery of the technology already, it makes sense.

Nanduri: Right. The other part of it, even though Intel’s model has been traditionally making ingredients — you know very well that we’ve done a lot of system development, even up to PCs. We develop a lot of capabilities. We just don’t productize them. We have the knowledge to build things end to end, so we took the extra step to brand it and sell it as well. That know-how, building end-to-end systems, is very instrumental in pushing the technology and innovation forward.


Above: Brian Krzanich, CEO of Intel

Image Credit: Intel

VB: Brian has mentioned this before, that these are not just things that use Intel’s chips. They’re also products that generate an enormous amount of data.

Nanduri: Correct. That’s where it connects to our virtuous cycle. Each drone flight generates a lot of data, depending on the payload you’re using and the area you’re mapping. Each frame you capture with a high-resolution sensor or camera is about 25 megabytes. If you take 200 images, that’s five gigabytes. If you take 2,000 images, it’s 50 gigabytes. That’s one flight. These are the standard file sizes you deal with in this space. Then you have the processing behind it, applying tools and software that need a lot more compute.
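Nanduri’s back-of-the-envelope arithmetic can be checked with a few lines of code. This is only an illustrative sketch using the figure he cites (roughly 25 MB per high-resolution frame); the function name and parameters are hypothetical, not any Intel software.

```python
# Rough per-flight data-volume estimate, based on the ~25 MB/frame
# figure quoted in the interview. Illustrative only, not an Intel API.

def flight_data_gb(num_images, mb_per_image=25):
    """Return the approximate raw capture size of one flight in gigabytes."""
    return num_images * mb_per_image / 1000  # using 1 GB = 1,000 MB

print(flight_data_gb(200))   # 200 images  -> 5.0 GB
print(flight_data_gb(2000))  # 2,000 images -> 50.0 GB
```

The numbers match the interview: a modest 200-image survey already produces 5 GB of raw imagery before any photogrammetry processing, which is where the data-center demand comes from.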

It clearly fits into our virtuous cycle, the next set of machines that are going to be generating huge amounts of data: autonomous cars, robotics, virtual reality. Drones fit into the domain of huge data sets beyond what consumers can typically generate.

VB: Having data itself is considered a good thing or a priority these days, as far as making good use of all the servers out there in the data centers.

Nanduri: Right. But the workflows are different, how you apply them and deploy them. It’s good to have an end-to-end understanding of the workflow. These are new use cases. We’re just seeing the tip of the iceberg right now in how they can be applied. Talking about bridge inspections — there are more than 600,000 bridges out there in the United States. Think about manpower and individual safety. It’s dangerous to put all those folks out there on cherry pickers or harnesses. These machines can do the work more quickly and reliably. They can automate it in easier ways. The value prop becomes a no-brainer at that point.

If people look at how inspections are done today, people literally climb towers on harnesses, sitting there doing visual inspection. You don’t need to do that anymore. You still need to send people in to fix things, maybe, but you can reduce the number of times a person has to go up there, and you can inspect more often. The later you find a problem, the more expensive it is to fix. You can build a database to create those inspection reports more often, and then analyze and use compute to check for inspection issues earlier. Today you can’t do it as often because it’s expensive and unsafe. With drones you can do it more often, and at a fraction of the cost.

The question, then, is how do we get there? What needs to happen to have that become a widespread use of drones day to day?


Above: Intel and Imagineering’s drone show at Disney World

Image Credit: Intel

VB: How do you start addressing the notion that this generates too much data? Potentially you have to synthesize what goes into the data center and what’s useful to have, given how many hours of video these drones can generate.

Nanduri: The notion of too much data — we always wonder about it, but it never seems to me like there’s enough data. [laughs] We just need to know how to use it. I’m constantly surprised by how much more data we can process and bring in to improve productivity.

Let me give you the principles. Once we have the technology, there are a few other pieces. I break this down into two phases. One is, you have to consider regulation and what’s allowed, but I tend to look more at where this goes from here. Drones are getting better and better. The key aspect is how you automate the process, where you have the human on the loop — monitoring and taking precautions — but not in the loop. You don’t need a human to do everything, every task that’s part of the drone workflow.

To create that automation, the systems need to get smarter. A lot of the tools already exist for things like automatic takeoff, automatic landing, waypoints, GPS coordination. You can bring collision avoidance to it. They have some redundancies built in. You want to make sure there’s no single point of failure in the system. What happens if a rotor fails, or a communication link fails, or the compass is corrupted by magnetic fields? You think of it from the perspective of bringing redundancy and safety features in. Then you want to think about the point of interest. How do I create the flight plan? How do I adjust for the terrain? How do I bring geo-fences in?
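One of the safety features Nanduri lists, a geofence, is simple to sketch. The check below is a toy illustration under assumed limits (a 500 m circular fence around the launch point and a roughly 120 m ceiling, the metric equivalent of the 400-foot limit he mentions later); the function name and limits are hypothetical, not Intel’s flight-planning software.

```python
import math

# Toy geofence check: before accepting a waypoint into a flight plan,
# verify it stays inside a circular fence around the launch point and
# below the altitude ceiling. All names and limits are illustrative.

def waypoint_allowed(x_m, y_m, alt_m, fence_radius_m=500.0, ceiling_m=120.0):
    """True if the waypoint is inside the horizontal fence and below the ceiling."""
    horizontal_ok = math.hypot(x_m, y_m) <= fence_radius_m  # distance from launch
    vertical_ok = 0.0 <= alt_m <= ceiling_m  # ~120 m is roughly the 400 ft limit
    return horizontal_ok and vertical_ok

print(waypoint_allowed(300, 200, 100))  # ~361 m out, 100 m up -> True
print(waypoint_allowed(450, 450, 100))  # ~636 m out -> False
```

A real autopilot layers many such checks (rotor health, link quality, compass sanity) so that no single failure goes unhandled, which is the redundancy point Nanduri is making.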

You start to think about an automated system, then you think about acquiring the data, capturing that information. The sensors can change. We’ve talked about different payloads. Some of it may be thermal. Some of it may be high-resolution imagery. Some of it may be video, or even multispectral for agriculture use cases, or a methane sensor for finding gas leaks and things like that. Automating that workflow so it becomes efficient means a whole bunch of work to be done. The systems themselves are getting more stable and reliable, but if you look at where we need to go, it’s a matter of bringing that into a safe operational process.

A lot of activity is going on in that area. Our focus is on making sure that the one-click inspection approach can be realized. Then it has to work with what the regulations allow. Today it’s still visual line of sight. You need to fly below 400 feet. But as the safety capabilities get more and more demonstrated — NASA is working on UTM, Unmanned Traffic Management. Those principles will help us go beyond visual line of sight and give us a framework to go fully autonomous.

We’re innovating a lot around that workflow. One technology you might have seen is the light show, where we fly 500 drones. We automated that whole workflow — charging, recharging, data communications, everything controlled through a computer with a single operator. It’s amazing for people to experience. We flew at Disney. We’ve done more than 90 operational flights in public, outside of all the testing. Two shows every night, except on a couple of days when we had stormy weather.

VB: That was a pretty amazing video.

Nanduri: More than the video, I don’t know if you’ve heard any anecdotes about it, but it was packed. People were waiting for the show every night. They’d surround that area by the waterfront at Disney Springs. It was amazing to see the audience reaction. People had never seen anything like it, a revolving Christmas tree in the sky. The initial experience for us was just a four-minute show. We could do longer. But it was a learning experience, both for Disney in creating animation with it and for us gaining operational experience over such a long period.

Since then, we’ve seen a lot of interest in how this can be applied for all kinds of shows. The fundamental difference: there are a lot of comparisons to fireworks, but with fireworks you have pollution. They’re obviously not reusable. The reusability here is a great value prop. We can give artists a whole new kind of canvas and paint now. With fireworks, they don’t have the flexibility.

VB: What were some of the key learnings on that project?

Nanduri: From the technology side, flying 500 drones is about control theory. How do you fly these without having them collide with each other? If you think of this as a robotics problem, or an autonomous flight problem, that aspect is a control theory problem. It’s not easy. It looks very elegant and simple, but getting that result out of software is hard. At the end of the day you’re actually controlling machines.

We had to design the whole concept with that in mind, and that was the breakthrough. We went from flying 100 drones last year, with technology that never could have scaled, to breaking the 500 record this past year. Now, to use it for something like an animation show, the creative artists can use 3D tools. They use Maya and other tools to create their expressions. We completely automated the interface to take that and program it into what you can call the drone flight line. All of that is automated. We built a whole software stack for it, including being able to simulate in advance. It’s the end-to-end experience in terms of how we can operationalize something like this, not just showing a one-time demonstration.
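The core constraint behind the collision-avoidance problem Nanduri describes can be sketched as a separation check: at every animation keyframe, no two drones may come closer than some minimum distance. The code below is a toy illustration of that constraint; the function, and the 1.5 m spacing, are assumptions for the example, not Intel’s light-show software (the real separation value isn’t public).

```python
import itertools
import math

# Toy fleet-separation check for one animation keyframe.
# Illustrative only; the minimum spacing is an assumed value.
MIN_SEPARATION_M = 1.5

def frame_is_safe(positions, min_sep=MIN_SEPARATION_M):
    """positions: list of (x, y, z) drone coordinates in meters for one keyframe.

    Returns True only if every pair of drones is at least min_sep apart.
    """
    for a, b in itertools.combinations(positions, 2):
        if math.dist(a, b) < min_sep:
            return False
    return True

# Three drones spaced 2 m apart pass; moving one within 0.5 m fails.
print(frame_is_safe([(0, 0, 10), (2, 0, 10), (4, 0, 10)]))    # True
print(frame_is_safe([(0, 0, 10), (0.5, 0, 10), (4, 0, 10)]))  # False
```

A production system would run this kind of check over every frame of the choreography, in simulation, before anything flies, which matches the simulate-in-advance workflow Nanduri describes.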

Now the technology scales, which is where it ties back to commercial applications. I talked about this automation process. How do you scale and save costs? It’s about how you parallelize things. How can I do inspections with multiple drones? How can I do search and rescue where you streamline the fleet management to have thermal cameras, other kinds of cameras looking for people at night? There’ll be a day when we can apply this technology to save lives. At that point, the public perception of value quickly goes up in terms of new technology adoption.

VB: There’s another idea for flying billboards that might come to pass.

Nanduri: Right. The light show leads to ideas like ads in the sky, logos in the sky. We’ve already shown it. We’re pretty amazed at the reactions. There may be a business in that. It’s pretty interesting to see how people want to apply this in ways that even we couldn’t think of.

VB: What’s your road map for what lies ahead, all the things you have to get done?

Nanduri: We have a couple of products, a fixed-wing product and a multi-rotor, the latter of which is the first Intel-branded drone for commercial use cases. We call it the Intel Falcon 8+. Both have yet to come to the U.S. market. That’s our immediate focus. We already announced them, and we’re looking forward to the last part of the execution, rolling them out.

We also have the flight planning algorithms and software, automatic tools for seamless integration of flight planning capabilities. We have some of that technology from our fixed-wing product, and we’re bringing it across our whole product road map. Having more automation in the workflow is an area where we’ll continue to push the envelope.

We’re building more safety features with RealSense. What we brought in with Yuneec, we want to integrate that into the commercial systems as well. You can think of a couple of use cases. Say you want a drone to fly within three meters of the inspection area and hold that position, or you want to fly into GPS-denied environments. If you fly under a bridge and the signal isn’t strong, how do you make sure that stays seamless? These are the kinds of capabilities that enhance the robustness of the UAV itself.

The light shows, we’ve operationalized those and they’re getting better. We’re deploying them at scale. Then we’ll take that same fleet management technology and apply it to the commercial side as the regulations continue to open up. We had to get a waiver from the FAA to do the light shows. Using that information, we’ll continue to work with regulators on how that can become a mainstay in the future.


Above: Intel is diving deep into drones.

Image Credit: Intel

VB: How do you think you’ll generally proceed with getting people comfortable around the idea of drones? They have this ability to go anywhere, to take pictures, to record.

Nanduri: In terms of the consumer side, there’s a lot of uptake. You go to the park and see more and more drones flying. Now, with any technology, there’s always a perception issue. You think back to when cell phones first had cameras on them. We went through the same learning curve. How do you address privacy? There are already frameworks for handling these issues. I don’t think drones by themselves form a new issue. It’s an extension of what’s already been addressed. A decade from now, I think, people will look back and say, “Wow, I didn’t realize people even wondered about this.” But public perception is always a journey.

We’ve asked ourselves how to get drones into the public’s perception. The light shows have been a phenomenal way of bringing that in front of people. They start to understand drones. They say, “Wow, I never realized we could use them this way.” People become more familiar. That’s one mechanic for bringing it to an audience, and we want to continue to use that channel. We want to spread that experience as widely as possible. It’s a great way to tell the drone story.

The other part is safety features, making sure the public understands that these are pretty safe if you get the right designs and the right attributes in. The light show drones we built are only 280 grams each. That’s about the weight of a volleyball. I have pictures where I’m trying to catch it in my hand. Hopefully, we’ll be able to expand that conversation.

VB: Where do you see the technology evolving, and at what rate? It seems like, with a 15-minute battery life on most drones, we’re at the stage where cell phones were at one point.

Nanduri: You’re right. There’s also innovation in spaces like hybrids. Is that the right answer? How do you extend the technology we have today? You look at electric cars. We started with 40-mile ranges, and now we see cars with 250-mile ranges. The battery technology hasn’t fundamentally changed. It’s how you apply and deploy it. I definitely see innovation coming.

We do have to live with some constraints. I wish battery technology advanced like Moore’s Law does. But innovation still happens in how you work around those challenges. A year back, if you told somebody there’d be a day when 100 drones would fly together with a single pilot, they wouldn’t believe you. Now we’re already at 500.

This is what keeps me excited. You can see where drones can be applied — first response, search and rescue, construction, infrastructure. Drone racing might become a popular entertainment. There are all the consumer use cases. It’s a personal flying camera. If people think of drones as a camera, it changes the conversation in terms of the consumer thought process. People carry around selfie sticks now. You could do the same thing with a drone hovering above you.

The technology is going to make all this happen. That’s my firm belief. A decade from now, you’ll see wide acceptance of this technology.