Presented by Qualcomm
Edge computing is accelerating and is already starting to reshape the future of the datacenter, 5G edge computing, and 5G infrastructure. It’s transforming industries, business models, and experiences, connecting unconnected things, and making it possible to reimagine how the world works, plays, and lives.
In this VB Q+A, Mike Vildibill, the new VP of Product Management at Qualcomm Technologies, talks about the explosive demand for AI inferencing in the cloud, innovative cloud-to-edge AI solutions that are galvanizing the market, and his vision for Qualcomm Technologies in his new role.
VB: What’s exciting you most in the AI cloud space right now?
MV: During my career, we’ve seen phenomenal movement of processing from local on-prem at the enterprise into the cloud. We’ve seen the implications of this not only for the IT industry, but for how we compute, how we manage data, how we manage our personal data, and how we communicate with each other. And now there’s been another massive shift. We’re watching processing migrate from the cloud to the edge. Computing now has to happen closer to the action, to where the people and devices are at the edge of the cloud, not back in a data center in another part of the continent. This is again having profound implications on things we see in our daily lives.
By doing a lot more immediate computation on that data closer to us, we get lower latencies, or faster access, as well as a more personalized experience and more computing performance. It can be done at low power, close to you. This general migration to the edge means that intelligence is moving toward the edge, along with everything that goes with it.
VB: What would you say is the biggest news in this generation of chips that’s facilitating everything you just talked about?
MV: To begin with, I believe that 5G is a game changer. It allows for very low-latency and high-bandwidth communications to anywhere, including remote locations. You no longer need to be next to a data center, or to have a big, expensive wiring closet full of telco gear. A site can be connected to the internet via a 5G-enabled mobile device, for example, enabling big use cases across industries: smart security cameras in stores that can also be used for stock tracking, off-limits areas, and theft alerts; airport monitoring cameras to track loitering, lost children, off-limits areas, and suspicious packages; even automatic payment at the fast-food drive-through line, including unpaid alerts. And so much more.
But you also need power efficiency. It comes down to high performance, delivered in a very power-efficient way. That’s where Qualcomm comes into the story, with the Cloud AI 100 product line, for example. It combines extreme power efficiency with extreme performance, which allows for deployment in ways and in places that otherwise simply couldn’t be conceived. For the first time, we have a single product and a single software tool chain that can span that extreme expanse, providing multiple orders of magnitude in performance with one tool chain, one set of tools, one interface. Cloud AI 100 is doing that.
VB: What do you see as a business leader’s biggest challenge right now as the technology leaps ahead?
MV: I talk with my counterparts in other companies often, including the startups. It’s very clear that the market is moving very fast, the market dynamics are evolving very quickly, and requirements are changing. The whole AI and inference market is still rather nascent. It’s still going through a lot of growth. You need to keep your eye on the research, what’s being done in academic environments, to see what the next big disruption is. But I guarantee there will be a disruption, because it’s so young in its life cycle. It’s not yet gotten to a steady state. As a business leader, making sure that we’re prepared for disruption and that we can not only manage through it, but actually take advantage of disruption when it happens, that’s the exciting part.
VB: How does an all-in-one product simplify tasks for IT business leaders and those who work with them?
MV: Let me use the following example. Think of yourself as a systems integrator or an OEM. Your job is to support your enterprise customers. Without naming names, I can think of enterprise companies that sell furniture directly to the public. They have good meatballs, by the way, at warehouse scale. I’m using that as an example of a type of company that needs a lot of computer vision. They want to watch what’s going on in the store. Everything from people to safety to inventory in the store. There’s a phenomenal amount of stuff that goes on in a store like that. But that same company has their own supply chain, their own distribution chain, their own business logic that runs in the background, in the cloud or on-prem — they have mainframes and big supercomputers crunching a lot of numbers in a central location as well.
As you can imagine, a company like that does AI in their data center — running recommendation workloads, doing analytics, doing language processing. They have a website, so they’re doing NLP with people who are trying to talk to the chatbots on the website. But that same company is putting these low-power computer vision appliances in their stores. From their perspective, it’s a continuum. There’s no natural divide between what’s in the store and what’s in the data center. There are too many shades of gray in between. Some of them put a small data center in the store. Some of them put the store logic in their data center. It’s not segmented.
As soon as they can have a single partner, a single tool chain, a single set of tools to debug, optimize, and scale, they’re happy. They really want to have a single architecture from soup to nuts. What Qualcomm is bringing to the table is our strategy and our belief in open frameworks. What our customers are telling us very clearly is that this also lets them very easily integrate what we do with their other IT infrastructure that likewise adheres to open industry standards and open-source models.
Some vendors in the AI space are pursuing a walled-garden strategy, which makes it that much more difficult to easily and seamlessly interoperate with IT equipment that doesn’t come from that walled-garden vendor. To answer your question directly: there’s a real advantage to going soup to nuts, and an even greater advantage in doing so with an open framework, an open-APIs model.
VB: What is Qualcomm’s vision for the distributed intelligence space?
MV: While Qualcomm Cloud AI solutions are playing their role on the edge-cloud and server side — inferencing large-scale deep neural networks — there is also AI running purely on your device, which is called on-device AI. For example, our Qualcomm Snapdragon 888 5G Mobile Platform has a very powerful 6th-generation Qualcomm AI Engine.
The number of neural networks running on any given smartphone today is staggering, and it’s all ultra-secure, meaning everything is processed locally without leaving your device. If you combine our on-device AI technology with cloud AI processing, and then throw in the speed of our 5G solutions as the link, you have the basis of Distributed Intelligence, where AI is distributed across channels to power new experiences. Qualcomm is an undisputed leader in all of these spaces.
VB: Where does the new Cloud AI 100 go from here? What is the future?
MV: We’re going to make the investments to ensure that we maintain our leadership position in power efficiency. We believe this unlocks much of what we’ve talked about today. Power efficiency requires extreme performance and extreme efficiency; put the two together and you get an industry-leading power/performance metric, and that’s something you can expect to see Qualcomm continue to push. That’s in our wheelhouse. It’s what we do as a company in the mobile markets and other markets. It’s what we need in order to unlock new segments. We believe it’s fundamental to all these new use cases, even in autonomous driving. We need exactly this: extreme performance and extreme efficiency. First and foremost, that’s it.
We’ll continue to expand into adjacent markets. We’ll continue to improve and invest in software and SDKs and tool chains. We’ll continue to support open source and embrace open-source frameworks. We’re going to continue the playbook you see from us today. Expect more from Qualcomm around power efficiency going forward.
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact email@example.com.