Join top executives in San Francisco on July 11-12 to hear how leaders are integrating and optimizing AI investments for success.
Connected devices and instant mobile access to data are common facets of modern life, but the fact is that we’ve only just begun the transition to a digital universe. In the near future, autonomous cars will buzz through our streets; everything from our shoes to our eyeglasses, and even our own body parts, will be connected; and digital agents will assist us at every turn, cataloging everything we do.
It sounds scary, and it will most certainly produce a number of thorny issues surrounding privacy, self-determination, and even what it means to be human. But underpinning all of it will be the edge, the layer of infrastructure currently under development that will provide much of the processing and storage needed by devices to carry out their real-time functions.
Automated and autonomous on the edge
By its nature, the edge will be widely distributed. Small nodes of compute and storage will exist in towns and neighborhoods, along highways and power lines, and virtually anywhere else they are needed. They will also be unmanned, for the most part, and will have to be enabled with a great deal of automation and autonomy to accommodate the massive and diverse requirements of a connected world.
This sounds like a job for artificial intelligence.
The edge, after all, is an ideal environment for AI, largely because it’s still in the greenfield stage of development. Unlike in the datacenter, there are no legacy systems to contend with, no processes to be reworked, and no code to be altered. AI becomes the foundational element of an all-new data ecosystem. Dell Technologies, for one, is already churning out edge-specific AI solutions, many of them fully validated and integrated across compute, storage, networking, software, and services for optimized AI workloads.
If anything, the pandemic has accelerated the drive to infuse AI into edge infrastructure, says IoT World Today’s Callum Cyrus. As remote work and ecommerce took off, organizations turned to machine learning and other tools to overcome the significant operational challenges they faced. But this only increased the data load at the edge, which now requires greater use of AI to maintain the speed and flexibility that emerging applications demand. A key development is a new generation of intelligent chips, which will soon span the entire edge processing spectrum, from general, entry-level machine learning cores to specialty A/V and graphics processors and advanced neural network microcontrollers.
Putting AI to good use on the edge
A look at some use cases for AI on the edge shows just how powerful this new intelligent infrastructure can be, notes XenonStack’s Jagreet Kaur. Once you empower systems and devices with high-level decision-making capabilities, you can push a wide range of advanced applications to users. Among them are digital map projections, dual-facing AI dashcams, advanced security for shops and offices, and broader use of satellite imagery. Virtually every function that enters the digital ecosphere will be empowered by AI before long.
Organizations looking to strategize around these developments should keep three factors in mind, says Intel VP Brian McCarson. First, open source becomes a key enabler, because AI thrives on ready access to as much data and as many resources as possible. Second, video will become a major asset as organizations evolve in the new economy; AI’s ability to leverage video at the edge will be a primary driver of success, accelerating the need for greater investment in both AI platforms and infrastructure. Finally, change will come rapidly on the edge as new systems and applications eat the old ones. Whatever you deploy on the edge now, be prepared to revamp it sooner rather than later.
Note that AI development on the edge shouldn’t take place independently of development elsewhere on the enterprise data footprint. Interoperability among the datacenter, cloud, edge, and any other infrastructure that comes along will be crucial — again because AI is only as good as the data and resources it can leverage.
While it may be tempting to view the edge as simply an extension of legacy infrastructure, the reverse is true: the edge is the new foundation for the services that affect people’s lives. In this light, AI at the edge should be the driver for AI in the cloud and the datacenter, at least if your business model is centered on fulfilling user needs, not your own.