With the exponential growth in enterprise data from a proliferation of Internet of Things (IoT) devices, cameras, sensors, and other connected equipment, companies are increasingly turning to edge computing.
Indeed, research firm Gartner notes that “more than 90% of enterprises start with a single unique use case for edge computing; over time, a typical enterprise will have many.”
VentureBeat spoke with David Sprinzen, VP of marketing at Vantiq, about what is driving interest in edge computing and what organizations need to do to be successful in this environment.
Vantiq is a low-code application development platform for building and deploying real-time edge-to-cloud solutions. The platform enables developers to quickly create distributed intelligence for systems such as smart buildings, smart cities, and smart grids/energy management. It also enables the monitoring of physical safety and security, ultimately digitizing and automating business operations.
VentureBeat: What is driving growing interest in edge computing, and what types of organizations are most likely to embrace it?
David Sprinzen: There are three key reasons why companies are moving to the edge:
1. It decreases response latency by bringing computing power out into the environment.
2. It increases the level of security and sovereignty of data by keeping it on local premises rather than transmitting sensitive data through the cloud, which reduces the attack surface.
3. It can address the explosion of data coming off of enterprise systems and sensors by filtering and processing that data locally.
Reasons 1 and 2 are the primary market drivers for edge computing at the moment. But research firm Gartner cites the third reason as the primary driver for edge computing over the next few years. This is due to the proliferation of data across enterprises.
In fact, Gartner claims that by 2025, more than 75% of all enterprise data will be processed on the edge. That is a tectonic shift in the way businesses must build and think about their data systems.
Edge computing will serve virtually all industries. Early leaders are being driven by requirements of computationally intensive technologies such as computer vision. For example:
- Manufacturing – in manufacturing facilities, AI is being used to detect errors across the assembly line.
- Retail – retailers are building customer applications that require imperceptible latency for things like routing within a store and helping customers find items on their shopping list, as well as augmented reality experiences.
- Utilities/naturally distributed resources – utility companies need to distribute the processing and management of their systems out into the field, both for the performance benefits of computing locally and for resiliency and reliability: if connectivity is lost at a remote location, the application still runs without needing to communicate with the cloud.
VB: How does adopting edge computing practices typically impact an organization’s existing systems and applications?
Sprinzen: There are two phases to leveraging edge computing. Phase one is a “cloud-out” approach, where you take what you’re doing on the cloud and move it to the edge. This should improve performance and decrease latency. While this is a valid approach, it will only deliver incremental benefits.
Phase two is considered an “edge-in” approach. Here, systems and applications are built specifically for the edge. This means they are edge-native and can fully take advantage of distributed computing. With this approach, you can still bring data into the cloud as needed. But the edge is the primary computational resource for the system and delivers a full range of benefits as discussed above.
VB: What are the top benefits, and limitations, of edge computing practices?
Sprinzen: By moving to the edge, businesses can become much more aware of, and reactive to, what is happening in their environments. That, in turn, enables them to unlock new operational efficiencies and become more effective with the resources they have.
At the same time, there are a couple of major challenges when it comes to building or deploying edge solutions. One is that you must pick the right kind of edge infrastructure to match the type of application you are developing.
For example, you can run applications on the device edge, also known as the far edge, such as with an AI-enabled smart camera. Or you could have local servers that are doing the processing, such as a computer that’s running the factory. Or you could have network edge MECs (Mobile Edge Computing). These are basically mini-clouds that have significantly more processing power but may be shared across many applications and tenants.
In other words, you have to think about what infrastructure you need for different applications. That includes deciding whether you must purchase that hardware or whether existing resources are available, such as a telco MEC. Then you need to work out where each piece of the application should run and exactly how much computing resource it will need.
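To make that concrete, here is a minimal sketch of how an application's components might be mapped onto the edge tiers described above. The component names, tier labels, and resource figures are purely illustrative assumptions, not Vantiq's model or any vendor's API.

```python
# Hypothetical sketch: describing where application components run and what
# they need, across the edge tiers discussed above. Names and numbers are
# illustrative only.
from dataclasses import dataclass


@dataclass
class Placement:
    component: str    # piece of the application
    tier: str         # "device-edge", "local-server", or "network-mec"
    cpu_cores: float  # compute the component is expected to need
    memory_mb: int


# Example plan: inference stays on the far edge, aggregation on a local
# server, and heavier analytics on a shared network-edge MEC.
plan = [
    Placement("frame-inference", "device-edge", cpu_cores=1.0, memory_mb=512),
    Placement("event-aggregation", "local-server", cpu_cores=2.0, memory_mb=2048),
    Placement("fleet-analytics", "network-mec", cpu_cores=8.0, memory_mb=16384),
]

for p in plan:
    print(f"{p.component}: {p.tier} ({p.cpu_cores} cores, {p.memory_mb} MB)")
```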
It should also be noted that in the cloud you have vertical scalability: resources can easily spin up or down as applications require additional vCPUs. Edge hardware, by contrast, is resource constrained. That means you need much more visibility into what resources are available and how they are being used.
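As a rough illustration of that kind of visibility, the sketch below checks whether an edge node has headroom before taking on more work. It assumes the third-party psutil package is installed, and the thresholds are arbitrary examples rather than recommendations.

```python
# Minimal sketch of checking what resources an edge node has available before
# scheduling more work on it. Requires the third-party psutil package;
# the thresholds are arbitrary examples.
import psutil


def node_has_headroom(cpu_limit: float = 80.0, mem_limit: float = 75.0) -> bool:
    """Return True if CPU and memory usage are both under the given limits."""
    cpu_used = psutil.cpu_percent(interval=1)   # % CPU over a 1-second sample
    mem_used = psutil.virtual_memory().percent  # % of physical RAM in use
    print(f"CPU {cpu_used:.0f}%, memory {mem_used:.0f}%")
    return cpu_used < cpu_limit and mem_used < mem_limit


if __name__ == "__main__":
    if node_has_headroom():
        print("Node can take on additional work.")
    else:
        print("Node is constrained; place the workload elsewhere.")
```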
The other factor that edge computing introduces is location, since the compute hardware now sits in a specific physical place.
With cloud computing, you don’t have to think about the location of where processing is occurring. But with the edge, you do. This is especially true when it comes to applications that are dealing with things that are moving, such as tracking expensive assets or communicating with vehicles.
In an edge-enabled computing environment, it’s not just a matter of what should be on the edge, but which edge it should be on and how the workloads get migrated between computing locations.
VB: What role do edge-native applications play in successful adoption, and what are they all about?
Sprinzen: In the same way that cloud-native technology is required to fully take advantage of the cloud, edge-native technology and applications are required to fully take advantage of the edge. You need architecture, data models, a flexible topology, and communication strategies to handle distributed rather than centralized computing. Application development platforms that are built to distribute across many edge devices are going to streamline and simplify how you’re able to take advantage of the edge.
VB: What are the most important things that data infrastructure professionals should know about edge-native applications?
Sprinzen: Let’s consider what makes an application edge-native. One of the requirements is that it’s easily partitioned across many different locations. To enable that you need loose coupling between the different nodes in the network. That requires an event-driven architecture to handle many nodes that are all communicating asynchronously with one another.
Alternatively, if you have tight coupling, as in the case of request/response, the system is only as strong as its weakest link. If a node goes down, the whole system can be interrupted.
An event-driven architecture enables asynchronous communication directly between nodes in the network. That provides greater flexibility in terms of where your processing is occurring, and resiliency if anything goes down.
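For a sense of what that loose, event-driven coupling looks like, here is a minimal sketch using Python's asyncio and an in-memory queue as the event bus. The node names and events are hypothetical, and a real edge deployment would use a distributed broker (for example, MQTT) rather than a single process.

```python
# Minimal sketch of event-driven, loosely coupled nodes, using an in-memory
# asyncio queue as the event bus. A real edge deployment would use a
# distributed broker; node names and events here are hypothetical.
import asyncio


async def sensor_node(bus: asyncio.Queue) -> None:
    """Publishes readings without knowing who consumes them."""
    for reading in (21.5, 22.0, 99.9):
        await bus.put({"type": "temperature", "value": reading})
        await asyncio.sleep(0.1)
    await bus.put({"type": "shutdown"})


async def alerting_node(bus: asyncio.Queue) -> None:
    """Reacts to events as they arrive; no direct request/response link."""
    while True:
        event = await bus.get()
        if event["type"] == "shutdown":
            break
        if event["type"] == "temperature" and event["value"] > 90:
            print(f"ALERT: reading {event['value']} exceeds threshold")


async def main() -> None:
    bus: asyncio.Queue = asyncio.Queue()
    # The two nodes run concurrently and share only the event bus.
    await asyncio.gather(sensor_node(bus), alerting_node(bus))


asyncio.run(main())
```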
A traditional three-tier client/server database will fundamentally not work for a many-node distributed edge-enabled environment. The database is really good if you have everything centralized, because there is a single system of record.
But the moment you're dealing with a distributed application, there is no such thing as centralization. Nodes must be able to act autonomously. Trying to manage a single database across them will introduce concurrency issues.
Ultimately, when information is distributed across different nodes in the network that all contend with one another, a database-oriented model will not work.
VB: What benefits do leaders in edge computing enjoy that other organizations do not?
Sprinzen: There is a common thread among organizations that are emerging as leaders in using the edge. They tend to also be leaders in the way that they’re leveraging technology to optimize and automate business operations. The edge becomes the frontier of how digital systems are going to start serving the operational needs of companies because it allows you to focus on more than just IT.
Much research points to the edge being the missing piece of the puzzle in enabling the massive convergence between IT and OT. In this case, IT is the digital systems and data backbone, and OT (operational technology) covers day-to-day operations. OT provides the reactivity, the situational awareness of what's happening in the business, and the ability to automate parts of daily operations.
The edge becomes the intersection of those two domains. It enables an organization to merge technologies like the cloud, IoT, AI, and business applications, with the operational side of the organization. This intersection requires technology that can handle the real-time asynchronous nature of distributed applications.