In 2014, data was declared the “oil of the digital economy,” and the analogy remained accurate until recently. In 2020, however, data resembled oil only in its parallels to that year’s oil glut: too much production, not enough consumption, and the wholesale commoditization of storage.
Today, the overriding demand is for data’s refined end product — business insight. And the most crucial link in the data insights supply chain is compute power.
This makes the infrastructure of CPU cycles that enables distillation of value from mountains of data the new oil of the digital economy. And it’s driving some dramatic changes in the computing hardware ecosystem. Here’s what I mean:
Processing power doesn’t just belong to Intel anymore
Cloud vendors like AWS came to understand that the core differentiation of their offerings had little to do with data itself and everything to do with what customers can get from their data. Yet deriving value from massive datasets spread across multiple cloud storage instances, and leveraging advanced AI and ML-powered graph analytics and other analytics, takes a lot of processing juice.
The exponential growth in demand for processing capacity (and the costs associated with it) was what initially drove organizations to move to the cloud. Yet once the move to the cloud was a fait accompli, cloud vendors could take a long, hard look at their own processing capabilities.
What they saw was that processing was the hands-down biggest variable cost in the cloud environment. And they realized that buy versus build priorities had flipped. Just as Amazon had verticalized deliveries — lowering costs and competing with UPS and FedEx — cloud vendors could verticalize chipmaking, or outsource to competitors other than Intel and AMD.
So they did.
AWS dipped its toes in the silicon waters in 2018, when it began offering services running on its first-generation Graviton chips, designed with technology licensed from Arm (which NVIDIA is in the process of acquiring). This year, AWS dove headfirst into the chip pool, launching services based on Graviton2, which it touts as massively faster and cheaper than its Intel-based offerings. AWS also announced a new Arm-based supercomputing service two weeks ago.
In 2017, Microsoft announced it was committing to use Arm-based chips for cloud purposes. It was among the first to test the Altra processor from Arm server CPU start-up Ampere in March, actively evaluating the chip’s capabilities in its labs to help bolster its hyperscale data centers. Two years ago, Google launched its Tensor Processing Unit (TPU) 3.0, a custom application-specific processor that accelerates machine learning and model training.
Meanwhile, Apple announced in June that it would gradually transition away from Intel-based chipsets in its personal computers, and more recently stated it was going to produce its own cellular modem chips too.
What comes next?
What we’re seeing is the decoupling of processing power from its traditional members-only club. Like oil, compute power is moving in the direction of storage and other commodity services. And just as airlines care about oil prices only inasmuch as oil’s derivatives underpin their service offering, enterprises will treat computing power as a means to an end.
Cloud vendors will relentlessly pursue ever-cheaper processing power. The entire compute layer will be commoditized, and we’ll see apps routinely running across tens of thousands of CPUs in parallel. Companies that embrace multicloud will be able to split processing-intensive tasks between providers based on highly competitive, micro-segmented incremental pricing.
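To make the idea concrete, here is a minimal sketch of what price-driven task splitting across providers could look like. The provider names and spot prices are purely hypothetical, and a real scheduler would also weigh data-egress costs, latency, and quotas; this only models the price signal the paragraph describes.

```python
# Hypothetical sketch: route compute-intensive tasks to whichever cloud
# provider currently offers the cheapest incremental price per CPU-hour.
# Provider names and prices below are illustrative, not real quotes.

def cheapest_provider(prices: dict[str, float]) -> str:
    """Return the provider with the lowest current price per CPU-hour."""
    return min(prices, key=prices.get)

def split_tasks(tasks: list[str], prices: dict[str, float]) -> dict[str, list[str]]:
    """Assign each task to the currently cheapest provider.

    With static prices every task lands on one provider; with live,
    micro-segmented pricing the assignments would shift per task.
    """
    assignments: dict[str, list[str]] = {p: [] for p in prices}
    for task in tasks:
        assignments[cheapest_provider(prices)].append(task)
    return assignments

# Illustrative spot prices in dollars per CPU-hour.
spot_prices = {"cloud_a": 0.034, "cloud_b": 0.029, "cloud_c": 0.041}
plan = split_tasks(["etl-job", "graph-analytics", "model-training"], spot_prices)
```

In this toy example, all three tasks go to `cloud_b`, the cheapest provider at the moment of scheduling; the point is that the choice is a commodity-price decision, not a vendor-loyalty one.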
Computing power will become a commodity in the full and traditional sense of the word, too. It will be traded on markets like any metal, energy, livestock, or agricultural commodity. Traders will be able to arbitrage processing cycles and hedge with processing futures.
This shift will force cloud vendors to rethink themselves. Differentiation will be based on computing cycle availability and the quality of the algorithms used for AI/ML analysis.
What does all this mean for Intel and AMD? Unless they make some radical changes, I think the expression “old soldiers never die, they just fade away” may be apt. Consider high street retail, whose demise began with the advent of widespread e-retail and accelerated during the pandemic. With the shift to cloud computing, the demand for CPU power on the desktop and in the data center will continue to shrink. And if cloud vendors make their own processing power, we could see traditional chipmakers go the way of Sears.
The bottom line
The burgeoning demand for insights from the petabytes of data that continue to flood into enterprise cloud storage is completely reshaping the computing ecosystem. As cloud vendors step into new verticals to take control of their computing supply chain, the old order of processors faces a time of dramatic and fundamental change.
David Richards is co-founder and CEO of WANdisco.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.