Presented by Supermicro
With cloud computing accelerating worldwide, companies need to determine what best fits their needs in compute, storage, networking, and the software stack. Join this VB Live event to learn how open clouds reduce latencies, improve innovation, and more.
“Thinking about openness is the key to being an IT leader in the cloud computing space,” says Michael McNerney, vice president of marketing and network security at Supermicro. “And that means building your cloud to take advantage of performance and innovation not just today, but in 18 months, 36 months, and on and on.”
Cloud infrastructure has become a catch-all for modern datacenter architecture, and over the last decade the open cloud has been leveraging innovation in open-source software and industry-standard hardware, McNerney says. And that innovation continues at a rapid clip. Moore's law, the observation that hardware doubles in performance roughly every 18 months at the same cost, is not slowing, and is perhaps even accelerating when you include CPU, memory, storage, and I/O.
“The key for IT leaders right now is to make sure they stay on the right side of innovation, and live on that lower-cost, better-performance curve that the cloud has been driving over time,” he says, “and stay able to leverage rapid improvements to deliver new and more products and services.”
But there are a lot of solutions out there, and determining what works for your technology, your industry, and your company overall means weighing a huge number of variables. The key, McNerney says, is nailing down your basic business strategy – everything from what your decision points are and how you compete in the market to what your value-add is, and what your competitive landscape will look like in the future.
“You want to optimize both your software and your hardware for optimal performance and efficiency – the days of the same general-purpose server being used for every application are gone,” he says. “Every workload requires an optimized platform.”
For AI, you need high-throughput systems that can manage the high power consumption of CPU/GPU solutions. Storage requires a spectrum of offerings, from all-flash NVMe boxes for highly responsive applications to scale-out 60/90-bay storage systems for archival. And the same is true in CPUs now. Intel’s newest processor has better instructions per clock and more cores, but the major performance enhancements came from accelerators for specific workloads, like crypto, networking, Speed Select for CSPs, and more.
And from a cost and innovation perspective, the traditional lock-in model that IT departments have embraced in the past is prohibitive.
“That’s why we come back to the open cloud,” he says. “When you keep your environment open, you can control costs, leverage multiple vendors, and optimize by taking advantage of the new and best technology without worrying about being locked in to a specific vendor.”
The industry has shown again and again that both open standards and open source drive innovation, McNerney says.
With open source, technology is shared with a community of people who contribute to its development, which drives quality and innovation. When multiple parties can contribute to a specific implementation without locking you into it, you can draw from a rich pool of tools and applications.
Open standard interfaces mean that there is a standard layer to build below or build on top of. That shared standard interface layer is the key to innovation. He points to the evolution of storage: all-flash NVMe drives delivered an order of magnitude better performance than traditional media. But the storage interface didn’t change immediately; NVMe drives supported traditional protocols and applications would get the benefit simply by changing the drive. The open standard interface meant over time the software could be adjusted to maximize the NVMe protocol and deliver even better application performance.
Open standards across the features of the cloud allow innovation at every level and let companies take advantage of an ecosystem of vendors and solutions for every problem or project.
In the end, the most important decision to make is to build openness and Moore’s Law into your cloud strategy.
“For openness, make decisions that keep your options open in a way that allows you to control costs, allow substitutions, and drive competitive rivalry in your suppliers,” he says. “For Moore’s Law, ask yourself, what would I do with twice the compute performance or half the costs? Which projects are stalled today that could provide competitive advantage in 18 months when the performance is ready?”
To learn more about the benefits of open cloud, and how to ensure you nail the three main components for success – the right hardware, software stack, and network choice – don’t miss this VB Live event!
Attendees will learn:
- Why an ecosystem based on open standards can reduce costs and increase innovation
- How to evaluate workloads
- How a cloud based on open standards can easily be constructed
- How open standards apply to hardware and software choices
Speakers:
- Keith Tanski, Chief Technology Officer, Optum
- Rick Villars, Group Vice President, Worldwide Research, IDC
- Steve Watt, Distinguished Software Engineering Manager, Red Hat
- Michael McNerney, Vice President Marketing and Network Security, Supermicro
- Joe Maglitta, Moderator, VentureBeat