A new survey by Rescale indicates enterprises are quickly combining high-performance computing platforms and cloud best practices to accelerate product development for physical things like jets, cars, chips, and drugs.
The big takeaway is that enterprises are increasingly adopting automation approaches from the world of cloud computing to bring efficiency to high-end computing. This dramatically lowers the barriers to testing new ideas, combining multiple simulation techniques, and identifying and rectifying problems much earlier in the development cycle.
These things have been mainstays in enterprise software development. But their adoption in engineering workflows could accelerate the adoption of techniques like multi-physics simulation, generative design, and multimodal AI for the design of physical products. These improvements could help drive the growth of the HPC market to $55 billion by 2024.
“Pioneering enterprises are actively rethinking the role of supercomputing in a digital era,” Chirag Dekate, VP of analytics for AI infrastructure, digital R&D, and emerging technologies at Gartner, told VentureBeat.
HPC platforms or supercomputers have traditionally operated in a kind of computing niche, outside the workflows used for other kinds of enterprise applications. Older HPC management tools were not well known for their efficiency or flexibility. Leaders are rearchitecting HPC applications for clouds to take advantage of advanced cloud capabilities.
Dekate said the long-standing on-premises-only supercomputing delivery model is increasingly unsustainable. He sees enterprises across industries devising hybrid cloud strategies for supercomputing due to:
- High demand for supercomputing skills, resulting in brain drain from both vendor and end-user organizations.
- Supply chain challenges across on-premises systems vendors.
- Delays in maximizing value capture from the latest technologies.
- Growing risk of analytics islands, in which capex- and opex-intensive extreme-scale analytics capabilities are disconnected from broader enterprise architectures.
Connecting the dots
Cloud architectures lower the barriers to developing complex digital twins that combine multiple simulation techniques, such as finite element analysis (FEA), computational fluid dynamics (CFD), and machine learning, to take advantage of the best price-versus-time-to-solve tradeoff for a given question.
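The price-versus-time-to-solve tradeoff can be made concrete with a toy example: given a deadline, pick the cloud configuration that finishes in time at the lowest cost. The instance names, prices, and scaling numbers below are illustrative assumptions, not real cloud pricing or any vendor's API.

```python
# Hypothetical sketch: choosing a cloud solver configuration that meets a
# deadline at the lowest cost. All figures are made-up for illustration.

from dataclasses import dataclass

@dataclass
class SolverConfig:
    name: str               # illustrative instance type, not a real SKU
    cores: int
    price_per_hour: float   # assumed on-demand price, USD
    base_hours: float       # wall-clock hours of the job on one core

    def runtime_hours(self, efficiency: float = 0.8) -> float:
        # Simple strong-scaling model: parallel efficiency below 1.0
        return self.base_hours / (self.cores * efficiency)

    def cost(self) -> float:
        return self.runtime_hours() * self.price_per_hour

def cheapest_within_deadline(configs, deadline_hours):
    """Return the lowest-cost config that finishes before the deadline."""
    feasible = [c for c in configs if c.runtime_hours() <= deadline_hours]
    return min(feasible, key=SolverConfig.cost, default=None)

configs = [
    SolverConfig("small-16", 16, 1.5, 200.0),
    SolverConfig("medium-64", 64, 6.0, 200.0),
    SolverConfig("large-256", 256, 25.0, 200.0),
]

# With a 6-hour deadline, the small node is too slow, and the medium node
# is cheaper than the large one, so "medium-64" wins.
best = cheapest_within_deadline(configs, deadline_hours=6.0)
```

In practice the runtime model would come from measured scaling data per solver, but the selection logic, filter by deadline, minimize cost, is the same.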
More efficient workflows reduce the time researchers spend on non-research tasks like finding lost files and setting up infrastructure. They also accelerate the pace of innovation and allow organizations to take on broader scientific and engineering challenges. Companies that have adopted cloud processes are more than twice as likely to achieve product goals consistently.
“Letting engineers and scientists have easy access to compute at scale has a measurable impact on project timeliness and success, as well as the scope of their research,” Rescale CPO Edward Hsu told VentureBeat.
This mirrors the innovation the cloud brought to software development a decade ago through tools like CI/CD pipelines and the practice of provisioning infrastructure as code, reliably and repeatably. “If you make it harder for developers to get their code used, innovation velocity drops,” Hsu said.
Powering new business models
The real power of the cloud comes from how everything can be connected. “Just as tools like Google Docs have changed how we collaborate in writing, running all computational engineering workloads from a cloud-based control plane changes what’s possible,” Hsu said. Examples include automating entire computational pipelines, sharing best practices, simplified data access and management, or continuously adding to a surrogate model that can be shared and improved.
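The idea of a continuously improved surrogate model can be sketched in a few lines: fit a cheap approximation to a handful of expensive solver runs, then query the approximation instead of re-running the solver. The "simulation" below is a stand-in function, not a real FEA or CFD code, and the polynomial fit is one simple choice of surrogate among many.

```python
# Hypothetical sketch of a shared surrogate model. Each call to the real
# solver might take hours; the surrogate answers in microseconds.

import numpy as np

def expensive_simulation(x: float) -> float:
    # Placeholder for an FEA/CFD run; the formula is purely illustrative.
    return x**3 - 2.0 * x + 1.0

# A small design-of-experiments sweep: each point is one costly solver run.
xs = np.linspace(-2.0, 2.0, 9)
ys = np.array([expensive_simulation(x) for x in xs])

# Fit the surrogate once; teams can keep adding (x, y) pairs over time
# and refit, which is the "continuously improved" model the text describes.
coeffs = np.polyfit(xs, ys, deg=3)
surrogate = np.poly1d(coeffs)

# Querying the surrogate is effectively free compared to the solver.
pred = surrogate(0.5)
```

A production surrogate would use richer models (Gaussian processes, neural networks) over many design variables, but the workflow, sample the solver, fit, share, refit, is the same shape.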
Down the road, this could change the way enterprises create value around physical products. Hsu predicts that companies that take engineering seriously will use computing to improve product designs in the short run and turn the models themselves into products in the long run. In effect, these models define how the product works and performs.
Traditionally, engineers only run simulations of things companies plan to manufacture. Now they are starting to run simulations before making a design decision. More efficient computing workflows will enable them to run simulations to satisfy their curiosity and explore the bounds of what is possible. These models of what is possible could become the company’s intellectual property and provide a source of competitive advantage.
The products shipped are instances of the latest model optimized to solve a particular objective. A company might say to a customer, “if you change your requirements on this product by this much, we can deliver significantly more value to you.”
“Digital twins will be a core part of this journey towards the model becoming the company’s product, since it’s the way we integrate what we observe in the real world with what we’re modeling in the computing space,” Hsu said.