Red Hat and KPMG LLP today revealed they are working together to make the Red Hat OpenShift platform, which is based on Kubernetes, a foundational core of the KPMG Ignite AI platform.
The KPMG Ignite platform combines machine learning algorithms with document ingestion and optical character recognition capabilities to analyze both structured and unstructured data. Kubernetes has emerged as a preferred foundation for building AI platforms because it makes it simpler to dynamically orchestrate the consumption of IT infrastructure on behalf of containerized applications.
Developers employ containers to build AI models out of modular components that are easier to create and update. In most cases, a monolithic approach to constructing an AI model simply isn’t practical given the volume of data involved.
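The modularity described above can be sketched in a few lines. The example below is purely illustrative, assuming a document pipeline split into an OCR stage and a classification stage; the function names are hypothetical and do not reflect KPMG Ignite's actual APIs. In practice, each stage would be packaged as its own container image so that one component can be rebuilt and redeployed on Kubernetes without touching the others.

```python
# Hypothetical sketch of a modular AI pipeline. Each stage has a narrow
# input/output contract, so each could run as a separate container.

def ocr_stage(document: bytes) -> str:
    """Stand-in for an OCR component that extracts raw text."""
    return document.decode("utf-8", errors="ignore")

def classify_stage(text: str) -> str:
    """Stand-in for an ML component that labels the extracted text."""
    return "invoice" if "total due" in text.lower() else "other"

def pipeline(document: bytes) -> str:
    # Swapping in a new OCR engine or a retrained classifier changes only
    # one stage's container image, not the whole pipeline.
    return classify_stage(ocr_stage(document))

print(pipeline(b"Invoice #42 - Total due: $100"))  # prints "invoice"
```

A monolithic alternative would bundle ingestion, OCR, and every model into one deployable unit, forcing a full rebuild and redeploy for any change.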
The Red Hat OpenShift agreement extends an existing alliance between Red Hat, now a subsidiary of IBM, and KPMG, a systems integrator that often competes with IBM. The ongoing alliance suggests that systems integrators such as KPMG are not walking away from their existing relationships with Red Hat, despite potential conflicts of interest with its parent company.
It’s not clear, meanwhile, to what degree organizations are now relying on external service providers to build and deploy applications infused with AI. However, just about every global IT service provider has launched an AI practice to help organizations overcome a chronic shortage of data science expertise at a time when many of them are engaged in an AI arms race.
A recent survey published by KPMG notes that the COVID-19 pandemic drove increased adoption of AI in the past year as organizations accelerated a wide range of digital business transformation initiatives. The same survey finds that some of those organizations worry that, even though they need to at least keep pace with rivals, they may be deploying AI too far ahead of regulations that are still being debated.
Most applications that are being infused with AI models are being built in the cloud because of the massive amounts of data involved. IT organizations that are building these applications, however, don’t want to get locked into a single cloud platform. It has become apparent that some cloud platforms are simply better optimized than others for running certain classes of workloads, noted Stu Miniman, director of market insights for cloud platforms at Red Hat. The Red Hat OpenShift platform makes it possible for organizations to build and deploy applications as they best see fit on any public cloud or in an on-premises IT environment, Miniman said.
Red Hat and IBM may have been late to the cloud, but as hybrid cloud computing environments continue to evolve, the combined companies are now enjoying a “second mover advantage” at a time when the bulk of enterprise IT applications are still running in on-premises IT environments, said Miniman.
Cloud service providers are especially focused on AI workloads because of the sheer volume of infrastructure resources required to build and sustain them. IT organizations that can at least demonstrate an ability to move workloads will be able to command better cloud pricing terms.
It is, of course, still early days for AI in the enterprise. Many organizations struggle to bring together a diverse range of cultures, spanning data scientists, developers, and IT operations staff. Each of those communities within an IT organization is likely to have its own platform preference. In the case of organizations that decide to employ the expertise of KPMG, however, that platform decision has already been made.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.