Microsoft’s Azure public cloud service is introducing new Kubernetes-friendly services for developers today, including Kubernetes Event-Driven Autoscaling (KEDA), an open source project created with Red Hat. KEDA provides automated, event-driven scaling based on an organization’s needs in any public cloud, private cloud, or on-premises environment.
Also new today: Azure IoT Edge integration with Kubernetes clusters, and general availability of Azure Kubernetes Service (AKS) virtual nodes to handle rapid scaling and spikes in demand. The feature, powered by the Virtual Kubelet open source project, was first previewed last year.
Azure Dev Spaces for teams to build and debug apps reached general availability today after a preview release last summer.
The Kubernetes system for managing containerized apps and services was first released by Google in 2014 and is now managed by the Cloud Native Computing Foundation (CNCF).
With initiatives introduced today at its annual Build developer conference in Seattle, Microsoft is continuing support for the open source project.
Organizations that are embracing AI and need the flexibility to deploy or run applications in private cloud, public cloud, or on-premises environments are increasingly interested in Kubernetes container orchestration and serverless computing.
A 2018 CNCF survey of more than 5,000 organizations found that 58% currently use Kubernetes in production. Container orchestration services like Amazon’s ECS, Docker Swarm, and Google Kubernetes Engine are also growing in adoption.
The news follows the introduction of four new Kubernetes-based AI services last week, as well as new AI offerings from Azure Cognitive Services and Azure Machine Learning and an inference speed upgrade for ONNX, the open source project for interoperability between AI frameworks and hardware.