
At Microsoft Connect(); 2018 today, Microsoft launched a bevy of updates across its cloud, artificial intelligence (AI), and development product suites. That’s no exaggeration — the company made its Azure Machine Learning service generally available, enhanced several of its core Azure IoT Edge offerings, and debuted Cloud Native Application Bundles alongside the open source ONNX Runtime inferencing engine.

And that only scratched the surface of today’s small mountain of announcements.

Microsoft also took the wraps off improvements to virtual nodes in Azure Kubernetes Service (AKS) and autoscaling for AKS. And it made JavaScript and Python support available in Azure Functions, its serverless compute service for running code on demand without explicitly provisioning or managing infrastructure, along with a new Azure API Management Consumption tier.

First off, virtual nodes for Azure Kubernetes Service (AKS), which let developers elastically provision quick-starting pods inside Azure Container Instances (ACI), are available in public preview starting today. Customers can switch the feature on within the Azure portal, where the pods share a virtual network with other resources.


Also launching today: the aforementioned cluster autoscaling for AKS. It’s based on the upstream Kubernetes cluster autoscaler project and automatically adds and removes nodes to meet workload needs (subject to minimum and maximum thresholds, of course). It works in conjunction with horizontal pod autoscaling and enters public preview this week.
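As an illustrative simplification of the clamped scale-up behavior described above (not the autoscaler's actual algorithm, which simulates scheduling against each pod's resource requests rather than using a flat per-node capacity), the decision might be sketched as:

```python
import math

def desired_nodes(current, pending_pods, pods_per_node, min_nodes, max_nodes):
    """Pick a node count that fits the pending pods, clamped to the
    configured minimum and maximum thresholds. The flat pods_per_node
    capacity is a hypothetical stand-in for real bin-packing by CPU
    and memory requests."""
    extra = math.ceil(pending_pods / pods_per_node) if pending_pods else 0
    return max(min_nodes, min(max_nodes, current + extra))

print(desired_nodes(3, 10, 4, 1, 5))  # → 5: capped by the max threshold
print(desired_nodes(3, 0, 4, 1, 5))   # → 3: nothing pending, no change
```

Horizontal pod autoscaling then works the other direction of the same loop: it adjusts replica counts, and the cluster autoscaler adjusts the nodes those replicas need.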

Dovetailing with those announcements, Microsoft debuted graphics processing unit (GPU) support in ACI, which lets developers run demanding jobs, such as AI model training, containerized in accelerated virtual machines. And the Redmond, Washington-based company detailed the Azure Serverless Community Library, an open source set of prebuilt components for common use cases, such as resizing images in Blob Storage, reading license plate numbers with OpenALPR, and conducting raffles, all deployable to Azure subscriptions with minimal configuration.

GPU support in ACI launches in preview today, and the Azure Serverless Community Library is available on GitHub and the Serverless Library website.

Rounding out today’s news, Microsoft took the wraps off a new consumption plan for Linux-based Azure Functions, which was teased at Microsoft Ignite earlier this year. It’s now possible to deploy Functions built on Linux using the pay-per-execution model, bringing serverless architectures to developers with existing Linux code assets or prebuilt containers.

Finally, Microsoft launched a consumption-based usage plan (the Consumption tier) for Azure API Management (APIM), its turnkey solution for publishing APIs to external and internal customers, in public preview. Effectively, the new tier allows APIM to be used in a serverless fashion, with instant provisioning, automated scaling, and pay-per-call pricing.

Along with the new APIM plan is the general availability of Python support (specifically Python 3.6) on Azure Functions runtime 2.0, and JavaScript support for Durable Functions, an extension of Azure Functions and Azure WebJobs that manages state, checkpoints, and restarts in serverless environments.
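For a flavor of the Python programming model, an HTTP-triggered function's entry point is a `main` function that receives a request and returns a response. The sketch below substitutes minimal stand-in classes for the real `azure.functions.HttpRequest` and `HttpResponse` types, and omits the `function.json` binding configuration, so it can run on its own:

```python
# Sketch of an HTTP-triggered Azure Function in the Python programming
# model. In a real Function App the request/response types come from
# the azure-functions package; simplified stand-ins are defined here
# so the example is self-contained.

class HttpRequest:
    """Stand-in for azure.functions.HttpRequest."""
    def __init__(self, params=None):
        self.params = params or {}  # query-string parameters

class HttpResponse:
    """Stand-in for azure.functions.HttpResponse."""
    def __init__(self, body, status_code=200):
        self.body = body
        self.status_code = status_code

def main(req: HttpRequest) -> HttpResponse:
    # The Functions runtime invokes main() with the bound HTTP request.
    name = req.params.get("name")
    if name:
        return HttpResponse("Hello, {}!".format(name))
    return HttpResponse("Pass a 'name' query parameter.", status_code=400)

print(main(HttpRequest({"name": "Azure"})).body)  # → Hello, Azure!
```

In a deployed Function App, the consumption plan means this handler is only billed when the runtime actually invokes it.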
