Google Cloud Platform (GCP) today launched Deep Learning Containers, environments optimized for deploying and testing applications and services that use machine learning. Now in beta, GCP Deep Learning Containers work both in the cloud and on-premises, making it possible to develop and prototype in either environment.

Amazon introduced AWS Deep Learning Containers with Docker image support in March.

Google plans for its Deep Learning Containers to “reach parity with all Deep Learning virtual machine types” in the future, according to a blog post sharing the news. The new service includes preconfigured Jupyter and Google Kubernetes Engine (GKE) clusters and launches with machine learning acceleration available from Nvidia GPUs, Intel CPUs, and other hardware. Using Nvidia GPUs with Deep Learning Containers requires nvidia-docker.
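As a rough sketch of what that looks like in practice (the image path and port are assumptions based on GCP’s public container registry, not details confirmed in the announcement), pulling and running a GPU-enabled container might look like:

```shell
# Pull a Deep Learning Container image (image path is an assumption;
# browse gcr.io/deeplearning-platform-release for the available images).
docker pull gcr.io/deeplearning-platform-release/tf-gpu

# Run it with the Nvidia runtime (requires nvidia-docker), exposing
# the container's bundled JupyterLab server locally.
docker run --runtime=nvidia -d -p 8080:8080 \
  gcr.io/deeplearning-platform-release/tf-gpu
```

Because the same image runs under plain Docker locally and on GKE, the environment a data scientist prototypes in is the one that ships.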

Deep Learning Containers also come with access to a number of packages and tools, such as Nvidia’s CUDA, cuDNN, and NCCL.

GCP Deep Learning Containers have launched with support for machine learning frameworks like PyTorch, TensorFlow 2.0, and TensorFlow 1.13.

The new service also works with GCP AI Platform, first introduced by Google at its Cloud Next conference in April, allowing data scientists to collaborate on AI model development.