Google Cloud Platform (GCP) today launched Deep Learning Containers, environments optimized for deploying and testing applications and services that use machine learning. Now in beta, GCP Deep Learning Containers work both in the cloud and on-premises, making it possible to develop or prototype in either environment.

Amazon introduced AWS Deep Learning Containers with Docker image support in March.

Google plans for its Deep Learning Containers to “reach parity with all Deep Learning virtual machine types” in the future, according to a blog post sharing the news. The new service includes preconfigured Jupyter notebooks and Google Kubernetes Engine (GKE) clusters, and launches with machine learning acceleration available from Nvidia GPUs, Intel CPUs, and other hardware. Using Nvidia GPUs with Deep Learning Containers requires nvidia-docker.

Deep Learning Containers also come with access to a number of packages and tools, such as Nvidia’s CUDA, cuDNN, and NCCL.

GCP Deep Learning Containers have launched with support for machine learning frameworks like PyTorch, TensorFlow 2.0, and TensorFlow 1.13.
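As a rough illustration of what working inside one of these containers looks like, the Python sketch below is the sort of sanity check a developer might run to confirm that the bundled framework can see an attached GPU. It is not taken from Google's documentation; it simply assumes a container image with TensorFlow (and optionally PyTorch) preinstalled and launched via nvidia-docker so the CUDA, cuDNN, and NCCL libraries are visible.

```python
# Sanity check inside a Deep Learning Container (assumes the image bundles
# TensorFlow 1.13 or 2.0 and was launched with nvidia-docker so GPU
# libraries such as CUDA and cuDNN are available).
import tensorflow as tf

print("TensorFlow version:", tf.__version__)

# TF 1.13 and TF 2.0 expose different GPU-discovery APIs.
if tf.__version__.startswith("1."):
    print("GPU available:", tf.test.is_gpu_available())
else:
    print("GPUs:", tf.config.experimental.list_physical_devices("GPU"))

# If the image is a PyTorch variant instead, check CUDA support there too.
try:
    import torch
    print("PyTorch:", torch.__version__,
          "| CUDA available:", torch.cuda.is_available())
except ImportError:
    pass  # not a PyTorch image
```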

The new service also works with GCP's AI Platform, a collaborative environment for AI model development that Google first introduced at its Cloud Next conference in April.
