Microsoft Azure cloud customers can now use Nvidia’s GPU Cloud for the training and inference of deep learning models.
The Nvidia GPU Cloud provides software containers to accelerate high-performance computing (HPC) and deep learning for researchers and developers. Powered by Nvidia's Volta GPU architecture and its Tensor Cores, the GPU Cloud launched in spring 2017.
The Nvidia container registry supports popular deep learning tools like TensorFlow, Microsoft Cognitive Toolkit, and PyTorch.
Nvidia chips like the Tesla V100 are used in many of the world's largest supercomputers today and, alongside other graphics processing unit (GPU) chips, have been central to the increase in computing power that has enabled deep learning.
Microsoft today also announced general availability of Azure CycleCloud, a tool for managing high-performance computing clusters in Azure, according to a blog post.
In other recent efforts to provide deep learning in the cloud, Microsoft earlier this year launched a preview of Project Brainwave, an Azure service for serving AI models that is powered by Intel Stratix 10 field-programmable gate array (FPGA) chips designed to deliver faster performance than CPUs or GPUs.
High-performance computing is used in a variety of fields today, including health and medical research such as drug discovery, complex simulations for militaries and governments, and financial modeling.