Google today unveiled a slew of updates to its popular TensorFlow machine learning framework to make it useful for a wider variety of developers and give data scientists new ways to get started building AI models.

TensorFlow is one of the most popular programming frameworks developers can use to set up and run machine learning models at scale. It contains useful abstractions for that task, so it’s easier for developers to get their AI code up and running without having to reinvent the wheel. It’s built around the concept of computational graphs, which describe how data flows between mathematical operations.
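The computational-graph idea can be illustrated with a toy sketch in plain Python. This is a conceptual illustration only, not TensorFlow's actual implementation or API:

```python
# Toy computational graph: nodes are operations, edges describe how
# data flows between them. Conceptual sketch, not TensorFlow's API.

class Node:
    def __init__(self, op, *inputs):
        self.op = op          # function this node applies
        self.inputs = inputs  # upstream nodes (the graph's edges)

    def evaluate(self):
        # Recursively evaluate upstream nodes, then apply this op.
        return self.op(*(n.evaluate() for n in self.inputs))

def constant(value):
    # A leaf node that simply produces a value.
    return Node(lambda: value)

# Build the graph first (nothing is computed yet)...
a = constant(2.0)
b = constant(3.0)
total = Node(lambda x, y: x + y, a, b)
scaled = Node(lambda x: x * 10.0, total)

# ...then execute it as a separate step.
print(scaled.evaluate())  # 50.0
```

Describing the computation as a graph up front is what lets a framework optimize, distribute, and re-run it at scale.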

It’s an essential part of Google’s AI strategy because it helps the company’s data scientists build more intelligent features and perform machine learning research. Making it available as an open source project means Google can reap the benefits of others’ contributions while driving the field of AI forward with more broadly available technology.

Google integrated TensorFlow with JavaScript so machine learning tasks can run directly in web browsers. The company also expects to release TensorFlow for Swift next month, providing deep integration between the machine learning framework and the programming language Apple introduced in 2014. TensorFlow Lite, Google's framework for running machine learning models on less powerful hardware, now supports the Raspberry Pi in addition to Android and iOS devices.


Google also announced the TensorFlow Hub, which provides a repository for sharing different pre-built modules developers can reuse across multiple models. Those modules, which are self-contained bits of code, come pre-trained on large datasets but allow developers to retrain them based on particular needs. They’re designed to reduce the complexity of building machine learning systems by serving as building blocks.
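The module idea — a self-contained component that ships pre-trained but can be retrained for a particular need — can be sketched with a toy example in plain Python. The class and method names here are hypothetical illustrations of the concept, not the TensorFlow Hub API:

```python
# Toy sketch of a reusable pre-trained module: it arrives with learned
# state and can be fine-tuned on task-specific data. Conceptual
# illustration only, not the TensorFlow Hub API.

class PretrainedModule:
    def __init__(self, weight):
        self.weight = weight  # stands in for weights learned on a large dataset

    def __call__(self, x):
        return self.weight * x

    def fine_tune(self, inputs, targets, lr=0.01, steps=100):
        # Nudge the pre-trained weight toward a new task's data
        # via simple gradient descent on squared error.
        for _ in range(steps):
            for x, y in zip(inputs, targets):
                grad = 2 * (self.weight * x - y) * x
                self.weight -= lr * grad

# Drop the same pre-trained block into a new model...
module = PretrainedModule(weight=1.0)
# ...and retrain it on task-specific data (here the task is y = 3x).
module.fine_tune([1.0, 2.0, 3.0], [3.0, 6.0, 9.0])
print(round(module.weight, 2))  # converges near 3.0
```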

TensorFlow also received a new graphical debugger that lets developers inspect how the internal nodes of a computational graph behave, making it easier to understand how their models work.

On the more technical side, TensorFlow's eager execution feature will exit beta. Eager execution runs operations immediately as they are called, simplifying what TensorFlow originally treated as two separate steps: defining a computational graph and then executing it.
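The difference between the two styles can be sketched in plain Python. This is a conceptual contrast, not TensorFlow's actual API:

```python
# Conceptual sketch of graph-style (deferred) vs. eager execution.
# Not TensorFlow's API.

# Graph style: building the computation and running it are separate steps.
graph = []                       # recorded operations

def deferred_add(x, y):
    graph.append(lambda: x + y)  # record the op; nothing runs yet
    return len(graph) - 1        # a handle to the op, not its value

handle = deferred_add(2, 3)
result = graph[handle]()         # the separate "run the graph" step
print(result)  # 5

# Eager style: the operation executes immediately when called.
def eager_add(x, y):
    return x + y

print(eager_add(2, 3))  # 5
```

Eager execution trades some of the graph's optimization opportunities for a more natural, line-by-line style of writing and debugging models.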

In addition, a new method Google unveiled today makes it easier for users to run models built with the Estimator APIs across multiple GPUs on a single machine. Finally, the company announced a new TensorFlow Probability API that expands the framework’s support for Bayesian analysis.
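The kind of Bayesian analysis such an API targets can be illustrated with a simple hand-worked example in plain Python — here, a conjugate Beta-binomial update for estimating a coin's bias. This is textbook math for illustration, not the TensorFlow Probability API:

```python
# Toy Bayesian update: estimating a coin's bias from observed flips.
# Hand-worked conjugate math, not the TensorFlow Probability API.

def beta_binomial_update(alpha, beta, heads, tails):
    # A Beta(alpha, beta) prior updated with binomial data stays Beta:
    # posterior = Beta(alpha + heads, beta + tails).
    return alpha + heads, beta + tails

# Start with a uniform prior over the bias, Beta(1, 1)...
alpha, beta = 1.0, 1.0
# ...then observe 7 heads and 3 tails.
alpha, beta = beta_binomial_update(alpha, beta, heads=7, tails=3)

# Posterior mean estimate of the coin's bias.
posterior_mean = alpha / (alpha + beta)
print(posterior_mean)  # 8/12, about 0.667
```

The point of a probability-focused API is to make this style of reasoning — priors, observed data, posterior distributions — a first-class part of building models.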

All of these new features are critical, given TensorFlow’s strategic importance for Google’s business. The tech titan competes against companies like Microsoft, Amazon, and IBM in the cloud realm, and the popularity of TensorFlow could encourage business customers to reach for Google Cloud Platform rather than a competitor’s offering.

That’s partly because the company offers managed services based on TensorFlow, and partly because the framework’s mindshare among developers makes Google’s cloud look better suited to machine learning.

Plus, Google is competing with other tech companies to attract top machine learning talent, and the popularity of TensorFlow can help with that.