Uber today open-sourced Neuropod, an abstraction layer on top of existing machine learning frameworks that provides an interface for developing, training, and deploying AI models. It’s designed to make it easier for researchers to build models in the framework of their choice while simplifying productization, according to the company.
In both industry and academia, it’s not uncommon for data scientists to use multiple frameworks during development. For example, Uber’s Advanced Technologies Group (ATG) built on Google’s TensorFlow before transitioning to Facebook’s PyTorch, and running the two frameworks side by side led to memory corruption and debugging problems. Neuropod aims to prevent such incompatibilities by making frameworks look the same when running a model, with out-of-the-box support for TensorFlow, PyTorch, Keras, and TorchScript.
Neuropod starts with an outline of a problem for models to solve: a canonical description of inputs and outputs including names, data types, and more. This allows it to treat the problem as an interface and abstract away the implementations so that any models that solve the same problem are interchangeable, even if they use different frameworks and programming languages. (For example, if a user wants to run a PyTorch model from C++, Neuropod will spin up a Python interpreter under the hood and communicate with it to run the model.)
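The "problem as interface" idea can be illustrated with a minimal sketch. All names below are hypothetical and for illustration only, not Neuropod's actual API: the point is that once inputs and outputs are canonically named and typed, any backend that satisfies the spec can be swapped in.

```python
# Illustrative sketch of treating a problem spec as an interface.
# These names are hypothetical, not Neuropod's real API.

# A canonical description of the problem: named inputs and outputs
# with data types, independent of any ML framework.
PROBLEM_SPEC = {
    "inputs":  [{"name": "pixels", "dtype": "float32"}],
    "outputs": [{"name": "boxes",  "dtype": "float32"}],
}

class ModelBackend:
    """Any backend solving the same problem is interchangeable."""
    def infer(self, inputs: dict) -> dict:
        raise NotImplementedError

class DoublerBackend(ModelBackend):
    # Stands in for a TensorFlow or PyTorch model hidden behind
    # the framework-agnostic interface.
    def infer(self, inputs):
        return {"boxes": [x * 2.0 for x in inputs["pixels"]]}

def run(backend: ModelBackend, inputs: dict) -> dict:
    # Validate the call against the problem spec, then dispatch to
    # whichever implementation actually runs the model.
    expected = {spec["name"] for spec in PROBLEM_SPEC["inputs"]}
    assert set(inputs) == expected, "inputs must match the problem spec"
    return backend.infer(inputs)

result = run(DoublerBackend(), {"pixels": [1.0, 2.0]})
```

Because the caller only sees the spec and the `run` entry point, replacing `DoublerBackend` with a model in a different framework (or even a different language, via an out-of-process worker) requires no change to application code.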
Neuropod wraps existing models in a package containing the original model along with metadata, test data, and custom operations. Applications interact with framework-agnostic APIs, and Neuropod translates these framework-agnostic calls into calls to the underlying framework. Lastly, Neuropod exports the models and builds a metrics pipeline to compare performance with defined baselines.
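The packaging flow described above can be sketched as follows. This is a hypothetical simplification, not Neuropod's real packaging API: it shows how bundling a model with metadata and test data lets the loader verify the package against a known input/output pair before serving it.

```python
# Hypothetical sketch of wrapping a model with metadata and test data,
# loosely modeled on the packaging flow described above (not the real API).

def package_model(infer_fn, metadata, test_input, test_expected):
    """Bundle a model's entry point with metadata and self-check data."""
    return {
        "infer": infer_fn,
        "metadata": metadata,
        "test_input": test_input,
        "test_expected": test_expected,
    }

def load_model(package):
    # On load, replay the bundled test data so packaging errors are
    # caught before the model is served, regardless of framework.
    out = package["infer"](package["test_input"])
    assert out == package["test_expected"], "packaged model failed self-test"
    return package["infer"]

pkg = package_model(
    infer_fn=lambda inp: {"y": [v + 1.0 for v in inp["x"]]},
    metadata={"framework": "pytorch", "name": "demo"},
    test_input={"x": [1.0]},
    test_expected={"y": [2.0]},
)
infer = load_model(pkg)
```

In a real system the framework-agnostic `infer` call would be translated into TensorFlow, PyTorch, or TorchScript calls under the hood; here a plain function stands in for that translation layer.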
Uber says it has used Neuropod internally over the past year to develop hundreds of models for demand forecasting, estimated time of arrival (ETA) prediction for rides, menu transcription for Uber Eats, and object detection models for self-driving vehicles. Future releases will introduce version control that will let users specify a required version range of a framework when exporting a model; seal operations that enable applications to specify when they’re “done” using model training resources; and a dockerized worker process that provides even more isolation between models.
“As we continue to expand upon Neuropod by enhancing current features and introducing new ones, we look forward to working with the open source community to improve the library,” wrote Uber in a blog post. “Neuropod has been very useful across a variety of deep learning teams at Uber and we hope that it’ll be beneficial to others, too.”