Domino Data Lab, a pioneering provider of a machine learning operations (MLOps) platform, is making it easier for data scientists to manage code at a time when DevOps platform providers are starting to treat AI models as just another software artifact to be managed within the context of an application development project.
Version 4.4 of the Domino platform adds a CodeSync capability, integrated with Git repositories, that makes it easier to track every aspect of experimentation, said Nick Elprin, Domino Data Lab CEO. While Domino Data Lab sees data science teams employing Git repositories to manage the artifacts that make up an AI model, the processes for building those models will remain distinct from the DevOps processes developers employ to build applications, Elprin added. “Models are fundamentally different,” he said.
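In broad terms, what a capability like CodeSync automates resembles committing each experiment's code and results to a Git repository so that every run can be traced back to an exact revision. The sketch below illustrates that underlying idea with plain Git commands; the directory name, file names, and metric values are illustrative placeholders, not Domino's actual layout or API.

```shell
# Illustrative only: version an experiment's parameters and results in Git
# so the run is reproducible. All names and values here are hypothetical.
mkdir -p demo-experiment
git -C demo-experiment init -q
echo "learning_rate: 0.01" > demo-experiment/params.yaml
echo "accuracy: 0.92" > demo-experiment/metrics.txt
git -C demo-experiment add params.yaml metrics.txt
# Inline identity config keeps the sketch self-contained on a fresh machine.
git -C demo-experiment -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Experiment run 1: baseline model"
# Each tracked run appears as one commit in the history.
git -C demo-experiment log --oneline
```

The point of tying experiments to commits is that any past result can be checked out and re-run against the exact code and parameters that produced it.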
As part of an effort to simplify that work for MLOps teams, Domino Data Lab has also added a feature called Durable Workspaces that makes it possible to run multiple sandboxed environments simultaneously to improve productivity. Durable Workspaces will also reduce infrastructure costs by enabling data scientists to stop, edit, and resume workspace configurations as required, Elprin said.
Finally, Domino 4.4 adds support for the Transport Layer Security (TLS) protocol to encrypt data in transit, along with the ability to mount external Network File System (NFS) volumes from within the Domino file system.
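For readers unfamiliar with the mechanics, mounting an external NFS volume corresponds to the standard Linux operation sketched below. This is a generic config fragment, not Domino's configuration: the server address, export path, and mount point are placeholders.

```shell
# Illustrative config fragment for a generic Linux host, not Domino itself.
# Placeholder server and paths throughout.

# One-off mount of an NFS export:
sudo mount -t nfs fileserver.example.com:/exports/datasets /mnt/datasets

# Or a persistent /etc/fstab entry (read-only here):
# fileserver.example.com:/exports/datasets  /mnt/datasets  nfs  ro,noatime  0  0

# Verify the volume is mounted and readable:
mount | grep /mnt/datasets
ls /mnt/datasets
```

Exposing such mounts from within a platform's own file system spares data scientists from copying large datasets into the environment before working with them.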
The move to tighten integration with Git repositories such as GitHub and GitLab comes at a time when the providers of those repositories are enabling DevOps and data science teams to build and deploy AI models in a more collaborative fashion. DevOps teams are incorporating AI models into their workflows to accelerate deployments of applications infused with AI capabilities. It’s not clear yet if best practices in DevOps and MLOps will simply converge or whether the tasks currently managed by MLOps platforms will be assumed by continuous integration/continuous delivery (CI/CD) platforms that many organizations already have in place.
Elprin noted that most organizations are already challenged when it comes to hiring both data scientists and DevOps engineers. The odds they will find a DevOps specialist who also knows the intricacies of MLOps are very slim, he added.
One way or another, though, organizations are looking to accelerate the rate at which AI-enabled applications are deployed. Today it’s not uncommon for data science teams to take several months to create an AI model that needs to be deployed in a production environment. The challenge is that many application development teams are deploying and updating applications at a pace that makes it difficult to align the efforts of data science teams with those of application developers. As such, DevOps advocates are now making a case for making DevOps platforms more accessible to data science teams.
It’s still early days for the building of AI models within enterprise IT organizations. Most of the processes are far from mature. At some point, however, there will need to be more integrated processes spanning data science, developers, and IT operations teams as the number of AI models being deployed and updated in production environments continues to steadily increase.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.