While machine learning (ML) can enable all sorts of use cases and automate decisions at scale, moving it to production is complicated. The chief challenges have been long feedback loops and a lack of fast, continuous iteration.
But modern operational ML applications simply demand more. “Many machine learning use cases require the ability to transform streaming data, serve features to the machine learning model and calculate feature values, all on a real-time basis,” said Kevin Petrie, vice president of research for Eckerson Group.
To help facilitate this process for the modern enterprise, real-time data platform Redis and enterprise feature store company Tecton announced a partnership and product integration. The result, according to the two companies, is low-latency, high-volume, highly scalable feature serving for real-time ML at reduced cost.
As noted by Taimur Rashid, Redis chief business development officer, feature stores are the center of modern data architecture, and more organizations are storing features for low-latency serving.
“As more organizations operationalize machine learning for real-time, performance becomes especially important for customer-facing applications and experiences,” he said. He highlighted the combination of Tecton’s data orchestration capabilities with the Redis Enterprise Cloud’s speed and low cost. “Organizations can deliver online predictions and perform complex operations in milliseconds,” he said.
The Redis Enterprise Cloud is a database-as-a-service (DBaaS) available as both hybrid and multicloud. It is built to support Amazon Web Services (AWS), Microsoft Azure and Google Cloud. The in-memory data store allows organizations to process, analyze, predict and take action on data in real time with sub-millisecond latency retrievals for modern online stores.
Competing offerings include DataStax Enterprise, Cloudera Enterprise Data Hub, MarkLogic, Couchbase and Databricks Lakehouse Platform.
MLOps brings new features to data teams
Tecton is an MLOps offering that serves as a shared home for commonly used features. Data teams can build new features for a project and add them to a store, ensuring their reuse. Those features can then be shared across models and use cases without having to rebuild data pipelines.
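To make the reuse idea concrete, here is a minimal in-memory sketch of the feature-store pattern the article describes: a feature is defined once, materialized from raw events, and then served to any model through a low-latency lookup. All class, feature, and entity names are hypothetical illustrations, not Tecton's or Redis's actual APIs.

```python
from collections import defaultdict

class FeatureStore:
    """Toy in-memory feature store: register a feature once, then reuse it
    across models without rebuilding its data pipeline."""

    def __init__(self):
        self._definitions = {}            # feature name -> transform function
        self._online = defaultdict(dict)  # entity id -> {feature name: value}

    def register(self, name, transform):
        """Add a feature definition to the shared store."""
        self._definitions[name] = transform

    def materialize(self, entity_id, raw_event):
        """Compute every registered feature from a raw event and store the
        values for low-latency online lookup at serving time."""
        for name, transform in self._definitions.items():
            self._online[entity_id][name] = transform(raw_event)

    def get_online_features(self, entity_id, names):
        """The low-latency read path a model would hit at prediction time."""
        row = self._online[entity_id]
        return {n: row.get(n) for n in names}

store = FeatureStore()
store.register("txn_amount_usd", lambda e: e["amount"])
store.register("is_foreign", lambda e: e["country"] != "US")

store.materialize("user_42", {"amount": 250.0, "country": "FR"})
features = store.get_online_features("user_42", ["txn_amount_usd", "is_foreign"])
print(features)  # {'txn_amount_usd': 250.0, 'is_foreign': True}
```

In a production system the `_online` dictionary would be replaced by a store such as Redis, which is precisely the role the integration described here gives Redis Enterprise Cloud.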
Feature store repositories are increasingly being leveraged to build AI models more efficiently, and leading players in the space include Molecula, Hopsworks and Splice Machine.
The new Tecton-Redis integration enables Tecton customers to use Redis Enterprise Cloud as an online store. The result is feature serving that is three times faster, at a cost 14 times lower, than Amazon DynamoDB, according to Gaetan (GC) Castelein, Tecton's vice president of marketing.
Organizations can support more demanding ML use cases, such as real-time pricing and inventory tracking or search ranking and recommendations. The new integration can also be applied to real-time fraud protection.
“We’re talking millisecond latencies, but it matters,” said Castelein. “Organizations need to be able to catch fraud in time while providing good customer experiences.”
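To illustrate why those millisecond latencies matter on the fraud path, here is a minimal rule-based scoring sketch: pre-computed features are fetched from an online store and a decision is returned before the transaction completes. The function names, feature names, and thresholds are all hypothetical; a real system would feed the feature vector to a trained model rather than a hand-written rule.

```python
import time

def fetch_features(user_id):
    """Stand-in for an online feature-store lookup; the values below are
    illustrative only."""
    return {
        "txn_count_last_hour": 14,
        "avg_txn_amount_7d": 42.0,
        "current_txn_amount": 900.0,
    }

def score_transaction(user_id):
    """Fetch features and apply a simple risk rule, timing the whole path."""
    start = time.perf_counter()
    f = fetch_features(user_id)
    risky = (
        f["current_txn_amount"] > 10 * f["avg_txn_amount_7d"]  # amount spike
        or f["txn_count_last_hour"] > 30                       # velocity spike
    )
    elapsed_ms = (time.perf_counter() - start) * 1000
    return ("flag" if risky else "approve"), elapsed_ms

decision, latency_ms = score_transaction("user_42")
print(decision)  # flag (900.0 is more than 10x the 7-day average of 42.0)
```

The entire fetch-and-score path has to fit inside the payment flow's latency budget, which is why sub-millisecond feature retrieval is the selling point here.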
As he noted, Tecton customers with latency-sensitive and high-volume use cases had been asking for some time for the option to use Redis Enterprise Cloud for their online stores. Officially providing that option helps make the company’s feature store “more flexible and modular,” he said.
“Tecton and Redis are partnering in order to reduce the time to action for enterprises,” said Petrie. “Tecton helps transform incoming data and calculate feature values, and Redis helps retrieve feature values at ultra-low latency for model serving.”
Castelein also underscored the benefits of speed and efficiency and the ability to provide more integration and choice. Companies can use resultant predictions to power new apps and automate more business processes.
“ML is the future for many enterprises,” he said. “The goal is to make offerings as pluggable as possible, as interoperable as possible. It comes down to giving customers more choices.”