Join top executives in San Francisco on July 11-12, to hear how leaders are integrating and optimizing AI investments for success. Learn More

The need to continuously make data available to AI applications in real time is starting to stretch existing data architectures to their breaking point, according to a report published today.

A survey of 106 IT managers and decision-makers in North America responsible for machine learning/AI operations strategy, conducted by Forrester Consulting on behalf of Redis Labs, finds that more than 40% of respondents said the data architectures they have in place do not meet the requirements for training AI models and deploying inference engines that consume data in real time.

In all, 88% of decision-makers said they expect the use cases that require these capabilities to increase in the next year or two. More than a third (38%) are developing roughly a third of their models on the real-time spectrum. Nearly two-thirds of respondents (64%) say their firms are developing between 20% and 39% of their models on real-time data collected via streams generated by connected devices.

It’s becoming apparent that processing data in memory will be required to address latency issues as the amount of data being generated continues to increase, said Taimur Rashid, chief business development officer for Redis Labs. Too many organizations are layering AI models on top of legacy architectures based on external storage systems, he asserted.


Transitioning to data stores that run in-memory would allow IT teams to prepare data more efficiently (49%), improve analytics efficiency (46%), and secure data better (46%), the survey finds. Much of that stems from the fact that AI models rely on semi-structured data that is challenging to process and analyze using legacy storage systems. “There’s an opportunity to modernize,” said Rashid.

Many organizations are already struggling with AI models. Nearly half of decision-makers cite reliability (48%) and performance (44%) as their top challenges for getting models deployed on their existing databases, the survey finds. It also finds 41% of respondents believe their databases cannot meet the necessary data security and compliance requirements.

The survey also finds ensuring model accuracy over time (57%) and struggling with the latency of running the model (51%) top the list of AI inference engine challenges. Almost two-thirds (63%) of model inference engines run on managed clouds. However, survey respondents expect to increase their usage of edge and AI as a service (AIaaS) to run their models.

It’s not clear to what degree data stores are shifting toward being deployed in-memory, but AI models are not the only technology shift that may force the issue. Historically, the Redis in-memory database has been used primarily for caching. However, many organizations embracing microservices are now providing each service with its own in-memory database to ensure resiliency.
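The caching use the article refers to typically follows the cache-aside pattern: check the in-memory store first, and fall back to the slower backing store only on a miss. Below is a minimal sketch of that pattern; a plain Python dict stands in for a Redis client, and the `fetch_from_db` function and key names are hypothetical illustrations, not part of any real API.

```python
import time

class CacheAside:
    """Minimal cache-aside sketch: an in-memory dict stands in for Redis."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get(self, key, loader):
        """Return the cached value, or load and cache it on a miss."""
        entry = self.store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]  # cache hit: served from memory
        value = loader(key)  # cache miss: fall back to the slow store
        self.store[key] = (value, now + self.ttl)
        return value

# Hypothetical slow backing store
def fetch_from_db(key):
    return f"row-for-{key}"

cache = CacheAside(ttl_seconds=30)
first = cache.get("user:42", fetch_from_db)   # miss -> loads from "db"
second = cache.get("user:42", fetch_from_db)  # hit -> served from memory
```

In a real deployment the dict would be replaced by calls to a Redis client, and the time-to-live keeps stale entries from lingering after the backing store changes.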

Redis says its data store has been downloaded 1.5 million times in the past nine years. At its core, it supports structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams. The Redis service includes support for built-in replication, Lua scripting, transactions, and varying levels of persistence.
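To make the "sorted sets with range queries" structure concrete, here is a rough pure-Python sketch of its behavior: members are ordered by a numeric score and can be queried by score range. This is only an illustration of the semantics; the class and member names are made up, and real Redis implements sorted sets very differently (with a skip list plus hash table).

```python
import bisect

class SortedSetSketch:
    """Rough sketch of a Redis-style sorted set: members ordered by score."""

    def __init__(self):
        self.scores = {}   # member -> current score
        self.ordered = []  # sorted list of (score, member) pairs

    def zadd(self, member, score):
        """Add a member with a score, or update its score if present."""
        if member in self.scores:
            self.ordered.remove((self.scores[member], member))
        self.scores[member] = score
        bisect.insort(self.ordered, (score, member))

    def zrangebyscore(self, lo, hi):
        """Return members whose scores fall in [lo, hi], lowest first."""
        left = bisect.bisect_left(self.ordered, (lo, ""))
        return [m for s, m in self.ordered[left:] if s <= hi]

zs = SortedSetSketch()
zs.zadd("sensor-a", 3.0)
zs.zadd("sensor-b", 1.5)
zs.zadd("sensor-c", 2.2)
in_range = zs.zrangebyscore(1.0, 2.5)  # -> ['sensor-b', 'sensor-c']
```

Score-ordered range queries of this kind are what make sorted sets a common fit for leaderboards and time-windowed lookups over streaming data.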

Most recently, Redis Labs raised an additional $100 million round of financing to bring its valuation to above $2 billion. The company now claims to have more than 8,000 paying customers, with a 54% compound annual growth rate in sales for the past three years.

It’s not likely Redis will supplant existing databases anytime soon, given the cost of memory. However, it’s clear that a database running in memory can, for some workloads, obviate the need for a traditional disk-based database. The challenge now is determining not just what type of database to deploy, but also where it can best run, weighing the cost of memory against the performance of disk-based storage systems.
