
NeuroBlade, a company developing an in-memory inference chip for datacenters and edge devices, today announced that it raised $83 million in a funding round led by Corner Ventures and Intel Capital with participation from StageOne Ventures, Grove Ventures, Marius Nacht, MediaTek, Pegatron, PSMC, UMC, and Marubeni Ventures. The company says that it’ll use the capital to bring its technology to the mass market, specifically by expanding its technical team and building a larger presence worldwide, establishing partnerships, and ramping up production lines.

In-memory computing performs analytics on data held in RAM rather than shuttling it between storage and processors. The technology promises to help enterprises solve the growing challenge of managing — and leveraging — big data. For example, because in-memory computing can improve analytics performance and scale to large quantities of data, it can allow businesses like retailers, autonomous vehicle manufacturers, health care firms, banks, and utilities to detect anomalous patterns and act on them quickly.
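As a generic illustration of the idea (using SQLite's in-memory mode, which is unrelated to NeuroBlade's hardware), an analytics query over a RAM-resident table never touches disk:

```python
import sqlite3

# An in-memory SQLite database keeps the entire working set in RAM,
# so scans and aggregations avoid disk I/O altogether.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 40.0)],
)

# The aggregation runs entirely against RAM-resident data.
total_east = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'east'"
).fetchone()[0]
print(total_east)  # 160.0
conn.close()
```

In-memory chips like NeuroBlade's take this further by moving the computation itself into the memory device, rather than merely keeping the data in RAM for a conventional CPU to fetch.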

Founded in 2017 by Elad Sity and Eliad Hillel, Israel-based NeuroBlade is developing chips that combine both compute and memory into a single hardware block. Sity claims that by integrating the data processing function inside of memory, NeuroBlade’s chips can minimize data movement bottlenecks to “enable new scales of processing in both capacity and speed.”

Above: NeuroBlade's chip architecture. (Image credit: NeuroBlade)

“The concept for NeuroBlade was born out of a game Hillel and I would play. We would sit and tinker with stock trading in our free time,” Sity, who serves as CEO, told VentureBeat via email. “Stock trading takes enormous amounts of data and requires extremely fast processing of that data. So, like any good engineer, we wanted to make that process better. First, we would run models with a CPU and then move them quickly to a GPU-based system. But this was not enough. Current systems didn’t process the data fast enough. So we built a new one.”



In-memory processing

NeuroBlade isn’t without rivals in an in-memory computing market that was estimated to be worth $23.15 billion in 2020. GigaSpaces is also developing in-memory computing solutions for data analytics, AI, and machine learning, and Hazelcast offers speedy in-memory compute services. Another managed AI startup recently raised $72.5 million in funding to further develop its in-memory compute offerings.

But NeuroBlade, whose roughly 100-person team hails from companies including Intel, Marvell, and Texas Instruments, claims it’s the only startup to have commercialized an in-memory processor using a combination of software and hardware tools. Sity says that NeuroBlade’s technology — thousands of parallel processors embedded in DRAM blocks with a wide I/O bus — can speed up data processing and analysis by over 100 times while consuming less energy than existing systems.
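The data-movement argument behind such claims can be sketched with a back-of-the-envelope model. The bandwidth and selectivity figures below are illustrative assumptions, not NeuroBlade's numbers: for a memory-bound scan, runtime is dominated by bytes crossing the bus to the CPU, so filtering where the data lives shrinks that traffic dramatically.

```python
# Toy model of a memory-bound filter scan (all figures are assumptions).
data_gb = 100.0            # dataset size
bus_bw_gbps = 50.0         # assumed bandwidth between memory and CPU, GB/s
selectivity = 0.01         # fraction of rows that survive the filter

# Conventional: every byte crosses the bus to the CPU, then gets filtered.
t_conventional = data_gb / bus_bw_gbps

# Processing-in-memory: the filter runs where the data sits;
# only the surviving rows cross the bus.
t_in_memory = (data_gb * selectivity) / bus_bw_gbps

print(t_conventional, t_in_memory)  # 2.0 0.02
```

Under these assumed numbers the in-memory path is 100x faster, which shows why the achievable speedup tracks how little of the data actually needs to leave the memory device; real workloads with less selective queries would see smaller gains.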

“Amazon has taken the approach of utilizing caching to solve the problem with a proprietary caching layer called Amazon Aqua. However, this is not something that the rest of the market can leverage. Nvidia has made some great advancements with things like Nvidia GPUDirect to remove the latency caused between the GPU and CPU complex. With all that said, we do not know anyone who has specifically built a whole infrastructure stack [50 times faster than DDR5] from silicon to software to solve this problem,” Sity said.

Above: The NeuroBlade software suite. (Image credit: NeuroBlade)

Ahead of a planned production push, NeuroBlade says it has begun shipping early customers its chips, an appliance that pairs those chips with storage, and a software development suite for monitoring, management, debugging, compiling, and programming. One early customer is SAP, which plans to work with NeuroBlade on ways its chips could be deployed in server racks.

“The performance projections and breadth of use cases prove great potential for significantly increased performance improvements … at higher energy efficiency and reduced total cost of ownership on premises and in the cloud,” SAP head of innovation Patrick Jahnke said in a statement. “Through this exciting collaboration with NeuroBlade, SAP will unlock new possibilities to build the datacenter of the future.”

In the coming months, NeuroBlade intends to double its headcount to around 200 people and “enrich [its] product further.” To date, the company has raised $110 million in venture capital.

“[W]e are only scratching the surface of our data analytics acceleration and computational memory technology. Further growth will enable us to bring more of these ideas to life and we look forward to the future,” Sity said. “[A]s the amount of data created, captured, copied, and consumed worldwide grows exponentially, our technology becomes that much more critical … We’ve heard firsthand from specific customers that if the data is too big, they are required to cut away at some of it, thus lowering the accuracy of the results. [M]any customers expressed their excitement about seeing the entire picture without any unnecessary constraints on the collected data.”

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.