MemVerge, a San Jose, California-based enterprise infrastructure provider whose high-speed memory and storage platform is optimized for AI and data science workloads, today emerged from stealth with $24.5 million in Series A funding led by Banyan, with participation from Gaorong Capital, Jerusalem Venture Partners, LDV Partners, Lightspeed Venture Partners, and Northern Light Venture Capital. The fresh capital will be used to “significantly” expand engineering, sales, and marketing teams, said MemVerge CEO Charles Fan, and to accelerate product development.
“The transformation of the data center is long overdue,” added Fan, a former EMC and VMware executive who cofounded the company with Caltech colleagues Professor Shuki Bruck and postdoctoral student Yue Li. “By eliminating the boundaries between memory and storage, our breakthrough architecture will power the most demanding AI and data science workloads today and in the future at memory speed — opening up new possibilities for data-intensive computing for the enterprise.”
MemVerge’s secret sauce is memory-converged infrastructure (MCI), a system architecture it claims is one of the first on the market to incorporate Intel’s Optane DC Persistent Memory (PM). Aided by the company’s proprietary distributed memory objects (DMO) tech, it provides a “convergence layer” with “sub-microsecond” response time that delivers up to 10 times the memory size and 10 times the data input/output speed compared with conventional solutions.
Optane DC PM is largely responsible for the impressive performance. It’s the newest product in Intel’s 3D XPoint portfolio, a non-volatile memory technology developed jointly by Intel and Micron Technology. The modules are pin-compatible with DDR4 and come in capacities up to 512GB, allowing servers to pair large pools of persistent memory with smaller amounts of DRAM (for instance, 256GB of DDR4 RAM combined with 1TB of Optane DC PM).
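Because persistent memory is byte-addressable and survives restarts, applications typically use it by memory-mapping a file on a DAX-mounted filesystem and issuing ordinary loads and stores. As an illustrative sketch of that access pattern (using a plain temporary file here as a stand-in, since real Optane DC PM hardware and a DAX mount are assumptions not available in this example):

```python
import mmap
import os
import struct
import tempfile

# A plain file standing in for a DAX-backed persistent-memory region.
# On real Optane DC PM, this file would live on a DAX-mounted filesystem,
# so the stores below would reach the media directly, bypassing the page cache.
path = os.path.join(tempfile.mkdtemp(), "pmem.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # one page of "persistent" memory

fd = os.open(path, os.O_RDWR)
buf = mmap.mmap(fd, 4096)

# Byte-addressable store: write a 64-bit counter directly into the mapping.
struct.pack_into("<Q", buf, 0, 42)
buf.flush()  # roughly analogous to flushing CPU caches to the persistence domain
buf.close()
os.close(fd)

# Reopen after the "restart": the value is still there.
with open(path, "rb") as f:
    value = struct.unpack_from("<Q", f.read(8), 0)[0]
print(value)  # 42
```

This is the core of what makes the restart-time numbers below possible: state persists in place, so a database need not reload terabytes from block storage after a reboot.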
Paired with the latest generation of Xeon Scalable processors, Intel pegs Optane DC PM’s performance at 287,000 operations per second (versus 3,164 operations per second for a conventional DRAM-and-storage combination), with a restart time of only 17 seconds. Furthermore, it says Optane DC PM is up to eight times faster than DRAM-based configurations in the Spark SQL decision support (DS) benchmark (at a 2.6TB data scale) and supports up to nine times more read transactions and 11 times more users per system in Apache Cassandra.
Intel launched a beta for Optane DC PM on October 30. Google, an early partner, recently announced the alpha availability of virtual machines with 7TB of memory using Intel Optane DC PM and said that some of its customers have seen a 12 times improvement in SAP Hana startup times.
MemVerge’s system is available in beta starting today. It’s been used by LinkedIn, JD.com, and Tencent Cloud.