In-memory computing startups are hot at the moment, and you need look no further for evidence than Hazelcast, an eight-year-old San Mateo, California-based startup developing a low-latency data processing platform. After raising $21.5 million in June 2019, the company today announced that it has secured $50 million in an oversubscribed series D venture and debt round ($28.5 million in new funding on top of the $21.5 million raised in June), co-led by the European Bank for Reconstruction and Development (EBRD) via its Venture Capital Investment Programme and Deutsche Investitions- und Entwicklungsgesellschaft mbH (DEG), with participation from C5 Capital, Bain Capital Ventures, Earlybird Venture Capital, Capital One Growth Ventures, and Comerica Bank (which provided the growth debt facility).
CEO Kelly Herrell said the capital infusion will be used to expand the company’s footprint and accelerate various edge, enterprise cloud, and machine learning initiatives, including a previously announced coengineering program with Intel and a reselling agreement through IBM.
“The world’s largest and leading enterprises are leveraging in-memory computing to power a new breed of business-critical applications,” said Herrell in a statement. “Whether they are gaining their competitive advantage from real-time fraud detection, payment processing, risk analysis, predictive maintenance, connected cars, or any application to better serve customers, the ultra-low latency delivered by Hazelcast is at the root of business success for these digitally inspired applications.”
Hazelcast — the brainchild of Enes Akar, Fuad Malikov, and Talip Ozturk — says its distributed architecture and high-speed event processing solutions provide redundancy for time-sensitive services like payment processing and fraud detection, complementing existing systems of record, such as databases. To this end, the company's in-memory data grid (IMDG) technology keeps a low-latency data layer in random access memory (RAM) rather than on a hard drive, enabling apps to store up to terabytes of data distributed among nodes so that the failure of any single node doesn't cause data loss. Hazelcast claims that response times for complex transactions are often cut from minutes to sub-milliseconds.
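To make the distribution idea concrete, here is a minimal, purely illustrative sketch of how a data grid can hash entries to partitions and keep a backup copy on a second node, so a single node failure loses nothing. The class, node names, and partition count are all hypothetical; this is not Hazelcast's actual implementation.

```python
# Toy in-memory data grid: keys hash to partitions, each partition has a
# primary node plus one backup node. Illustrative only, not Hazelcast's design.
import zlib

class ToyDataGrid:
    def __init__(self, nodes, partitions=271):
        self.nodes = list(nodes)               # e.g. ["node-a", "node-b", "node-c"]
        self.partitions = partitions           # fixed partition count
        self.store = {n: {} for n in self.nodes}  # each node's in-RAM slice

    def _owners(self, key):
        # Hash the key to a partition, then map that partition to a primary
        # node and the next node in the ring as its backup.
        p = zlib.crc32(key.encode()) % self.partitions
        primary = self.nodes[p % len(self.nodes)]
        backup = self.nodes[(p + 1) % len(self.nodes)]
        return primary, backup

    def put(self, key, value):
        primary, backup = self._owners(key)
        self.store[primary][key] = value       # entry lives in RAM on the primary...
        self.store[backup][key] = value        # ...and is replicated to the backup

    def get(self, key):
        primary, backup = self._owners(key)
        owner = self.store.get(primary)
        if owner is not None and key in owner:
            return owner[key]
        # Primary is gone: serve the read from the backup copy.
        return self.store.get(backup, {}).get(key)

    def fail_node(self, node):
        del self.store[node]                   # simulate losing one node's RAM
```

With three nodes, writing an entry and then failing its primary node still leaves the value readable from the backup, which is the redundancy property described above.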
IMDG confers the added advantage of multi-cluster synchronization across datacenters, plus a streamlined upgrade process that minimizes (or even eliminates) service interruptions. And IMDG’s Hot Restart Store feature allows full recovery to previous configuration states in the event of an unexpected shutdown.
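The idea behind a hot-restart capability can be sketched as periodically snapshotting in-memory state to disk so a restarted process recovers where it left off. The following stdlib-only Python sketch is a conceptual illustration under that assumption; the class name and JSON format are invented here and have nothing to do with Hazelcast's actual on-disk format.

```python
# Hedged sketch of a restart-recoverable in-memory store: state is
# snapshotted to disk atomically, and a fresh process reloads it on startup.
import json
import os
import tempfile

class SnapshotStore:
    def __init__(self, path):
        self.path = path
        self.data = {}
        if os.path.exists(path):
            # Recover the previous state after an (unexpected) shutdown.
            with open(path) as f:
                self.data = json.load(f)

    def put(self, key, value):
        self.data[key] = value

    def snapshot(self):
        # Write atomically: dump to a temp file, then rename it over the old
        # snapshot, so a crash mid-write never corrupts the last good copy.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(self.path) or ".")
        with os.fdopen(fd, "w") as f:
            json.dump(self.data, f)
        os.replace(tmp, self.path)
```

The atomic temp-file-then-rename step is what lets recovery always find a consistent snapshot, even if the process dies mid-write.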
Hazelcast’s ancillary Jet offering is a stream processing framework built on the foundation of IMDG. Concretely, it’s a distributed, application-embeddable computing engine for processing large data sets. Jet’s parallel streaming core lets data-intensive programs operate at near real-time speed, Hazelcast claims, and it provides stream, batch, and RPC processing tools, along with a range of connectors for integrating with data processing pipelines.
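The core stream-processing pattern here, grouping a stream of timestamped events into fixed time windows and aggregating per key (as in real-time fraud detection), can be sketched in a few lines. This is a conceptual illustration only; Jet's real API is Java-based and far richer, and the function and event shapes below are assumptions made for the example.

```python
# Minimal tumbling-window aggregation over an event stream: count events
# per key within fixed-size time windows. Conceptual sketch, not Jet's API.
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Group (timestamp_ms, key) events into fixed windows, count per key,
    and yield (window_start_ms, key, count) tuples in window order."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_ms][key] += 1  # assign event to its window
    for w in sorted(windows):
        for key, count in sorted(windows[w].items()):
            yield (w * window_ms, key, count)
```

For example, two card swipes at 10ms and 20ms plus one at 1,010ms, with a 1-second window, yield a count of 2 for the first window and 1 for the second; a fraud detector would alert when a window's count crosses a threshold.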
There’s also Hazelcast Cloud, a fully managed service that orchestrates, scales, and updates IMDG clusters. It’s supported on multi-region public clouds like Amazon Web Services, Microsoft Azure, Cloud Foundry, OpenShift, and Google Cloud Platform, as well as on-premises and hybrid clouds (plus Docker containers). And it includes a minimum of 200MB of memory and a production environment that plays nicely with programming languages including Java, Node.js, Python, Go, .NET, Scala, Clojure, and C++. Notably, it ships with a custom cloud service provider interface that enables nodes to discover each other automatically.
No matter the product, it’s all orchestrated from within Hazelcast’s Management Center, a web-based user interface that provides an overview of cluster activity and configurable watermarks for alerts.
“Due to the transformative nature of edge computing and machine learning, new in-memory technologies are required, and Hazelcast has a clear and compelling vision that [can be] applied to any enterprise that seeks to retain or advance its leadership,” said EBRD venture capital investment program principal Zoltan Hopka. “Hazelcast and its unified cloud-to-edge platform present powerful development opportunities on a global scale, which is in line with the type of venture opportunities EBRD seeks to support.”
Hazelcast’s customers include 24 of the world’s largest banks, eight of the largest retailers, and six of the largest telecoms, counting JPMorgan Chase, Charter Communications, Ellie Mae, UBS, and National Australia Bank among them. That represents a substantial market opportunity, given that the in-memory computing segment is anticipated to be worth $31.06 billion by 2026. Hazelcast’s competitors include memory-centric distributed storage system startup Alluxio, as well as consolidated storage developer Igneous Systems, Kubernetes-native storage and data management firm Reduxio, and content delivery services provider OnApp.