This article is part of the Technology Insight series, made possible with funding from Intel.
Optane media may get all the buzz, but it’s Optane’s operating modes that unlock its real power, blurring the storage/memory boundary and redefining what can be done in analytics, database, OLTP, and other critical applications.
In Part 1, we covered the what and why of Intel Optane technology and the different implementations of its media into three products: Optane Memory, Optane SSDs, and Optane DC persistent memory modules (DCPMMs). We also touched on how Intel software is a key ingredient in turning 3D XPoint into Intel Optane. We'll round out our 101 with a brief look at configurations and modes.
Why? While it’s true that Optane DCPMMs require 2nd Generation Xeon Scalable processors or later, much of the Optane magic hinges on how software – especially in settings such as virtualized cloud, AI/analytics, and HPC – can make optimal use of Optane when that media is configured into given modes.
Optane Modes
DCPMMs can be configured into three possible modes: Memory Mode, App Direct Mode, and Dual Mode. Note that DCPMMs do not replace DRAM; you still need some DRAM in the system. But how much DCPMM capacity you complement the DRAM with will depend on your mode and specific application needs.
Memory Mode turns DRAM into an L4 cache. The cache is not user-addressable and doesn’t show up in system memory counts, so all user-addressable capacity is the sum of the DCPMM capacities. No programming is needed for Memory Mode, but data held in DCPMMs is volatile, just as with DRAM. The result is a large memory pool behind a modest but effective DRAM cache investment.

App Direct Mode lets the system handle DRAM and DCPMM resources independently, so operations that require top speed can address DRAM while the rest rely on the larger Optane pool. Under App Direct, data in DCPMMs remains persistent, which can be very helpful in minimizing lengthy reload times for large datasets after a power cycle or reset. However, applications must be optimized to take advantage of App Direct. Some already are, and more are coming; some user programming may also be needed.

Dual Mode splits the difference, allowing a portion of DCPMM capacity to operate in Memory Mode while the remainder runs in App Direct.
DCPMM modes, especially the first two, tend to get the most attention for performance reasons. But Optane SSDs have a counterpart: Intel Memory Drive Technology, which works across the CPU’s PCI Express bus. In effect, Memory Drive behaves much like Memory Mode, including its data volatility, only with Optane SSDs serving in place of DCPMMs. Naturally, there’s more latency involved in reaching SSD resources, but the memory pool that becomes possible can be gargantuan. As a way to contain costs on projects that require memory capacity above all else, Memory Drive can be a life-saver.

As mentioned previously, DCPMMs can be configured for either persistent or volatile operation and offer latencies approaching those of DRAM. This suits DCPMMs to workloads such as:
- SAP HANA
- Real-time analytics
- Database and/or database cache
- Memory extension for virtualized and hyperconverged infrastructure
- Live streaming and hot video-on-demand workloads
- Apache Spark
- Major software packages, both open source and not, for data centers (may require optimization for Optane)
- HPC Flex memory
- KVM memory extension
- Spark, k-means clustering
- Cloud service provider (CSP) use cases
- Ceph block/object
- VMware vSAN
- Hadoop YARN temp
- Cisco HyperFlex
- Microsoft SQL
Wrapping up
And that’s about it for the Optane basics. We expect the technology to slowly make more inroads in the client space, but today it remains a clear data center play. Exactly what cost advantages and performance gains Optane implementations can provide will depend on an organization’s specific needs and workloads.
But the ability to bring storage so much closer to the CPU, to the point that it becomes indistinguishable from memory, is compelling and certain to open new opportunities for application developers and enterprises alike.
