Wednesday, March 27, 2019

Memory and Storage Speeds on Optane DIMMs



At Flash Memory Summit, I showed why 3D XPoint was advertised as 1000x faster but ended up only 7x faster. It comes down to the fact that individual IOPS, latency, or sequential read speed may or may not be the limiting factor, so the actual application-level impact is always less than the raw media speedup. The best example comes from the HDD-to-SSD comparisons on the many benchmarking sites: SSDs are technically 500x faster than HDDs… but in most applications your apps run less than 2x faster.
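The gap between media speedup and application speedup is essentially Amdahl's law: only the storage-bound portion of runtime gets faster. A minimal sketch, using illustrative numbers (the 50% storage-bound fraction is an assumption, not a measurement):

```python
def app_speedup(storage_fraction, media_speedup):
    """Overall application speedup when only the storage-bound
    fraction of runtime is accelerated by `media_speedup`."""
    return 1.0 / ((1.0 - storage_fraction) + storage_fraction / media_speedup)

# If storage is 50% of runtime, even a 500x faster medium caps the app near 2x:
print(app_speedup(0.5, 500))    # ~1.996x (the HDD -> SSD case)
print(app_speedup(0.5, 1000))   # ~1.998x -- 1000x media, still under 2x
```

This is why a 1000x faster medium can show up as single-digit gains at the application level.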




Next week, Intel will publicly update us with info on Cascade Lake and Optane DIMMs. The above-mentioned challenge will be a bit of a double-edged sword for Optane DIMMs.

  1. Optane DIMMs used as persistent memory will be 50 times faster than NVMe SSDs based on latency and bus speeds. This sounds great, but we will find most applications do not see that level of benefit. The UCSD team did a great study on this. Whether it is worth the price and time to optimize is up to you and your applications.
  2. On the other hand, Optane DIMMs used in main memory with a DRAM cache (“memory mode”) will provide tons of main memory that is theoretically 7-10x slower than DRAM RDIMMs based on latency… but the actual speed difference will be less than that, and if the data is “cache friendly”, maybe 90% of applications will see no measurable difference in actual use. Lots of memory, lower cost than DRAM, same measurable performance in most cases. Whether it is worth the price and time to optimize is up to you and your applications.


So we will find that in actual use, Optane DIMMs are not 50x faster than SSDs… but they are not 10x slower than DRAM either. We will see what performance actual customers see in the months to come.
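The "cache friendly" point above is just weighted-average latency: if most accesses hit the DRAM cache, the slower Optane tier barely shows. A sketch with assumed latencies (80ns DRAM, 350ns Optane are illustrative placeholders, not Intel specs):

```python
DRAM_NS = 80.0     # assumed DRAM access latency (illustrative)
OPTANE_NS = 350.0  # assumed Optane DIMM access latency (illustrative)

def effective_latency_ns(dram_hit_rate):
    """Average access latency when a fraction of accesses
    hit the DRAM cache in front of the Optane tier."""
    return dram_hit_rate * DRAM_NS + (1.0 - dram_hit_rate) * OPTANE_NS

print(effective_latency_ns(0.95))  # ~93.5 ns: close to pure DRAM
print(effective_latency_ns(0.50))  # 215 ns: the slow tier now dominates
```

With a 95% hit rate the blended latency is within ~20% of DRAM, which is why cache-friendly workloads may see no measurable difference.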

Wednesday, March 20, 2019

Cost and Performance of Optane DIMM Options in Use Today




Now that we have information on the performance of Optane DIMMs and data on recommended configurations from Intel, we are updating a blog from last year on cost and performance.

The reference configuration is 192GB DRAM with a 1TB NVMe SSD.

With Optane, Intel is proposing 128GB DRAM with 1.5TB of Optane operating in memory mode (volatile/not persistent). The purpose is to add lots of cheaper, slower main memory; persistence doesn't matter here.

Cost is about the same as today's configuration. What can we expect from this configuration? In theory…

  • Read latency is 5-7x higher for Optane, but transfer speed (MT/s) is only 3x slower. This is the same effect we see with latency vs. speed in the DDR2/3/4 transitions.
  • Data sets larger than 192GB would run much faster on Optane DIMMs, as no swapping to SSD is required.
  • Datasets between 128 and 192GB would run slower, because you got rid of some of the DRAM and replaced it with Optane. These now run with some of the memory operating at 1/3 the speed.
  • The trade-off is simple: cost is the same. Do you have large datasets taking up more than 192GB of memory? Are you OK with 1/3 the performance when accessing the Optane vs. DRAM?
  • Reminder: in memory mode, the Optane is configured by the controller as volatile memory.

A second configuration is App Direct mode. In this mode, we have true persistent memory. You can use it with load/store commands like memory, or as block-level storage like SSDs. An example is replacing the NVMe SSD with persistent memory. In a simplistic sense, it's like an SSD on the DRAM bus. Example:

  • Replace a 512GB NVMe accelerator SSD with 512GB of Optane persistent memory in App Direct mode.
  • Cost goes from ~$400 for NVMe to ~$1800 for Optane DIMMs.
  • When accessing the Optane DIMM, the speed is 10-50x faster depending on your metric and application.
  • It is persistent, so data is never lost.
  • Much more expensive, but as an accelerator, much faster.
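The cost side of that trade is simple arithmetic on the numbers above (the prices are the approximate figures from this post, not quotes):

```python
# Approximate prices from the example above; actual pricing will vary.
nvme_cost, optane_cost, capacity_gb = 400, 1800, 512

print(f"NVMe:   ${nvme_cost / capacity_gb:.2f}/GB")
print(f"Optane: ${optane_cost / capacity_gb:.2f}/GB")
print(f"Cost ratio: {optane_cost / nvme_cost:.1f}x")   # ~4.5x the cost
```

So roughly 4.5x the cost for 10-50x the speed, if your workload actually exercises the accelerator.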

If you are an IT expert, a great summary of the performance improvement seen from Optane DIMMs is this study from the UCSD NVM team: https://arxiv.org/abs/1903.05714

We have a matrix of many cost/performance options, including NVDIMMs available from other suppliers (Samsung, Netlist, Micron, Viking, Smart Modular, etc.). Call for more info.



Mark Webb