    How Intel Optane Helps Unify the Memory-Storage Pool

By Roger Kay - September 7, 2020

Several years ago, a new memory type began to penetrate the market. Intel had been developing the technology, later branded Intel Optane, since 2012. Optane technology delivered advances on a number of fronts, the result being a kind of non-volatile memory that was almost as fast as volatile working memory (dynamic random-access memory, or DRAM) and could also retain data when power was turned off. DRAM stays “lit” with information only as long as electricity courses through it. In most computers, DRAM is where a running program and its working data sit.

      While Optane can’t quite match DRAM in speed, it has the advantage of persistence. That is, it doesn’t need electricity to retain its state. Persistence is often associated with magnetism. Hard disk drives and tapes use a magnetic field to set a location to either a one or a zero. Most solid state drives on the market today use a technology that changes a voltage state to alter a bit’s numerical value. Optane technology can flip bits one at a time by changing their electrical resistance instead, a more efficient process. This bit-addressability—which, like DRAM architecture, allows random access—gives Optane a speed advantage over even today’s SSDs, which require reading and writing in blocks of data.
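To make the block-versus-bit distinction concrete, the read amplification of a block device can be sketched in a few lines of Python: a block-addressable SSD must transfer whole blocks even for tiny requests, while a byte-addressable medium moves only the bytes asked for. The 4 KB block size and the request sizes here are illustrative assumptions, not figures from any datasheet.

```python
# Read amplification sketch: block-addressable vs. byte-addressable media.
# Block size and request sizes are illustrative, not from any datasheet.

BLOCK_SIZE = 4096  # bytes; a common NAND SSD access granularity

def bytes_transferred_block(request_bytes: int) -> int:
    """Bytes a block-addressable device must move to satisfy a request."""
    blocks = -(-request_bytes // BLOCK_SIZE)  # ceiling division
    return blocks * BLOCK_SIZE

def bytes_transferred_byte_addressable(request_bytes: int) -> int:
    """A byte-addressable device moves only the requested bytes."""
    return request_bytes

for req in (64, 512, 5000):
    blk = bytes_transferred_block(req)
    print(f"{req:>5}-byte request: block device moves {blk} bytes "
          f"({blk / req:.1f}x amplification)")
```

A 64-byte read still costs a full 4 KB transfer on the block device, which is the overhead that bit-addressability avoids.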

      All these tricks don’t come for free, and so Optane is priced between faster, volatile DRAM and the slower, persistent NAND modules used in SSDs. In a certain sense, Optane is a hybrid—not so much technologically, but on a feature basis.

      Sits in the system between DRAM and NAND

      From a market perspective, this combination of technical characteristics and economics has allowed Optane to insert itself into the memory/storage pool between DRAM and NAND. It fits nicely on a continuum of price and performance, helping smooth data’s pathway to and from increasingly distant portions of the pool. Optane provides a lower-cost alternative to expensive DRAM, allowing larger memory footprints for the same price, and much-faster-performing SSDs, which can act as fast storage caches in front of slower NAND storage. Thus, Optane fills a gap in the storage continuum between high-speed, expensive DRAM and less-expensive, slower NAND.

The fastest, most expensive memory sits right next to the central processor; the slowest, least expensive sits farthest away.

At the very outer ring is good old reliable, cheap magnetic tape, designed for huge reams of storage that few people want immediate access to. On the next ring in are traditional magnetic hard drives with their rotating mechanical spindles. They are slow, but big and inexpensive: good for long-term storage. Getting data from them is relatively easy if one is not in a hurry. Closer still, traditional NAND-based SSDs, faster and more expensive, can begin to participate in near-real-time analytics. These days, SSDs are freed from the communications constraint of the previous storage interface standard, SATA, which, while fast for its time, had become a bottleneck. Today’s SSDs take advantage of the Non-Volatile Memory Express (NVMe) standard, which is faster than any connection in the system except the processor-memory link.

      Then comes the Optane layer, which is really two layers, depending on the form factor.

SSDs based on Optane, using the fast NVMe channel, exceed what NAND-based SSDs can do. With this level of performance, Optane SSDs can greatly accelerate data access as a fast cache or high-speed storage tier. This capability is especially important for an online transaction processing (OLTP) system, in which access to data sets larger than the memory footprint is needed.

Also worth mentioning at this point is Optane drives’ endurance. With 20 times the life of a high-end enterprise-class NAND SSD, Optane can perform many more read and write operations without wearing out, making it ideal for fast caching, which involves a constant flood of operations. A side benefit of this endurance is the ability to reduce the size of the caching layer, because Optane doesn’t require the degree of over-provisioning necessary with NAND storage.
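The endurance claim can be put in rough numbers. This back-of-the-envelope sketch applies the 20x figure above to an assumed baseline; the DWPD (drive writes per day) rating and drive capacity are illustrative assumptions, not datasheet values.

```python
# Back-of-the-envelope endurance comparison using the 20x figure above.
# The baseline DWPD rating and drive size are assumptions for illustration.

NAND_DWPD = 3                  # assumed enterprise NAND endurance rating
OPTANE_DWPD = NAND_DWPD * 20   # applying the 20x endurance figure

def lifetime_writes_tb(capacity_gb: int, dwpd: int, years: int) -> float:
    """Total data a drive can absorb before wear-out, in terabytes."""
    return capacity_gb * dwpd * 365 * years / 1000

for name, dwpd in (("NAND SSD", NAND_DWPD), ("Optane SSD", OPTANE_DWPD)):
    tb = lifetime_writes_tb(375, dwpd, 5)  # hypothetical 375 GB drive, 5 years
    print(f"{name}: ~{tb:,.0f} TB written over 5 years")
```

The 20x multiplier on write budget is what makes a small Optane tier viable as a cache that absorbs a constant flood of writes.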

      Closer in still, memory modules made with Optane technology can participate in operations even more tightly coupled to the processor. They can do this by way of the memory bus, a dedicated high-speed connection that memory shares with the processor. With this extra scooch of performance, Optane memory can stretch the capacity of DRAM to encompass some of the more challenging data analysis problems, such as large in-memory databases like SAP HANA or Oracle for real-time analytics or artificial intelligence.

      An additional advantage of Optane memory is its persistence. One might ask: Why is persistence interesting if it’s only honored in the breach? That is, as long as the electricity doesn’t fail, why do you need it? After all, DRAM doesn’t have persistence, and many real-time analytical programs run just fine in main memory. The answer is: There’s an additional performance benefit with persistent memory, which is that the system doesn’t have to take the time to offload and save back vital data that must be replicated just in case of power loss. This step can be skipped with non-volatile memory, which will keep the data even if the juice cuts out. Although Optane persistent memory products are relatively new to the market, they have already won an innovation award and set a performance record.
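Persistent memory in this role is typically exposed to applications as a memory-mapped file, so ordinary loads and stores land directly on the media and survive restarts without an explicit save step. The sketch below illustrates the pattern with Python’s mmap module against an ordinary file standing in for a persistent-memory region; the file path is hypothetical.

```python
# Persistent-memory access pattern sketch: map a region, store into it
# directly, flush, and read it back through a fresh handle. An ordinary
# temp file stands in for a real persistent-memory (DAX) region here.
import mmap
import os

def write_persistently(path: str, payload: bytes) -> bytes:
    """Store bytes into a mapped region, flush, then read them back
    through a new file handle to show they persisted past the unmap."""
    with open(path, "wb") as f:
        f.truncate(max(len(payload), mmap.PAGESIZE))  # reserve the region
    with open(path, "r+b") as f:
        with mmap.mmap(f.fileno(), 0) as mm:
            mm[:len(payload)] = payload  # a plain store into the mapping
            mm.flush()                   # push the stores to the media
    with open(path, "rb") as f:          # fresh handle, like a restart
        return f.read(len(payload))

print(write_persistently("/tmp/pmem_demo.bin", b"hello"))
os.remove("/tmp/pmem_demo.bin")
```

On real persistent memory the flush ensures the stores are durable, so no separate offload-and-save pass over the data is needed after a restart.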

      Optane = Low latency

The discussion of Optane would not be complete without a reference to latency. A terrific advantage of Optane is its low latency. If speed measures how fast data moves through a channel, latency refers to how long a request has to wait before receiving data, essentially the startup overhead of a data request. With its bit-addressability, Optane can deliver any size data request with scant delay. This capability is particularly important when many small data requests are made.

      In NAND-based SSDs, which can address data only in blocks, this type of pattern quickly overwhelms the system’s responsiveness. By contrast, Optane SSDs deliver fast, consistent read latency, even under a heavy write load, a predictability associated with higher quality-of-service levels.
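The effect of latency on small requests follows from simple arithmetic: the service time of one request is roughly its fixed latency plus the transfer time, and for small requests the latency term dominates. The device figures below are illustrative assumptions, not measurements of any particular drive.

```python
# Service time for one request ≈ fixed latency + size / bandwidth.
# For small requests the latency term dominates, which is why low latency
# matters more than raw throughput for many-small-request workloads.
# Device numbers are illustrative, not measured.

def request_time_us(size_bytes: int, latency_us: float,
                    bandwidth_gb_s: float) -> float:
    """Approximate service time in microseconds for a single request."""
    transfer_us = size_bytes / (bandwidth_gb_s * 1e3)  # GB/s -> bytes/µs
    return latency_us + transfer_us

# Two hypothetical drives with identical bandwidth but different latency.
for name, lat in (("low-latency drive", 10.0), ("higher-latency drive", 100.0)):
    t = request_time_us(4096, lat, 2.0)  # one 4 KB read at 2 GB/s
    print(f"{name}: {t:.1f} µs per 4 KB request")
```

At 4 KB the transfer itself takes only about 2 µs, so nearly the entire difference between the two drives is the fixed latency term.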

Next inward is the DRAM ring. As noted earlier, DRAM is fast, but expensive and volatile. To some degree, its speed is throttled by the memory bus, which, although quite fast, is not the last word, because there are several more layers, all of them right on the processor die. These are the cache levels, up to three of them, which store temporary results from processor calculations. Relatively speaking, caches are small, superfast, very expensive, and fixed (their sizes are set in processor design and finalized in manufacturing).

Seen another way, the rings of the memory/storage pool can be represented as a pyramid, which gives a notion of the size of each tier. At the bottom is the largest, slowest, least-expensive-per-byte storage. At each level up, quantity decreases while cost and performance rise.
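The pyramid’s tiers can be listed with rough order-of-magnitude access latencies. These ballpark figures are assumptions for comparison only, not vendor specifications.

```python
# The memory/storage pyramid with rough order-of-magnitude access
# latencies in nanoseconds. Illustrative ballpark figures, not specs.
TIER_LATENCY_NS = [
    ("on-die cache",                          1),
    ("DRAM",                                100),
    ("Optane persistent memory",            300),
    ("Optane SSD (NVMe)",                10_000),
    ("NAND SSD (NVMe)",                 100_000),
    ("Hard disk drive",              10_000_000),
    ("Magnetic tape (after mount)", 60_000_000_000),
]

for name, ns in TIER_LATENCY_NS:
    slowdown = ns / TIER_LATENCY_NS[0][1]
    print(f"{name:<30} ~{ns:>15,} ns  ({slowdown:,.0f}x cache latency)")
```

Each step down the pyramid costs roughly one to three orders of magnitude in latency, which is the gap Optane’s two tiers help smooth.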

      With today’s storage elements, data can be moved up and down through the pyramid, depending on the degree to which it is needed for immediate computation. Intel has created tools to help application software engineers manage data location optimally.

      Optane can deliver a big performance boost when used in front of a large array of magnetic storage. One example of an existing application of this type is SAP HANA. According to Intel executives, customers value Optane’s predictable performance, which consistently delivers a high quality of service per transaction.

      In hyperconverged systems, where virtualization, compute, networking and storage subsystems can be configured by software, Optane provides a vital link between faster memory and slower magnetic storage, reducing system bottlenecks and allowing increased virtual machine density.

As mentioned earlier, the applications best able to take advantage of this smooth span of hierarchical memory and storage are in-memory analyses of large databases composed of mixed (structured and unstructured) elements. Today, such implementations are mostly found in the giant cloud service providers and the largest enterprises, which have the scale to obtain the maximum benefit. At some point, the service providers may be able to offer access to smaller customers as a service.

      Most large hardware OEMs are adopting Optane in their converged products. For example, Dell’s highest-end VxRail hyperconverged infrastructure products feature both Optane persistent memory and Optane SSDs.

      While still early days for Optane in the market, its promise is such that proliferation of Optane-enhanced systems is likely. More enterprises of all sizes will seek to distill instant wisdom from large, disparate sources of real-time data, and those unable to create and manage such hyperconverged systems themselves will likely turn to service providers for the capability.

Roger Kay is affiliated with PUND-IT Inc. and is a longtime independent IT analyst.
