MILPITAS, Calif.—Despite what you may have read, the biggest and most important meta-trend in the IT business isn’t cloud computing. It’s not so-called “digital transformation,” and it’s not big data analytics. Nor is it software-defined anything.
The biggest trend in IT? Convergence—the increasing convergence of new and conventional capabilities in processors, servers, network pipes, personal mobile devices and vehicles. It is also the convergence of factors that has brought us to this point late in the second decade of the millennium: faster chips, faster networks, leaner code, virtually unlimited storage, huge improvements in device performance—we could go on.
It is the convergence of all of this, plus the enduring support and leadership of the open source and venture capital communities, that has underwritten the explosion of creativity we’ve seen in this business over the last few years.
Add another part of that trend, this one specific to the storage sector: the convergence of storage media and memory inside new-generation data centers and network fabrics. This was a key topic at the Storage Visions 2017 conference, held at the Embassy Suites conference center here Oct. 16.
Persistent Memory: It’s What’s on the Agenda
Persistent memory is the prime new physical storage technology at the center of this trend. The first product in Hewlett Packard Enterprise’s Persistent Memory portfolio—the 8GB NVDIMM (non-volatile dual in-line memory module), which came out in 2016—combines the performance of DRAM (dynamic random-access memory) with the persistent, non-volatile capabilities of storage to speed up workloads such as databases, big data analytics and online transaction processing by removing storage bottlenecks. It’s based on the industry-standard NVDIMM-N technology.
The NVDIMMs—which can retain data even when the power goes out—combine 8GB of DRAM with 8GB of NAND flash storage. When power fails, the module copies the contents of the DRAM to the flash, which acts as a backup until power is restored.
The goal of persistent memory is simply to enable businesses to get more value from their data. Early use cases bear this out, according to HPE.
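To get a feel for what the persistent memory programming model looks like to an application, here is a minimal sketch using a memory-mapped file. On real NVDIMM hardware the file would live on a DAX-mounted filesystem so loads and stores reach the media directly; here an ordinary temporary file stands in for illustration, and all names are our own, not HPE’s API.

```python
# Sketch of the persistent-memory programming model via a memory-mapped
# file. An ordinary temp file stands in for an NVDIMM-backed region.
import mmap
import os
import tempfile

SIZE = 4096  # one page of "persistent" space

# Create and size the backing file (stand-in for a pmem device).
fd, path = tempfile.mkstemp()
os.ftruncate(fd, SIZE)

# Map it into the address space: updates are plain memory stores,
# with no read()/write() system calls on the hot path.
with mmap.mmap(fd, SIZE) as pmem:
    pmem[0:5] = b"hello"   # store data with an ordinary memory write
    pmem.flush()           # flush so the data reaches the persistent medium
os.close(fd)

# Reopen after a simulated "power loss": the data survives because it
# lives in the persistent medium, not in volatile DRAM alone.
with open(path, "rb") as f:
    print(f.read(5))       # b'hello'
os.remove(path)
```

The point of the model is the middle section: once the region is mapped, the application updates it with byte-addressable memory operations rather than block I/O, which is what lets persistent memory remove the storage bottlenecks described above.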
Here are some of the key data points and industry insights that emerged from several keynote speakers and panel discussions at the event:
- Solid state memory is enlarging the sector’s vision of the future of storage and memory applications. Persistent memory is now an important architectural approach because it offers flexibility and lower latency, giving applications access to storage directly from the DRAM bus. Like persistent memory, NVMe (Non-Volatile Memory Express) enables fast storage unhindered by legacy HDD (hard disk drive) interface standards. NVMe’s new I/O Determinism feature lets an application better control flash performance.
- NVMe over Fabrics is extending these standards by creating a new level of systems connectivity. DIMM-based flash memory also moves content closer to processing. New computing and network architectures will carry this further, enabling memory-centric processing. Emerging memory technologies are enlarging the tiers of solid state storage for consumer, client and enterprise applications.
- Emerging non-volatile solid-state storage technologies are set to replace or supplement DRAM in many applications. New fabric technologies will enable fast network storage using NVMe devices. Flash memory is moving to more and more 3D layers, with triple-level and quad-level cells (TLC and QLC) capable of reducing the cost of flash memory—and driving its use in more applications. At the same time, HDDs as well as magnetic tape and optical storage are getting faster.
- New digital storage technologies and persistent memory are driving changes in the requirements for storage management and interfaces. Changes in PCIe and memory-bus technology, along with software-defined storage and artificial intelligence, will help manage the full spectrum of digital storage cost-effectively to match the needs of different applications.
- During the entire history of computing, data has resided in storage and memory and has been summoned to the data processing element as needed. Now the industry is discovering that the movement of big data to the processor consumes inordinate power and incurs significant time penalties. Industry experts discussed their current efforts to move compute to the data to save power, accelerate processing speed, and even improve scalability, in order to greatly enhance the cost/performance of tomorrow’s computers.
- Object-based storage is showing up in more and more applications; it’s no longer tied just to archiving. Cloud storage is enabling a new ecosystem of services that promote economic growth, supporting industrial and consumer IoT, connected and autonomous vehicles, and “smart” everything. New hierarchies are being forged for local and remote storage.
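The compute-to-data idea the panelists described can be illustrated with a toy sketch: instead of shipping an entire dataset across a link to filter it centrally, push the filter down to where the data lives and ship only the matches. The function names and figures below are illustrative, not any vendor’s API; a real system would run the predicate on a smart SSD, a storage node or a memory fabric.

```python
# Toy illustration of moving compute to the data ("pushdown").

def ship_then_filter(storage, predicate):
    """Classic model: move all data to the processor, then filter."""
    moved = list(storage)           # the entire dataset crosses the link
    return [x for x in moved if predicate(x)], len(moved)

def filter_then_ship(storage, predicate):
    """Compute-to-data model: filter at the storage side, ship results."""
    shipped = [x for x in storage if predicate(x)]
    return shipped, len(shipped)    # only the matches cross the link

data = list(range(1000))
wanted = lambda x: x % 100 == 0     # a selective predicate

r1, moved1 = ship_then_filter(data, wanted)
r2, moved2 = filter_then_ship(data, wanted)

assert r1 == r2                     # same answer either way
print(moved1, moved2)               # 1000 vs. 10 items moved
```

Both paths compute the same result; the difference is that the second moves two orders of magnitude less data, which is exactly the power and latency saving the industry experts were after.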