How HPE Designed The Machine to Handle Challenging Big Data Projects
After 14 years of research and development, Hewlett Packard Enterprise has unveiled The Machine, which it describes as a supercomputer with 160TB of main memory. More precisely, it's a memory-driven high-performance computer: according to HPE, it makes memory, not processor chips, the workhorse of the computing architecture. HPE built The Machine to take on the challenging computing tasks demanded by a world dominated by big data. The Machine has the memory and processing power to crunch the volume of data contained in 160 million books, which equips it for data-intensive tasks such as health record analysis, genomics, weather forecasting, oil and gas exploration and quantum mechanical calculations. Read on to learn more about The Machine's capabilities.
Understanding the ‘Data Dilemma’
To understand The Machine, one must first understand what HPE calls the “Data Dilemma.” Processing speed isn’t accelerating quickly enough to match the volume of data being created, it says. Today’s technology relies on 60-year-old basic chip architecture, and computers use 90 percent of their resources just moving data between memory and storage.
What Is Memory-Driven Computing?
The Machine uses memory-driven computing, which gives every processor in the system access to a single, vast pool of memory, rather than tethering relatively small amounts of memory to individual processors. According to HPE, today's approach creates inefficiencies and throttles data analysis; memory-driven computing aims to produce much faster and more capable machines.
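A loose, single-process analogy can make the idea concrete. This sketch is illustrative only and is not HPE's architecture: it contrasts a task that copies data into its "own" memory before working on it with one that operates on a zero-copy view of a shared pool, which is the spirit of memory-driven computing.

```python
# Illustrative analogy only -- not HPE's implementation. The bytearray stands
# in for a large shared memory pool; the two tasks show copy vs. zero-copy.
data_pool = bytearray(1_000_000)   # stand-in for the shared memory fabric
data_pool[:3] = bytes([1, 2, 3])   # some data already resident in the pool

def conventional_task(pool: bytearray) -> int:
    # Copies the data into the task's private memory before working on it.
    private_copy = bytes(pool)     # costly full copy
    return sum(private_copy[:10])

def memory_driven_task(pool: bytearray) -> int:
    # Works directly on a zero-copy view of the shared pool.
    view = memoryview(pool)        # no copy made
    return sum(view[:10])

print(conventional_task(data_pool), memory_driven_task(data_pool))  # 6 6
```

Both tasks compute the same answer, but only the first pays for moving the data, which is the overhead HPE says consumes most of a conventional computer's resources.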
It's Designed to Handle the Biggest of Big Data Analytics Chores
HPE believes today’s computing environment isn’t ready for Big Data. Memory is too far from the processing power, data is too big and machines are not powerful enough. The Machine was designed to be the world’s biggest single-memory system, reliably holding and processing massive amounts of information in as little time as possible.
The Machine Has 160TB of Shared Memory
At the heart of The Machine prototype is 160TB of memory spread across 40 physical nodes, more capacity than any other single-memory system ever created. With that memory, The Machine can simultaneously crunch data equal to about 160 million books.
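The headline figures can be checked with back-of-the-envelope arithmetic. The 1MB-per-book size below is an assumption, but at that size 160TB lines up with the 160-million-book figure quoted earlier.

```python
# Back-of-the-envelope numbers for the prototype as described: 160TB of
# memory across 40 nodes. AVG_BOOK_MB is an assumed figure used only to
# show how the ~160-million-book estimate is reached.
TOTAL_MEMORY_TB = 160
NODES = 40
MB_PER_TB = 1_000_000          # decimal (SI) units
AVG_BOOK_MB = 1                # assumed size of one digitized book

per_node_tb = TOTAL_MEMORY_TB / NODES
books = TOTAL_MEMORY_TB * MB_PER_TB // AVG_BOOK_MB

print(per_node_tb)   # 4.0 -> each node contributes 4TB to the pool
print(f"{books:,}")  # 160,000,000
```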
This Is a Linux-Based Technology
The Machine prototype runs a specialized version of Linux. The operating system runs on ThunderX2, a high-end ARM-based system-on-a-chip. The prototype also includes software programming tools to help developers optimize their code for The Machine's persistent memory.
What The Machine’s Future Might Look Like
The Machine prototype has 160TB of memory, but HPE believes that in the not-so-distant future, The Machine's architecture could scale to support 4,096 yottabytes of memory. According to HPE, that's about 250,000 times the size of the current "digital universe" and all its data.
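HPE's scaling claim can be cross-checked with a few lines of arithmetic, assuming decimal (SI) units throughout:

```python
# Sanity-checking the scaling claim: 4,096 yottabytes versus the 160TB
# prototype, plus the "digital universe" size implied by HPE's 250,000x
# figure. All units are decimal (SI).
TB = 10**12
ZB = 10**21
YB = 10**24

prototype_bytes = 160 * TB
future_bytes = 4096 * YB

growth_factor = future_bytes // prototype_bytes
digital_universe_zb = future_bytes / 250_000 / ZB

print(growth_factor)                  # 25600000000000 (about 2.6e13)
print(round(digital_universe_zb, 3))  # 16.384
```

The implied digital-universe size of roughly 16 zettabytes is consistent with industry estimates from the period, which lends the 250,000x figure some plausibility.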
Looking to the Community for Help
HPE has said in numerous postings about The Machine that it wants—and perhaps even needs—the help of the broader computer science community. So, The Machine is entirely open-source and available to the community to tweak and improve as time goes on. It's unclear, however, how many organizations are working behind the scenes on The Machine.
Security and Detection Could Be Improved
HPE says the security community could benefit greatly from The Machine, pointing to the growing number of ways hackers can target companies and the flood of internet-of-things devices coming online. Because it can analyze and cut through data more quickly, The Machine should do a better job of detecting problems, HPE says.
Some More Ways The Machine Can Help
HPE described other ways in which The Machine could benefit industries. It could one day analyze far more information in real time to limit delays in air travel, conserve fuel and help pilots circumvent bad weather. In health care, The Machine's ability to crunch so much data could help doctors diagnose and treat problems in a fraction of the time.
Major Questions Remain
While HPE believes The Machine could profoundly impact the world, it is still a prototype and the project is in its infancy. It's unknown how much The Machine would cost, when it might launch and whether any companies have signed on as customers. But The Machine is not a mass-market product: typically, a few such ultra-high-performance machines are sold to U.S. government agencies, university research centers, and enterprises with deep pockets and special needs.