How HPE Designed The Machine to Handle Challenging Big Data Projects

1 of 12

How HPE Designed The Machine to Handle Challenging Big Data Projects

After 14 years of research and development, Hewlett Packard Enterprise has unveiled The Machine, which it describes as a supercomputer with 160TB of main memory. More precisely, it is a memory-driven high-performance computer, meaning it makes memory, not processor chips, the workhorse of the computing architecture, according to HPE. HPE designed The Machine as a groundbreaking computer to take on the challenging computing tasks of a big data-dominated world. The Machine has the memory and processing power to crunch a volume of data equal to that contained in 160 million books, which will enable it to handle data-intensive tasks such as health record analysis, genomics, weather forecasting, oil and gas exploration and quantum mechanical calculations. Read on to learn more about The Machine's capabilities.
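The 160-million-books figure can be roughly sanity-checked against the prototype's 160TB of memory. Assuming about 1MB of text per book (the per-book size is an assumption for illustration, not a figure from the article):

```python
# Rough check: how many ~1MB books fit in 160 TB of memory?
# The ~1 MB-per-book figure is an assumed average, not from the article.

TERABYTE = 10**12  # bytes (decimal units)
MEGABYTE = 10**6   # bytes

memory = 160 * TERABYTE
book_size = 1 * MEGABYTE

books = memory // book_size
print(f"{books:,} books")  # 160,000,000
```

At that assumed average, 160TB works out to the article's 160 million books.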

2 of 12

Understanding the ‘Data Dilemma’

To understand The Machine, one must first understand what HPE calls the “Data Dilemma.” Processing speed isn’t accelerating quickly enough to match the volume of data being created, it says. Today’s technology relies on 60-year-old basic chip architecture, and computers use 90 percent of their resources just moving data between memory and storage.

3 of 12

What Is Memory-Driven Computing?

The Machine uses memory-driven computing, which gives every processor in the system access to the entire shared pool of memory, rather than tying each processor to its own smaller slice of memory. According to HPE, today's approach creates inefficiencies that throttle data analysis; memory-driven computing aims to produce much faster and more capable machines.

4 of 12

It's Designed to Take On the Biggest of Big Data Analytics Chores

HPE believes today’s computing environment isn’t ready for Big Data. Memory is too far from the processing power, data is too big and machines are not powerful enough. The Machine was designed to be the world’s biggest single-memory system, reliably holding and processing massive amounts of information in as little time as possible.

5 of 12

The Machine Has 160TB of Shared Memory

At the heart of The Machine computer prototype is 160TB of memory shared across 40 physical nodes, more capacity than any other single-memory system ever created. With that memory, The Machine can simultaneously work with a volume of data equal to about 160 million books.

6 of 12

This Is a Linux-Based Technology

The Machine prototype runs a specialized version of Linux. The operating system runs on ThunderX2, Cavium's high-end ARM-based system-on-a-chip. The prototype also includes software programming tools to help developers optimize their solutions for The Machine's persistent memory.

7 of 12

What The Machine’s Future Might Look Like

The Machine prototype has 160TB of memory, but HPE believes that in the not-so-distant future, The Machine's architecture could scale to support 4,096 yottabytes of memory. According to HPE, that's about 250,000 times the size of the current "digital universe" and all its data.
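The 250,000x claim lines up with independent estimates of the digital universe's size. Assuming roughly 16 zettabytes of data worldwide (an estimate IDC published around the time of the announcement; it is an assumption here, not a figure from the article):

```python
# Sanity check on HPE's scaling claim: 4,096 yottabytes of memory
# versus the "digital universe", assumed at ~16 zettabytes (IDC's
# circa-2016 estimate; not a figure stated in the article).

ZETTABYTE = 10**21  # bytes
YOTTABYTE = 10**24  # bytes

machine_ceiling = 4096 * YOTTABYTE      # claimed architectural maximum
digital_universe = 16 * ZETTABYTE       # assumed worldwide data volume

ratio = machine_ceiling // digital_universe
print(f"{ratio:,}x the digital universe")  # 256,000x
```

That works out to 256,000x, consistent with the article's rounded "about 250,000 times" figure.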

8 of 12

Looking to the Community for Help

HPE has said in numerous postings about The Machine that it wants, and perhaps even needs, the help of the broader computer science community. So The Machine is entirely open source and available for the community to tweak and improve as time goes on. It's unclear, however, how many organizations are working on The Machine behind the scenes.

9 of 12

Security and Detection Could Be Improved

HPE says the security community could benefit greatly from The Machine, pointing to the growing number of ways hackers can target companies and the wave of internet of things devices coming online. Because it can cut through and analyze data more quickly, The Machine should do a better job of detecting threats, HPE says.

10 of 12

Some More Ways The Machine Can Help

HPE has described other ways in which The Machine could benefit industries. In air travel, The Machine could one day analyze far more information in real time to limit delays, conserve fuel and help pilots circumvent bad weather. In health care, its ability to crunch so much data could help doctors diagnose and treat problems in a fraction of the time.

11 of 12

Major Questions Remain

While HPE believes The Machine could profoundly impact the world, it is still a prototype and the project is in its infancy. It's unknown how much The Machine will cost, when it will launch and whether any companies have signed on as customers. But The Machine is not a mass-market product; typically, a few such ultra-high-performance machines are sold to U.S. government agencies, university research centers and enterprises with deep pockets and special needs.

12 of 12

China, U.S. Retain Top Spots on Global Supercomputers List

The 48th edition of the Top500 supercomputer list, which is updated twice a year, contains few surprises. As with previous lists, China and the United States continue to lead other technologically advanced countries in building advanced supercomputers; between them, the two countries account for more than two-thirds of the world's fastest machines. They're followed on the list by Germany, Japan and France, which still have a long way to go to catch up to the leaders. It's a similar story in the top 10, where few supercomputers changed positions from June's report. The top three supercomputers listed in June remain in the same spots, while two newcomers broke into the top 10. As in years past, Intel continues to be the dominant chipmaker, and components from Hewlett Packard Enterprise are still used in most systems.