Hewlett-Packard officials already are looking to set the tone in cloud computing infrastructure with their small Moonshot microservers, which CEO Meg Whitman has called as important a step in the server industry as the transition from mainframes to Unix and x86 systems.
At the company’s HP Discover 2014 show last week, company officials also announced they were going to be a larger player in the supercomputer space, rolling out a new line of high-performance Apollo systems—including one with a new water-cooled design—that they said will enable HP to better compete with the likes of IBM and Cray.
However, executives for HP and HP Labs at the show also announced an entirely new open-source server architecture that will include the company’s much-touted memristor memory technology, custom processors, silicon photonics and its own operating system, a throwback to the days when HP and others—such as IBM and Sun Microsystems—made their own components and OSes for their systems.
Commodity components—from Intel processors to Windows and Linux—make up much of today’s servers. However, HP has continued to build and improve various components of its own systems—including operating systems such as HP-UX and NonStop—and can bring that expertise to bear to develop what officials are calling the Machine. According to Martin Fink, HP CTO and director of HP Labs, the storage and compute demands of such trends as mobile computing, cloud computing and the Internet of things will continue to grow and will overwhelm the current compute architecture.
“We’re seeing our customers—and the industry as a whole—dealing with a massive onslaught of data,” Fink wrote in a post on the company blog. “This huge and complex amount of data is growing at an exponential rate. We’re all struggling to keep pace today. Toward the end of this decade, data growth will come at us at a rate that surpasses the ability of our current infrastructure to evolve to ingest, store and analyze it. A step change in computing technology is required.”
He and Whitman said a new architecture is needed, and that HP will deliver that architecture within the next few years.
“HP has been talking about the individual component technologies for some time and now we are bringing them together into a single project to make a revolutionary new computer architecture that will be available by the end of the decade,” Whitman said, according to Enterprise Tech. “This changes everything.”
The Machine reportedly will leverage large numbers of special-purpose, application-specific processors to handle the range of workloads that are hitting the systems, and the memristor non-volatile memory technology that the company had been working on for several years (and which was first broached by scientists in 1971) before starting to talk about it at length in 2011. The system will also use high-speed silicon photonics—using light rather than copper wires—to link the compute and memory. HP also is developing the software—including the operating systems—to run the Machine.
HP Makes a Big Bet on the Machine
“[HP] envisions pools of processors and memory chips interconnected with photonic cables, which Fink said will carry data at up to 6TB per second,” Daniel Amor, EMEA (Europe, the Middle East and Africa) lead for application modernization for HP, said in a post on the company blog. “Managing the new architecture will require new operating systems. HP is building a Machine OS from scratch, but it’s also developing a version based on Linux and another with Google’s mobile OS.”
Not only will the Machine be able to address huge amounts of data and massive workloads at much greater speeds than current systems, but it will do this using significantly less power, according to Fink. Company officials reportedly expect the Machine will be six times more powerful than existing systems while consuming 80 percent less power, and that the architecture can be applied to everything from supercomputers and data center servers to PCs and smartphones.
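Taken together, those two reported claims imply a large jump in energy efficiency. A back-of-the-envelope calculation (using only the figures quoted above, not any official HP efficiency number) shows that six times the performance at one-fifth the power works out to roughly a 30x improvement in performance per watt:

```python
# Reported claims for the Machine versus existing systems:
performance_gain = 6.0         # "six times more powerful"
power_fraction = 1.0 - 0.80    # "80 percent less power" -> 20% of current draw

# Performance per watt scales as (relative performance) / (relative power).
perf_per_watt_gain = performance_gain / power_fraction

print(f"~{perf_per_watt_gain:.0f}x performance per watt")
```

This is an illustration of how the two claims combine, not a figure HP itself published.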
“The notion of simply continuing to expand the current data center model isn’t a feasible one,” Fink wrote in his blog. “Today, Big Data means bringing all the data into one place. Tomorrow, some of the data will be too big and too expensive to move. Tomorrow’s analytics will work where the data is created, transforming data locally into intelligence which is then sent to a centralized learning engine powered by The Machine. The Machine not only increases performance, it will also greatly reduce the amount of energy that is needed to achieve those speeds.”
Not everyone was enamored of the Machine. Dell officials at a meeting in San Francisco told reporters that the Machine was a dream of computer scientists that had little basis in reality. John Swainson, president of software at Dell, said that “the notion you can achieve some kind of magical state by re-architecting an operating system is laughable on the face of it.”
In an eWEEK article, Eric Lundquist said that HP’s efforts with its OpenStack-based Helion cloud computing project—which the company introduced in May, along with $1 billion to support it—will be more important to the company in the coming years than the Machine.