For many years, the scientific and tech communities have been talking about the much-anticipated evolution to exascale computing. Now governments and vendors have spent billions of dollars developing the hardware and software innovations that will make up the infrastructure of exascale systems, which promise to be 50 times faster than some of the most powerful supercomputers currently in operation.
Increasingly complex applications such as big data analytics and machine learning, along with slowing processor performance improvements under Moore's Law, have put a premium on developing new computer architectures that can handle workloads far beyond what most systems can offer today.
The rise of workloads like data analytics and machine learning has added momentum to the effort to develop exascale computing, which is defined as computer systems capable of performing at least one exaFLOPS, equal to a billion billion calculations per second. The first petascale computer was introduced in 2008, and exascale computing would provide a thousand-fold increase over that 10-year-old computer architecture.
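As a quick back-of-the-envelope check on those figures, the arithmetic behind the petascale-to-exascale comparison can be sketched as follows (a minimal illustration, not production code):

```python
# Scales mentioned above, in floating-point operations per second (FLOPS).
PETAFLOPS = 10 ** 15  # one petaFLOPS: a million billion calculations/second
EXAFLOPS = 10 ** 18   # one exaFLOPS: a billion billion calculations/second

# Exascale is a thousand-fold increase over the first petascale systems.
assert EXAFLOPS // PETAFLOPS == 1000

# Calculations performed in a single second at one exaFLOPS:
print(f"{EXAFLOPS:,}")  # 1,000,000,000,000,000,000
```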
Now, after years of discussion, hype and planning, exascale-capable systems are on the horizon. The Chinese government has rapidly increased its investment in exascale computing efforts, which includes three projects that are currently underway. One of those projects involves the development of a prototype called the Tianhe-3, which is scheduled to be ready by next year.
The United States, driven by the year-old Exascale Computing Project (ECP), has plans underway to unveil the first of its exascale-capable systems in 2021, followed by two more two years later. Meanwhile, governments and vendors in the European Union and Japan also have exascale efforts in the works.
While it’s still unclear exactly what these systems will look like or what new technologies they’ll implement, they promise to address the rapid changes underway in high-performance computing (HPC) and enterprise computing.
They will enable more detailed and complex simulations of everything from oil and gas exploration and climate change to drug development and genomics research, and they will make it possible to better store, process and analyze the massive amounts of data being generated by the internet of things (IoT) and the proliferation of mobile devices. Exascale computing is also sure to bring new advances in the fields of machine learning and artificial intelligence (AI).
The drive to exascale is also fueling a high-stakes international competition, particularly between the United States and China. The country that leads the exascale race will have an advantage in everything from military and scientific research and development to business innovation.
Right now, the United States continues to hold an edge in technology innovation, but China is putting a lot of money and manpower into growing its in-house capabilities to drive exascale development. At the same time, there is concern among some in the U.S. HPC community about the impact of possible budget cuts to the Department of Energy (DoE) under the Trump administration. The DoE is the primary sponsor of exascale development in the United States.
However the competition plays out, and whatever the systems end up looking like, there's little question that a shift to exascale computing is needed. The world has lived through a golden age of computing architecture driven in large part by the relentless march of Moore's Law.