SGI to Build Supercomputer for Climate Research Center

The "Cheyenne" system is based on SGI's ICE XA and is the latest example of the growing demand for supercomputing power among weather researchers.

Supercomputer maker SGI will build the next-generation system for the National Center for Atmospheric Research's work on climate change and a range of other atmospheric issues.

SGI officials announced Jan. 11 that the supercomputer that will be used by the center will be based on the vendor's new ICE XA system and will be 2.5 times more powerful and three times more energy efficient than the facility's current system, called "Yellowstone."

The new supercomputer at the National Center for Atmospheric Research (NCAR) will be called "Cheyenne." The system "will be an important tool for researchers across the country to understand climate change, severe weather, air quality and other important atmospheric and geoscience topics," SGI President and CEO Jorge Titinger said in a statement.

Cheyenne, which will be operational next year, is the latest example of growing demand for supercomputing power among weather forecasting and climate research centers, which continue to push to find ways to more quickly and accurately project weather events and gain insights into a rapidly changing global climate. Both SGI and rival Cray have won numerous contracts from weather research facilities for supercomputers over the past several years.

For example, SGI in 2014 announced that NASA's Center for Climate Simulation was buying 1.9 petaflops of Intel-based Rackable clusters to augment the facility's Discover supercomputer. For its part, Cray over the past two years has agreed to supply supercomputing power to such facilities as the National Oceanic and Atmospheric Administration (NOAA), Australian Bureau of Meteorology, Swiss National Supercomputing Centre and MeteoSwiss, and Finnish Meteorological Institute for weather and climate research.

"Improving and expanding weather and climate forecasting activities is a challenging proposition, requiring investment in fundamental scientific research, development of more scalable numerical models, and enhanced supercomputer resources upon which those models can run," Philip Brown, earth sciences segment leader at Cray, wrote in a post on the company blog in September 2015. "But the benefits are clear: In addition to the expansion of scientific knowledge, investment in weather and climate forecasting provides important socio-economic returns."

For NCAR, SGI's ICE XA will aid researchers working to better understand the Earth's atmosphere and geospace systems. Scientists at the center are studying how regions will be affected by rising sea levels and changing storm patterns, precipitation and temperature, and the supercomputer will also help them better predict climate patterns over the next 10 years or more, according to NCAR officials.

In addition, the system will enable researchers to offer improved predictions around severe atmospheric events, such as hour-by-hour risks from thunderstorms.

Cheyenne will run highly complex, data-intensive calculations that governments, businesses and local communities can use to prepare for atmospheric and weather events. The system will be able to process 5.34 petaflops, or 5.34 quadrillion calculations per second, and will include more than 7,000 next-generation Xeon server processors from Intel in a modular design that will include SGI's E-cell warm-water cooling technology. Other features include an enhanced hypercube interconnect based on Mellanox Technologies' EDR InfiniBand and 20 petabytes of storage from DataDirect Networks.

The supercomputer will run SGI's HPC (High-Performance Computing) Software, which includes Performance Suite for parallel application processing and Management Suite for auto-configurability and power, health and remote system management.

SGI officials pitch the ICE XA as well-suited to weather and climate research, given its flexible configurability, system software stack, performance and power efficiency. All of that is important to an industry craving more compute power to improve forecasts and research.

In his blog post, Cray's Brown wrote that researchers want to increase the number of input sources and improve the methods for assimilating data, enhance the resolution of forecast models, bring new physics and chemistry into models, introduce interactions with parts of the Earth system (such as solar radiation and ocean surface), and run many models with similar initial conditions and physics in parallel. They also want to run a greater variety of simulations.

"One common thread with all of these is that they significantly increase the amount of computational effort required to complete a forecast," he wrote. "For example, if the resolution of a forecast is doubled, computational intensity may increase by almost 6x. Because a forecast is only useful if it's delivered within a fixed time window, a higher-resolution forecast must be run across a larger set of resources, challenging the scalability of forecast models.

"Given the clear value of higher-accuracy and more diverse weather and climate forecasts, and the return on investment weather centers around the world are able to demonstrate to their funding agencies and users, I expect the investment in improving and expanding forecasting capabilities to continue over the coming years."
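Brown's point about doubling resolution can be put in rough context with a back-of-envelope scaling sketch. The function below is an illustrative assumption, not Cray's published accounting: it treats cost as growing with the grid points in each horizontal dimension plus a proportionally shorter timestep (the CFL stability argument), which yields 8x for a doubling; Brown's "almost 6x" figure suggests real models recover some of that in practice.

```python
def naive_cost_factor(resolution_factor: float, horizontal_dims: int = 2) -> float:
    """Rough relative compute cost when grid spacing shrinks by `resolution_factor`.

    Assumption (not from the article): cost scales with the number of grid
    points (resolution_factor per horizontal dimension) times one extra
    factor for the shorter timestep required for numerical stability,
    i.e. r ** (d + 1).
    """
    return resolution_factor ** (horizontal_dims + 1)

# Doubling resolution in two horizontal dimensions: 2 * 2 grid points,
# times 2 for the halved timestep, for 8x total work under this naive model.
print(naive_cost_factor(2))
```

Under this sketch a doubled-resolution forecast costs about 8x as much compute, which is consistent in spirit with Brown's "almost 6x" once model-specific efficiencies are allowed for, and it underlines his point that higher-resolution forecasts must spread across more hardware to meet a fixed delivery window.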