Seven Ways Enterprises Are Using Supercomputers to Solve Global Issues
Supercomputing has changed dramatically since its early days in the 1970s. From megabytes and megaflops to petabytes and petaflops, supercomputers have evolved to keep pace with the exponential growth of data and the constant demand for ever-greater application performance at scale. Yet for all those improvements, supercomputing has always focused on solving today's problems while making it possible to anticipate tomorrow's questions. This eWEEK slide show, using industry information from supercomputing provider Cray, explores seven surprising ways that organizations are using supercomputers to tackle some of the world's most complicated issues.
Reducing Famine in Developing Countries
Computational biologist Dr. Laura Boykin partnered with a research team in East Africa to use supercomputers to ease famine. The team focused on the cassava plant, a crucial food source in a region plagued by infestations of virus-carrying whiteflies capable of wiping out an entire year's crop. The goal was to give farmers new whitefly- and virus-resistant strains of cassava. Boykin's team used a supercomputer's petascale power to make sense of billions of genetic base pairs and expose the species' genetic differences and vulnerabilities. Those calculations enabled the team to reliably distinguish the damaging whitefly species from harmless ones.
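The underlying idea can be illustrated with a toy example. The sketch below is not Boykin's pipeline, which ran full phylogenetic analyses over billions of base pairs; it simply shows how genetic distance can separate species, using invented sequences and an assumed divergence threshold.

```python
# Illustrative sketch only: telling insect species apart by genetic distance.
# The sequences and threshold are made up; real analyses like Boykin's use
# full phylogenetic inference over far longer sequences.
from itertools import combinations

def p_distance(seq_a: str, seq_b: str) -> float:
    """Fraction of differing sites between two aligned sequences."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq_a, seq_b))
    return diffs / len(seq_a)

# Hypothetical aligned marker sequences from sampled whiteflies.
samples = {
    "fly_1": "ATGGCATTACGGA",
    "fly_2": "ATGGCATTACGGA",
    "fly_3": "ATGACTTTACGAA",
}

SPECIES_THRESHOLD = 0.10  # assumed cutoff: >10% divergence suggests different species

for (name_a, seq_a), (name_b, seq_b) in combinations(samples.items(), 2):
    d = p_distance(seq_a, seq_b)
    verdict = "likely different species" if d > SPECIES_THRESHOLD else "likely same species"
    print(f"{name_a} vs {name_b}: distance={d:.2f} -> {verdict}")
```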
Expanding Our Understanding of the Human Brain
Professor Dirk Pleiter at Germany's Jülich Supercomputing Centre is using a supercomputer to simulate and understand the human brain. As part of the Human Brain Project, Pleiter and his team are developing a data-intensive supercomputing infrastructure that helps researchers make advances in neuroscience and brain-related medicine. The Centre's supercomputer, JULIA, addresses the Human Brain Project's vast demand for dense memory integration, scalable visualization and dynamic resource management, all of which are needed to fuel insights that could lead to new treatments for brain diseases and pioneering new technologies.
Exploring New Renewable Energy Sources
Teams from Carnegie Clean Energy and the University of Western Australia have turned to supercomputing to power their research into ocean-wave energy. For the last decade, Carnegie has been developing a wave-energy device that converts ocean swell into zero-emission, renewable power and desalinated water. Supercomputing gives researchers an understanding of what happens in extreme ocean environments and brings them closer to deploying wave-energy technology at large scale.
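To give a sense of how much energy ocean swell carries, the standard deep-water approximation puts the wave energy flux at P = ρg²H²T/(64π) watts per metre of wave crest, where H is the significant wave height and T the wave period. The snippet below is a back-of-envelope sketch of that textbook formula, not Carnegie's simulation code.

```python
# Back-of-envelope sketch (not Carnegie's model): deep-water wave energy flux,
# P = rho * g^2 * H^2 * T / (64 * pi), in watts per metre of wave crest.
import math

RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

def wave_power_per_metre(height_m: float, period_s: float) -> float:
    """Energy flux of deep-water ocean swell, in W per metre of crest width."""
    return RHO_SEAWATER * G**2 * height_m**2 * period_s / (64 * math.pi)

# A 2 m swell with a 10 s period carries roughly 20 kW per metre of crest.
print(f"{wave_power_per_metre(2.0, 10.0) / 1000:.1f} kW/m")
```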
Bolstering Efforts to Treat Memory-Related Diseases
Annette Milnik, a postdoctoral fellow at the University of Basel, is using a supercomputer to help unravel and explain how human memory works. She and her colleagues examined 500,000 genetic variations in conjunction with 400,000 methylation patterns to see how they interact; testing every pairing amounts to roughly 200 billion combinations, a workload only a supercomputer can handle. Methylation is a chemical process that attaches markers to DNA molecules, and the resulting pattern can identify the function of a given cell. That pattern might yield clues for finding memory-relevant genes. Once Milnik and her colleagues locate those genes, they can examine existing drugs to identify which ones could influence memory capacity, opening up a whole new set of treatment options.
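The sketch below illustrates the all-pairs structure of such a scan on a tiny slice of random stand-in data, using a simple correlation test that is far cruder than the study's actual statistics; scaled to the full 500,000-by-400,000 problem, the same structure yields the 200 billion tests noted above.

```python
# Illustrative sketch: the all-pairs scan behind variant-by-methylation studies.
# All data here is random; the test is plain Pearson correlation.
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 200
variants = rng.integers(0, 3, size=(n_subjects, 1000))   # tiny slice of the 500,000
methylation = rng.random(size=(n_subjects, 800))          # tiny slice of the 400,000

# Standardize columns, then correlate every variant against every marker at once.
v = (variants - variants.mean(0)) / variants.std(0)
m = (methylation - methylation.mean(0)) / methylation.std(0)
corr = v.T @ m / n_subjects                               # shape (1000, 800)

print("pairs tested in this toy run:", corr.size)
# Full problem: 500_000 * 400_000 = 2e11 pairs, i.e. 200 billion tests.
```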
Protecting Lives and Livelihoods Through Advanced Weather Forecasting
Given the threat that extreme weather poses to lives and livelihoods, advanced weather forecasts can safeguard us all from otherwise unforeseeable disasters. The UK's national weather service, the Met Office, is estimated to save as many as 74 lives and £260 million a year. These life-saving forecasts, however, demand immense compute power. The Met Office combines more than 10 million daily weather observations, an advanced atmospheric model and a supercomputer to create 3,000 tailored forecasts each day. The weather center can turn out so many forecasts because its supercomputer performs 16,000 trillion calculations a second, or about 16 petaflops.
Moving Science Closer to Curing Cancer
The Joint Design of Advanced Computing Solutions for Cancer (JDACS4C), a multi-governmental program, is using supercomputing labs in the fight against cancer. Professor Rick Stevens, a principal investigator on JDACS4C at Argonne National Laboratory, helps lead three critical “pilot” projects: understanding key protein interactions, predicting drug response and automating patient information extraction to inform treatment strategies. At the core of all three are data and deep learning. Stevens and his team are building a deep-learning model, a neural network that combines everything already gleaned about drug response across all types of cancer. ANL's supercomputer is helping predict how specific patients and tumors will respond to the hundreds of different cancer drugs now available.
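As a rough illustration of what such a model might look like, the sketch below wires tumor gene-expression features and drug descriptors into a small feed-forward network in PyTorch. It is a minimal, hypothetical example, not ANL's actual architecture; every layer size and input here is invented.

```python
# A minimal sketch, not ANL's model: a feed-forward network combining tumor
# features with drug descriptors to predict drug response. Sizes are invented.
import torch
import torch.nn as nn

class DrugResponseNet(nn.Module):
    def __init__(self, n_tumor_features: int = 1000, n_drug_features: int = 128):
        super().__init__()
        self.tumor_encoder = nn.Sequential(nn.Linear(n_tumor_features, 256), nn.ReLU())
        self.drug_encoder = nn.Sequential(nn.Linear(n_drug_features, 64), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(256 + 64, 128), nn.ReLU(),
            nn.Linear(128, 1),  # predicted response, e.g. tumor growth inhibition
        )

    def forward(self, tumor: torch.Tensor, drug: torch.Tensor) -> torch.Tensor:
        joint = torch.cat([self.tumor_encoder(tumor), self.drug_encoder(drug)], dim=1)
        return self.head(joint)

model = DrugResponseNet()
tumor_batch = torch.randn(32, 1000)  # fake gene-expression profiles
drug_batch = torch.randn(32, 128)    # fake drug descriptors
print(model(tumor_batch, drug_batch).shape)  # torch.Size([32, 1])
```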
Providing a Transparent and Nonpartisan Approach to Redistricting
Redrawing legislative boundaries is a process often fraught with political partisanship and gerrymandering. That practice of manipulating district lines to favor a particular political party is exactly what inspired political science professor Wendy K. Tam Cho to look to supercomputing for a solution. Cho's algorithm generates billions of voter district maps to measure the degree of partisanship in any given electoral boundary and bring greater transparency to the redistricting process. Supercomputing gives Cho's algorithmic model the computational power needed to quantify how fairly political power is distributed, and it is ready for a field test in the next round of redistricting in 2020.
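The ensemble idea behind this kind of analysis can be sketched simply: generate many alternative maps, score each with a partisanship metric, and see where the map under scrutiny falls in that distribution. The toy example below uses the well-known efficiency-gap metric and random vote splits in place of real generated maps; it illustrates the general approach, not Cho's algorithm, where the hard part is generating billions of legally valid maps.

```python
# Toy ensemble analysis (not Cho's algorithm): score a district map's
# partisanship, then compare it against many randomly generated maps.
import random

def efficiency_gap(district_votes):
    """Efficiency gap: difference in the two parties' wasted votes as a
    share of all votes. district_votes = [(votes_a, votes_b), ...]"""
    wasted_a = wasted_b = total = 0
    for a, b in district_votes:
        needed = (a + b) // 2 + 1      # votes needed for a majority win
        if a > b:                      # A wins: A wastes its surplus, B wastes all
            wasted_a += a - needed
            wasted_b += b
        else:
            wasted_b += b - needed
            wasted_a += a
        total += a + b
    return (wasted_a - wasted_b) / total

def random_map(n_districts=10, voters=999):
    """Stand-in for a generated map: random two-party splits (odd total avoids ties)."""
    splits = [random.randint(300, 699) for _ in range(n_districts)]
    return [(s, voters - s) for s in splits]

ensemble = [efficiency_gap(random_map()) for _ in range(10_000)]
enacted = efficiency_gap(random_map())   # stand-in for the map under scrutiny
share = sum(g >= enacted for g in ensemble) / len(ensemble)
print(f"share of generated maps at least this partisan: {share:.3f}")
```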