No technology accepted by the scientific community today can accurately forecast underwater earthquakes and the ensuing disasters they can cause. However, supercomputing and storage systems are being put to use to assist in the repair of damage done by the recent catastrophic tsunami in Asia, by processing air- and sea-condition models.
The Fleet Numerical Meteorology and Oceanography Center of the U.S. Navy uses supercomputers and storage technology from Silicon Graphics Inc. To support the Department of Defense's tsunami relief initiatives, the center is running a high-resolution, regional weather-prediction model called COAMPS (Coupled Ocean/Atmosphere Mesoscale Prediction System), which forecasts conditions along the coasts of Indonesia, including Sumatra.
In addition, to help relief planes select optimum flight paths into the region, the center is running a model for aircraft routing.
The weather center's modeling programs rely on observations collected around the world from ships, aircraft, land stations and satellites. On average, more than 6 million observations come into the center each day. From that input, analysts develop approximately 500,000 charts and forecasts of oceanic and atmospheric conditions, which they distribute to the military around the world.
“We process about 1TB of data through our system per day. We really can't afford downtime,” said Mike Clancy, acting technical director at the center, in Monterey, Calif. “We have customers relying on [the delivery of] our products in a timely manner.”
Operating 24 hours a day, 365 days a year, the center generates a weather forecast for the entire globe every 12 hours. In addition to providing data to the military, the center makes much of its information available to the public via the National Weather Service and an agreement with The Weather Channel, as well as through its Web site.
FNMOC began using SGI servers, supercomputers and storage technology in 2001. Today, the network includes two Origin 3800 machines, two Origin 3900s and two 12-processor Origin systems, which are clustered because the center's work is continuous and the output of one job often contributes to the input of the next, Clancy said. The systems are connected through SGI's shared-file system, called InfiniteStorage Shared Filesystem CXFS, which permits data to be passed among operating systems without any replication.
“We have a very complicated operational run, with a number of jobs that run in sequence,” Clancy said. “They're very interdependent.”
Growing Amount of Data
The volume of data is growing as more collection platforms, especially more satellites, are added. The National Polar-Orbiting Environmental Satellite System now in the works will considerably increase data input, Clancy said.
Beyond the rising volume of information, computer modeling is growing more sophisticated, producing higher degrees of resolution and accuracy and demanding greater computing power and storage capacity.
In 2001, the center targeted a sustained processing level of 100 gigaflops, with plans for 100 Gflops in 2003 and 400 Gflops in 2004. “Right now, we have a total capacity of about 4.4 teraflops peak in terms of supercomputing,” Clancy said, adding that the total combines power from IBM computers and SGI computers. “Our projection for the end of the decade is about 18 teraflops.”
To further complicate the tasks of data sharing and storage, the center processes both classified and unclassified information. Data coming from military ships or aircraft—classified because it could give away strategic locations—is delivered into a multilevel security system, SGI's Trusted Irix, where it is tagged and segregated. The center's Web site provides two levels of information—data seen by the public and data seen only by the military.
To manage the many layers of data over time, FNMOC uses a hierarchical storage system that provides transparent access via different technologies, depending on the age of the information. The high-speed Fibre Channel SAN (storage area network) includes high-performance online disk storage and nearline storage in an automated tape silo. The latest information is stored on the online disks—dozens of terabytes at any given time—while older information is moved onto tape. After about 30 days, the data migrates to long-term storage off-site.
It is users such as FNMOC, on the high end of the market, that drove SGI to develop its multitiered storage technology, said Gabriel Broner, senior vice president for the storage and software division at SGI. In addition to the military, companies in the oil, gas and energy sectors demanded ever-more-flexible storage capabilities.
“At the weather center, people tell me they need the last seven days of data on a high-performance Fibre Channel, and they need the last 30 days of data on SATA [Serial ATA],” Broner said.
SGI technology supports everything from NASA's efforts to develop a system to rapidly simulate a space shuttle failure to Hollywood filmmakers' production of digital movies. “What I always wonder is, 'Who's next?'” Broner said. “In every area, we're going to see an explosion of data. Our customers typically double their data needs every nine months.”
One of SGI's latest customers, the Department of Homeland Security, is also using the InfiniteStorage system with SGI Altix supercomputers at the Air Marine Operations Center, in Riverside, Calif. The center receives data from military and civilian radar sources, such as reconnaissance planes, to protect the airspace over the United States and the Caribbean.
The Air Marine Operations Center installed SGI technology nearly 10 years ago but recently upgraded it to handle an increasing volume of data from added radars, said Jim Durrett, assistant director for systems management. The upgrade tripled the radar input capacity and vastly increased the storage capacity, Durrett said, enabling the operation to go “from gigabytes to terabytes.”