FRANKFURT, Germany—When the vast Airbus A380 superjumbo jet touched its wheels—all 22 of them—to the runway here one March day, it stopped being a creature of the sky and became a creature of the ground. At the same moment, control of the massive jet transferred from the air traffic control system operated by the German government to ground control, run by the Frankfurt Airport using data provided by the airport's data center.
That data center, buried 9 meters below the ground near one of the main runways of Fraport—as the Frankfurt Airport is normally called—controls more than just the movement of airliners on the ground. A unique public/private partnership, the data center controls nearly every aspect of every operation at this sprawling aviation facility, from the screens that display flight information to the movements of the passenger trams.
Because it controls everything, the principle behind the data center's operation is that it must never—for any reason—go down. “The airport would have to close to all but emergency landings if the data center went down,” said Falk Wieland, manager of the data center for Fraport, a role equivalent to a chief information officer in the United States.
He said the data center controls all passenger movement within the airport.
That's no easy task, given the size and growth of one of the largest airports on the continent. In February, the airport handled more than 3.5 million passengers, a 3.8 percent jump over the same month a year earlier. Air freight tonnage increased 4.4 percent, to 157,360 metric tons.
The new data center, which opened in September 2006, enabled Fraport officials to shut down an outdated 30-year-old facility and several other ones housed at the airport, Wieland said.
The new data center, powered in large part by American Power Conversion's InfraStruXure architecture, is a model of redundancy. In addition to the rows of servers, each with redundant power and cooling, there are redundant servers nearby in a separate room—in effect, an entirely redundant data center. Fraport uses a variety of System x and BladeCenter blade servers from IBM and ProLiant DL x86 servers from Hewlett-Packard in the data center, though the most mission-critical tasks run on System p servers from IBM running the vendor's AIX operating system, Wieland said.
Power is provided by three separate power companies from three different power grids. Cooling water comes from redundant chillers that offer 1.4 megawatts of cooling capacity, and just in case something happens to those, the airport maintains a refrigerated lake to provide emergency cooling water.
The redundancy of this data center goes beyond power, cooling and servers. Even the building is constructed to provide redundancy. When the Fraport data center was built, the first thing the designers demanded was another building—built inside the already subterranean bunker it was to occupy. This building-within-a-building means that even if the hardened structure around it is compromised, the data center itself will remain unscathed. While government data centers tend to have this type of assurance, it's unusual for a commercial facility.
The ownership and management of the airport is a complex private/public partnership, which is a common way of doing business in Europe, since it enables both the government and the private entity to defray the costs of doing business. For example, Volkswagen is partly owned by state and national governments.
At the Frankfurt Airport, Gedas Operational Services is jointly owned by T-Systems—which in turn is part of Deutsche Telekom—and Fraport AG. About half of Fraport AG itself is owned by governments—the state of Hesse, where the airport is located, and the German government. The other half of Fraport is owned by private stockholders. Fraport AG maintains control of the data center operations, while Gedas Operational Services runs it, essentially carrying out Fraport AG's directions.
Another unique feature of the Fraport data center is that it's looking for customers outside the airport who will pay to use the data center's capabilities. Wieland said his facility has its own profit-and-loss responsibilities, which means that he can help his bottom line if he can attract companies that want their data treated with the same level of care that the airport receives.
“We're looking for midrange companies,” Wieland said, pointing out that the airport can offer small to midsize businesses access to a high-availability data center and all the benefits that go with it—the sort of benefits SMBs couldn't afford on their own. “A small company can go out of business if they lose their data center.”
The Fraport data center was designed for reliability from the ground up. However, that did not mean the design was years in the making. According to officials with APC, in West Kingston, R.I., the final decision to go with such a highly redundant data center model was made during a 2005 visit to the CeBIT trade show in Hannover, Germany, when Wieland had a chance to inspect APC's InfraStruXure products firsthand. The project was finished less than a year after Fraport officials decided to go with the APC technology. Construction on the first of the data center buildings began last spring, and once the buildings were constructed, it took two to three months to get the data center infrastructure running.
The speed in getting the data center equipment in place “is dramatic for an installation of this size,” said Aaron Davis, chief marketing officer for APC. He said one reason the installation could be done so quickly is that the InfraStruXure data center solution integrates everything—from power and cooling to security and services—with the equipment racks, so everything can simply be rolled into place and connected.
Equally important, Davis said, was that Fraport could build the data center before it had chosen the type of servers to go into it.
“Usually what you find is that people buy the servers and then start thinking about the power,” Davis said. “In this case, they built and powered up the data center before they even chose the servers. The fact that they could wait to choose the servers meant they could get the latest server technology.”
Davis said that one reason Fraport was able to get its data center up and running so quickly is because officials there started from scratch.
“This was a greenfield operation,” he said. “It was a brand new build, so they were able to take a clean-sheet-of-paper approach.”
There also were some internal time pressures, Davis said. “They also had a significant timeline because they had to move very quickly,” he said. “The new CIO was coming on board and was expecting to take on an operational data center.”
He said that many companies would do well—and probably save a significant amount of money in both capital expenses and operational costs—by simply building a new data center rather than trying to upgrade their old one.
Wieland said one reason for going with a new data center was to get support for the high server density he needs. That dictated a move away from massive traditional water cooling systems if he was to have room for the equipment racks in his 1,250-square-meter facility. He said he also needed the latest power management technology because it was getting difficult to buy enough “clean” power on the market at reasonable prices; with the new technology, he can clean up “dirty” commercial power instead.
“A data center is a living object,” Wieland said, explaining why he chose a modular approach with the APC equipment rather than more traditional power, cooling and rack systems. “There's always something different.”