Like people, jobs and most other things on Earth, data centers evolve. And, as they are modernized, some things are improved more than others.
In an IT era of ever-more-powerful data servers and storage machines, Georgia State University's cutting-edge data center is a real-world example of change for the better.
Here is a data center that has improved its output and service levels, yet has cut the amount of electricity it draws from the utility grid by as much as 80 percent.
In fact, the Atlanta institution, which opened its first computer center in 1968 with three Unisys mainframes, has evolved its central data center into one that can almost stand on its own feet, power-wise. In the most fundamental terms, Georgia State has acquired and installed enough self-generating power to meet all the university's daily computing requirements.
Only about 20 kilowatts of the center's power comes “out of the wall,” and that is used to run three unprotected, nonproduction racks of servers designated for testing and other special projects, said GSU Technology Operations Center Manager Jerry Allen.
GSU's main data center servers are lit up by three powerful UPSes (uninterruptible power supplies): one that runs on old-fashioned kinetic energy supplied by a large flywheel, and two cogeneration units that turn heated, compressed air into additional electricity through a turbine. Think of a grandfather clock's pendulum: It's the same general idea, with the 800-pound spinning flywheels from Active Power needing little maintenance and requiring only that their ball bearings be replaced once every four years.
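For a rough sense of what a flywheel UPS stores, the standard rotational-kinetic-energy formula is enough. The sketch below is purely illustrative: the rotor mass comes from the 800-pound figure above, but the radius, speed and load are assumptions for the example, not Active Power specifications.

```python
import math

# Rough, illustrative flywheel ride-through estimate.
# Only the rotor weight comes from the article; the radius,
# speed and load below are assumptions for the sketch.
mass_kg = 360          # ~800-lb rotor, as cited above
radius_m = 0.3         # assumed rotor radius
rpm = 7700             # assumed operating speed

# Moment of inertia of a solid disk: I = 1/2 * m * r^2
inertia = 0.5 * mass_kg * radius_m ** 2
omega = rpm * 2 * math.pi / 60          # angular speed in rad/s

# Stored kinetic energy: E = 1/2 * I * w^2, in joules
energy_j = 0.5 * inertia * omega ** 2

load_kw = 80                            # assumed protected load
ride_through_s = energy_j / (load_kw * 1000)
print(f"Stored energy: {energy_j / 1000:.0f} kJ")
print(f"Ride-through at {load_kw} kW: {ride_through_s:.1f} s")
```

In practice only the energy above the flywheel's minimum usable speed can be drawn, so real ride-through is shorter than this idealized figure; the point is that even the few-second bridge described later in this article is comfortably within reach.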
And this is not a small data center. Some numbers put it in perspective:
- Sixteen full-time employees man the center. The staff does all its own racking, stacking and cabling; the only thing the center contracts for is electrical work.
- The data center operates 24/7, 365 days a year, and is staffed even on Christmas Day.
- The center is responsible for network monitoring of all end nodes, which include all wireless access points on campus, all networked devices, all host servers, all UPSes, the school's Avaya telephone system and an additional VOIP (voice over IP) system.
- With 5,500 square feet of floor space, the center holds 95 racks and 550 host servers.
The center runs about 30 business-critical applications, including Banner, the university's online registration software, and Solar, the student database.
“[Solar] has all of [the students] grades, their financials, all their registration info, the course catalog, faculty information—all kinds of good stuff,” Allen said.
The center also hosts a 10-node IBM P575 clustered supercomputer, with four or eight CPUs on each node. The supercomputer is being used for research projects such as SURAgrid, the Southeastern Universities Research Association grid. Twenty-seven of these clustered systems are networked among a group of research institutions in the Southeast, linked into a grid so they can share a massive amount of computing power when needed.
GSU uses Spectrum as its financial system, PeopleSoft's human resources application, a Novell backbone for the school's e-mail services and a number of other applications.
“On the servers, we're running Windows, Solaris 8 and 10, Unix, GroupWise for e-mail and StorageLocker,” Allen said. StorageLocker allows students and staff members to store a predesignated amount of data on the network storage machines.
The center even provides access to the Georgia Public Library System, Allen said, so that students can borrow books from other libraries if the GSU library doesn't have the right book at the right time.
“This is the heartbeat of the university,” Allen said. “This is where it all happens. We monitor access to the Internet; we monitor what's coming in and what's going out.”
Modest beginnings
The GSU data center, situated on the first floor of its downtown location, was never intended to be the computing center it now is. Its location was lent to the IT department by the campus library, which had been using the space as a book repository.
To help power the systems and applications it hosts, the center once relied heavily on batteries—big batteries.
“We had a whole bunch of APC Symmetra [batteries] all over the data center, and we had two, three or four racks connected to each Symmetra,” said Allen. “There was no such thing as an EPO [emergency power off] button because when you hit the EPO, all it did was make all the UPSes come on.”
Staff members also were storing batteries, which carries risk.
“We needed to get out of the battery business,” Allen said. At the same time, the university was growing and its computing requirements were increasing. Allen and Assistant Data Center Manager Melissa Lamoureux saw that they needed more power capacity.
“I started looking around, and I found the Active Power CoolAir [compressed-air] UPS—no batteries involved, no storage, no maintenance, a lot of cost savings. You could get it on an annual usage plan. You didn't have to buy it. So we got a 100-kVA, 80-kW unit. We put it in about 18 months ago,” Allen said.
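The two numbers in that rating are consistent with each other: a UPS's real-power (kW) capacity is its apparent-power (kVA) capacity multiplied by the rated power factor, and 80 kW over 100 kVA implies a factor of 0.8. A quick check:

```python
# kW = kVA * power factor. The CoolAir unit Allen describes is
# rated 100 kVA / 80 kW, which implies a 0.8 power factor.
kva = 100
power_factor = 0.8
print(kva * power_factor)  # 80.0 kW of real power
```

The second unit mentioned later in this article (50 kVA, 40 kW) follows the same 0.8 ratio.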
This single compressed-air UPS runs the university's research computing grid system and another campus-wide portal project, called Luminous, that takes up about four racks of equipment. GSU is getting ready to put Luminous into production, Allen said.
The compressed-air UPS, which uses an air turbine, releases compressed air into the server rooms at 55 degrees Fahrenheit, so it actually cools the space while it's creating energy to run the system, Allen said.
“It's old technology. It uses a small flywheel to keep you going for the 2 or 3 seconds that you need after a power failure happens, until the compressed-air turbine takes over and starts running,” Allen said.
“That air is superheated—to 734 degrees—so that when it comes out the other end, there is no condensation involved, so it's dried. It's cooled to 55 degrees when it comes out the other end. It goes through the turbine, powers the turbine, sends over the AC current, and it's converted over to DC by a big conversion box—which all UPSes have anyway—then down to the PDUs [power distribution units], then out to the racks, and it powers your data center.”
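Putting Allen's two quotes together, the failover sequence has three phases: the small internal flywheel bridges the first few seconds, the compressed-air turbine then carries the load, and once the reserve runs out (about 15 minutes, per the runtime figure cited below) the staff would shut systems down. A minimal sketch of that logic, with the timings taken from the article and everything else assumed:

```python
def power_source(seconds_since_outage: float) -> str:
    """Which source carries the load after a utility failure.

    Timings are the ones quoted in the article; the function
    itself is an illustrative sketch, not GSU's actual controls.
    """
    FLYWHEEL_BRIDGE_S = 3        # flywheel covers the first 2-3 seconds
    AIR_RUNTIME_S = 15 * 60      # ~15 minutes of compressed-air reserve

    if seconds_since_outage <= FLYWHEEL_BRIDGE_S:
        return "internal flywheel"
    if seconds_since_outage <= FLYWHEEL_BRIDGE_S + AIR_RUNTIME_S:
        return "compressed-air turbine"
    return "reserve exhausted: graceful shutdown"

print(power_source(1))     # internal flywheel
print(power_source(60))    # compressed-air turbine
print(power_source(1200))  # reserve exhausted: graceful shutdown
```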
When another project came along, GSU decided to go out and get another 50-kVA, 40-kW compressed-air UPS unit to run and protect it, Allen said.
“We also have unprotected regular power from the building—there is no generator, no UPS. That is basically for test and evaluation stuff and for noncritical projects. So if we lose power, it just kills the box,” Allen said.
The data center has a full 15 minutes of self-generated power at full load, in case of an emergency. “This means we have time to gracefully shut down our equipment as needed,” Allen said. “Knock on wood, in all the time I've been here, we have never seen a power outage longer than 5 minutes, unless it was self-induced. Then that's a whole different story.”
GSU is on the same power grid as a local hospital's trauma center, so “we're not going to see too many power outages anyway,” Allen said.
“Our compressed-air UPS gives us 15 minutes in case of an outage, which will get us past most of the standard outages. Our challenge was to get us a full 7-minute window in case of a power outage, so we're comfortable with what we now have,” Allen said. “If you have a power outage that's much longer than that, we've got a lot more problems anyway.”
Allen said that the data center, with its self-generated flywheel and compressed-air power sources, is saving a lot of money on the power bill, but that it's hard to quantify right now. The data center may be saving anywhere from 50 to 80 percent on power by using self-generating power supplies.
“The problem is, we've never been metered separately,” Allen said. “Because we're in a building, the whole building is metered. I can tell you that we have reduced our use of power in large measure, but that doesn't give us a dollar figure; it doesn't give us an amount based upon the whole [building]. We'd have to get the electrical guys to study it and come up with some figures for us, and that would take a while to do.
“Active Power tells us we should see a significant drop [in power usage] because of how efficient their equipment is, and I believe that,” Allen said. “But I don't have any numbers yet to back it up.”
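Since the center is not separately metered, any dollar figure can only be modeled. The back-of-envelope estimate below, in which the load, utility rate and baseline are all assumptions rather than GSU measurements, shows how the 50-to-80-percent range would translate into dollars:

```python
# Illustrative only: GSU's data center is not separately metered,
# so every input here is an assumption, not a measurement.
avg_load_kw = 100           # assumed average IT load
hours_per_year = 24 * 365
rate_usd_per_kwh = 0.09     # assumed commercial electricity rate

grid_cost = avg_load_kw * hours_per_year * rate_usd_per_kwh
for savings in (0.50, 0.80):
    print(f"{savings:.0%} reduction -> about ${grid_cost * savings:,.0f} per year")
```

With those assumed inputs the annual baseline is roughly $79,000, so the quoted range would mean somewhere between about $39,000 and $63,000 a year; a separate meter would be needed to replace those assumptions with real figures.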
Green is good
This whole green data center setup has been “better than what we ever thought it would be,” Allen said. “To us, this is the way we think it should be.”
Allen said that when he came on board seven years ago, he never dreamed the GSU data center would evolve into what it is today.
“When I was hired, they were looking more for organizational and management ability, rather than technical ability,” Allen said.
“So I came into this job with no pretenses, no preconceived notions about what a network op center should be. I didn't know there were certain things that were sacrilegious. I just blew all the rules and made my own rules as I went, which has really been a lot of fun because I've been given creative license. We've tried some things that didn't work and tried a lot of things that did work. We've had a really great time building this thing.”