In his article in eWEEK’s sister publication, Smarter Technology, Dennis McCafferty reports on the results of a study that shows that White House efforts at consolidating federal data centers to save money and improve operational and energy efficiency will be a tough nut to crack.
As McCafferty points out, the agencies are territorial, they don’t want to use private vendors, and relatively few federal IT managers believe in the process.
The article references a MeriTalk survey that reveals that federal managers don’t have the guidance they need to close down unneeded data centers, they don’t have operational guidance about how they’re supposed to consolidate with another agency, and they don’t have any assurance that their needs will be met if another agency is in charge of their data center. But in reality, the problem is a lot worse than the survey indicates.
First, the current discussion about data center consolidation assumes that existing federal data centers can be consolidated at all. Second, it assumes that an agency operating a consolidated data center can meet the operational needs of its tenant agencies.
Then there’s the whole question of security: right now, each agency that requires a secure computing environment and security clearances for its people is under orders to satisfy its own clearance standards, meaning a security clearance for the Department of Defense isn’t directly usable by, say, the Department of Homeland Security. Likewise, the security standards for data centers themselves differ from agency to agency.
But even that isn’t the biggest problem. The biggest problem is the morass of federal procurement rules that over the years have mandated that agencies buy from the lowest bidder. Because of this, federal data centers are heavily populated by systems that run a single application or set of applications. Frequently those systems don’t even communicate with each other, much less with systems in other agencies.
So while there are standards for federal data centers, such as support for a Unix-like user interface, that isn’t the same thing as compatibility. Unless interoperability with some other system was required by the standards set in the original procurement, there isn’t going to be any interoperability, if only because it adds to the cost.
The result is that federal IT managers are faced with a vast array of small data centers that generally perform only a few functions, serve only part of an agency’s needs and aren’t interoperable with anything else.
FAA, FBI Struggle with Data Center Consolidations
There are really only two ways to consolidate such data centers. The first is to physically move the existing systems to a new location. The second is to contract for, and then build, new systems that run on new hardware in the new location.
To accomplish either one, the move would have to be written into the agency’s budget, approved, funded through an appropriation, put out for bids and then placed under contract. Only after all of that could the consolidation take place.
Even in a fairly simple consolidation, such as within a single agency with compatible systems, the process can take years. For a major consolidation, it could easily consume a decade. Sometimes the process is so difficult that it simply can’t be done within the constraints Congress places on agencies, despite wishful thinking from the White House.
Two major system consolidations at federal agencies over the last few years illustrate the difficulties. The Federal Aviation Administration has been trying to modernize and consolidate its data systems and data centers for years. While there has been some progress in updating the computer hardware while keeping it compatible with obsolete systems, the major upgrade needed to make the FAA’s computers fully interoperable simply hasn’t happened.
The problem? There’s never been enough money appropriated when it was needed, and the procurement process has been so convoluted that actually getting an upgrade done takes years. Worse, congressional interference with the requirements that the FAA puts on its bids leads to cost overruns, and that in turn requires more money, and that money isn’t there.
The FBI also tried to consolidate its data systems so that it could fight crime more effectively. But because of the difficulty of the process, from getting a contractor to create new software to getting systems to communicate with each other, the work dragged on. Coupled with a constantly changing set of requirements, ranging from demands that the systems communicate with those at the intelligence agencies to orders to develop terrorism databases, the effort basically stalled.
So is the federal data center consolidation outlook as dismal as McCafferty’s article and the MeriTalk survey suggest? Actually, it’s probably worse. It may be impossible unless Congress frees up significant funding and allows the agencies to manage the process without meddling.
But in a down economy, it’s far too easy for lawmakers running for re-election to cut a federal data center initiative from the budget and then point to it as a trophy in their campaign materials, claiming they saved millions of dollars.
In reality, such cuts, and the meddling that goes with them, cost huge amounts of money in the long term: they waste money, require extra staff and squander enormous amounts of energy. If federal IT managers were free to consolidate the way their private-sector counterparts can, there might be hope. But in the real world of federal IT today, the White House initiative is nothing more than a fantasy.