Federal Data Center Consolidation a Virtually Impossible Task

 
 
By Wayne Rash  |  Posted 2010-07-06
News Analysis: Consolidating federal data centers takes more than just overcoming interagency jealousies. Congressional oversight and the intricacies of the budget-approval process can delay major IT projects for years.

In his article in eWEEK's sister publication, Smarter Technology, Dennis McCafferty reports on a study showing that White House efforts to consolidate federal data centers to save money and improve operational and energy efficiency will be a tough nut to crack.

As McCafferty points out, the agencies are territorial, they don't want to use private vendors, and relatively few federal IT managers believe in the process.  

The article references a MeriTalk survey revealing that federal managers lack the guidance they need to close down unneeded data centers, have no operational guidance on how they're supposed to consolidate with another agency, and have no assurance that their needs will be met if another agency is in charge of their data center. But in reality, the problem is a lot worse than the survey indicates.

First, the current discussion about data center consolidation assumes that existing federal data centers can be consolidated at all. Second, it assumes that an agency operating a consolidated data center can meet the operational needs of its tenant agencies.

Then there's the whole question of security. Right now, each agency that requires a secure computing environment and security clearances for its people is under orders to satisfy its own clearance standards, meaning a security clearance for the Department of Defense isn't directly usable by, say, the Department of Homeland Security. Likewise, the security standards for data centers differ from agency to agency.

But even that isn't the biggest problem. The biggest problem is the morass of federal procurement rules that, over the years, have mandated that agencies buy from the lowest bidder. Because of this, federal data centers are heavily populated with systems that run a single application or set of applications. Frequently those systems don't even communicate with each other, much less with systems in other agencies.

So while there are standards for federal data centers, such as support for a Unix-like user interface, this isn't the same thing as compatibility. Unless interoperability with some other system was required by the standards set in the original procurement, there isn't going to be any interoperability, if only because it adds to the cost.

The result is that federal IT managers are faced with a vast array of small data centers that generally perform only a few functions, serve only part of an agency's needs and aren't interoperable with anything else.



 
 
 
 
Wayne Rash is a Senior Analyst for eWEEK Labs and runs the magazine's Washington Bureau. Prior to joining eWEEK as a Senior Writer on wireless technology, he was a Senior Contributing Editor and previously a Senior Analyst in the InfoWorld Test Center. He was also a reviewer for Federal Computer Week and Information Security Magazine. Previously, he ran the reviews and events departments at CMP's InternetWeek.

He is a retired naval officer, a former principal at American Management Systems and a long-time columnist for Byte Magazine. He is a regular contributor to Plane & Pilot Magazine and The Washington Post.