Standardizing the Federal Desktop

 
 
By Larry Seltzer | Posted 2008-02-03
Opinion: It's hard to argue with the Federal Desktop Core Configuration, but you have to sympathize with the poor IT guys who have to implement it.

The latest in security compliance regulation is the FDCC, or Federal Desktop Core Configuration.

As of Feb. 1, all federal agencies were required to provide the Office of Management and Budget with a list of assets running Windows XP and Windows Vista, along with which of those machines comply with FDCC requirements. By March 31, they must submit a technical report on how they will bring the rest into compliance.

This began with an OMB memo last March stating that federal agencies with Windows XP and Windows Vista systems would have to adopt security configurations for those systems, based on standards developed by NIST (National Institute of Standards and Technology). The standards cover the usual ground: not running with administrative rights, which ports may be open, and so on.
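To give a feel for what checking settings like these involves, here is a minimal sketch in Python, not an official FDCC audit. The particular registry keys and required values are illustrative assumptions on my part; the actual baseline is published by NIST.

```python
# A minimal sketch of the kind of setting check FDCC implies.
# The key paths and expected values below are illustrative assumptions,
# not the official NIST baseline. Windows-only (standard-library winreg).
import winreg

# Hypothetical subset of checks: (registry path, value name, required value).
CHECKS = [
    # Require the built-in Windows Firewall to be enabled.
    (r"SYSTEM\CurrentControlSet\Services\SharedAccess\Parameters"
     r"\FirewallPolicy\StandardProfile", "EnableFirewall", 1),
    # Disallow network logons with blank passwords.
    (r"SYSTEM\CurrentControlSet\Control\Lsa", "LimitBlankPasswordUse", 1),
]

def check_setting(path, name, required):
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
            value, _type = winreg.QueryValueEx(key, name)
    except OSError:
        return False  # a missing key or value counts as non-compliant
    return value == required

for path, name, required in CHECKS:
    status = "PASS" if check_setting(path, name, required) else "FAIL"
    print(f"{status}  {name} ({path})")
```

Multiply a toy script like that by the hundreds of settings in the real baseline and you start to see the scale of the job.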

Wouldn't it be great if such standards could actually be established and enforced? It would make testing and configuring applications much easier, and Microsoft's deployment tools really do help roll out such configurations.

But federal agencies don't turn on a dime for rules like this. Look at their generally dismal performance complying with FISMA (Federal Information Security Management Act), which only involves reporting, not actual reconfiguration of systems. Many of these agencies don't even know for sure what assets they have.

I spoke with Amrit Williams, CTO of BigFix, which sells compliance testing tools for FDCC and other standards. Williams likes the idea of IT taking a proactive approach to securing systems, as opposed to the usual reactive "scan and patch" approach. What's not to like? But it's happening in a rushed way, Williams said. I have to agree. As much as I like these standards and the idea of standardizing security configurations, it just isn't going to happen as soon as OMB wants it to.

Many tools claim to scan for FDCC compliance, but none is yet certified for the purpose. Strictly speaking, the missing certification is for SCAP (Security Content Automation Protocol), a standards-based system for scanning systems for compliance with various regulations, FDCC among them. Either way, it's another sign of how complicated and immature the whole system is.
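SCAP scanners typically emit results as XCCDF, an XML checklist format. As a rough sketch of what consuming those results looks like, here is a Python snippet that tallies pass/fail rule results; the file name is hypothetical and I'm assuming XCCDF 1.1 content, since real tools and namespaces vary.

```python
# A rough sketch of reading an XCCDF result file, the XML format SCAP
# tools emit after a compliance scan. The input file name and the
# XCCDF 1.1 namespace are assumptions; real content varies.
import xml.etree.ElementTree as ET
from collections import Counter

NS = {"xccdf": "http://checklists.nist.gov/xccdf/1.1"}

tree = ET.parse("fdcc-scan-results.xml")  # hypothetical scanner output
counts = Counter()
for rule_result in tree.getroot().iter(f"{{{NS['xccdf']}}}rule-result"):
    result = rule_result.findtext("xccdf:result", namespaces=NS)
    counts[result] += 1
    if result == "fail":
        print("FAIL:", rule_result.get("idref"))

print(dict(counts))
```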

There are lots of tools out there to test the security of your systems and their compliance with various requirements, but no clear way for you to meet your legal obligations. And if you were to commit your agency to a specific path to compliance, you'd be committing to an imposing amount of work; agencies are struggling just to do the automated scanning, and there are manual checks that must be done as well.
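Even the bookkeeping is nontrivial: someone has to merge automated scan output with manually attested checks into a per-asset compliance picture. A sketch of that merge, assuming made-up CSV files and column names, might look like this:

```python
# A sketch of the bookkeeping problem: merging automated scan results
# with manually attested checks into one per-asset compliance view.
# The file names, column names, and CSV layout are all assumptions.
import csv
from collections import defaultdict

status = defaultdict(dict)  # host -> check id -> "pass"/"fail"

for source in ("automated_scan.csv", "manual_checks.csv"):
    with open(source, newline="") as f:
        for row in csv.DictReader(f):  # expected columns: host, check, result
            status[row["host"]][row["check"]] = row["result"]

for host, checks in sorted(status.items()):
    failing = [c for c, r in checks.items() if r != "pass"]
    verdict = "compliant" if not failing else f"non-compliant ({len(failing)} findings)"
    print(f"{host}: {verdict}")
```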

Of course security is always job 1, blah blah blah, but on a day-to-day basis, what people expect from IT is to deliver services and keep systems running. In the long term, secure and consistent configurations help do that. In the short term, however, requirements like this get in the way.

When Visa sets rules for PCI compliance and merchants or processors don't comply, especially if there's a scandal, there are consequences: people lose their jobs, I'd bet; Visa fines vendors, sometimes anyway; and some companies get into real trouble (think CardSystems).

What's going to happen to agencies that don't comply with FDCC in time? A memo will be written by OMB, a congressman will express disappointment, and business as usual will go on (unless the new administration appoints an IT security czar). But if they fund it, it will happen eventually.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.

For insights on security coverage around the Web, take a look at eWEEK.com Security Center Editor Larry Seltzer's blog Cheap Hack.

 
 
 
 
Larry Seltzer has been writing software for, and English about, computers ever since, much to his own amazement, he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at Desktop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that predated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs) developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.
 
 
 
 
 
 
 
