Metrics Blastoff

By Matthew Hicks  |  Posted 2001-12-17

That's exactly where NASA officials started after receiving the bad news from audits in 1998. Nelson and other NASA managers carried out their own six-month evaluation of the agency's IT security program. That led to the identification of five overarching goals. Those, in turn, allowed Nelson and the CIOs and IT security managers to define metrics both across the agency and within particular divisions.

Those security goals identified by NASA officials were: ensure that employees understood their security responsibilities; keep system vulnerabilities—such as known but unpatched security holes—to a minimum; be able to thwart intrusion attempts; effectively manage authentication of users and system access; and maintain effective security policies.

Through a series of brainstorming sessions and workshops among as many as 30 people—including Nelson and his agency-level IT security staff as well as divisional CIOs and security managers—NASA came up with metrics to go with each goal.

To measure whether NASA is meeting its goal of increasing employee understanding of their security responsibilities, for example, the agency this year is requiring that at least 90 percent of employees at each center undergo training in IT security awareness. To deal with system vulnerabilities, each center needs to reduce known security gaps to no more than one for every five hardware systems tested. (In all, NASA has 89,500 hardware systems, such as desktops and servers.)
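The two thresholds described above reduce to simple pass/fail arithmetic. The following Python sketch shows how a center's numbers might be checked against them; the function names and sample figures are hypothetical illustrations, not NASA's actual tooling.

def meets_awareness_goal(trained, total_employees, threshold=0.90):
    # At least 90 percent of a center's employees must complete
    # IT security awareness training.
    return total_employees > 0 and trained / total_employees >= threshold

def meets_vulnerability_goal(known_gaps, systems_tested):
    # No more than one known security gap for every five hardware
    # systems tested at the center.
    return systems_tested > 0 and known_gaps * 5 <= systems_tested

# Example (invented figures): 1,840 of 2,000 employees trained,
# 50 known gaps found across 200 tested systems.
print(meets_awareness_goal(1840, 2000))   # True  (92% is at least 90%)
print(meets_vulnerability_goal(50, 200))  # False (50 gaps exceeds 200 / 5 = 40)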

At NASA, security metrics are anything but static. Over the past three years, while the agency's security goals have remained the same, the metrics that NASA tracks have changed each year as IT and other managers have learned more about what makes a good metric, Nelson said.

NASA has struggled, for example, to develop a good way to measure how well it's thwarting intrusion attempts, since it is difficult to define what qualifies as an attempt. For now, NASA conducts trial runs of its procedures for notifying key personnel in the event of a hack attack and measures how many are reached and in what time frame, Nelson said. But, he said, the agency is hoping to come up with something better.
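The drill described here, notifying key personnel and recording who responds and how fast, lends itself to two simple figures: the percentage of people reached and the slowest response time. The sketch below is a hypothetical illustration of that calculation; the roles, times and log format are invented, not NASA's procedure.

from datetime import datetime, timedelta

# Invented drill log: when each key contact acknowledged the alert,
# or None if that person was never reached during the exercise.
drill_start = datetime(2001, 11, 5, 9, 0)
acknowledgements = {
    "center security manager": drill_start + timedelta(minutes=4),
    "incident response lead": drill_start + timedelta(minutes=11),
    "network operations": None,
}

reached = [t for t in acknowledgements.values() if t is not None]
pct_reached = 100 * len(reached) / len(acknowledgements)
slowest = max(t - drill_start for t in reached) if reached else None

print(f"Reached {pct_reached:.0f}% of key personnel; slowest response: {slowest}")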

While there are no complete standards on which enterprises can rely to decide what security metrics to track, organizations don't have to begin completely from scratch. They can borrow ideas from the best practices, methodologies and benchmarks of several security-related standards that already exist. Although none of these represents a one-size-fits-all metrics guide, they do include standards for judging the effectiveness of security vendors' products such as firewalls and routers.

Among these standards are the so-called Common Criteria, created by the meshing of government standards work in North America and Europe, and ICSA Labs certification from TruSecure Corp., of Herndon, Va. At a more granular level, the Center for Internet Security, a nonprofit organization whose members range from vendors to large end-user corporations such as Caterpillar Inc. and Hallmark Cards Inc., has created benchmarks for configuring operating systems and is working on standards for implementing firewalls, routers and VPNs (virtual private networks). A broader standard is the International Organization for Standardization's ISO 17799 document, which outlines general best practices that companies should follow.

For the most part, however, these standards offer only a starting point. In most cases, IT managers will need to work closely with business managers to develop specific security metrics that fit their enterprises. That's what managers from the consulting group within DuPont's global services organization have been doing since the mid-1990s.

Working with line-of-business officials, the group, under Robert George, the benchmarking program manager who oversees information security, defined security goals in four key dimensions: financial, customer, internal and learning. The group then developed metrics to measure progress in each. Results are reported in the form of a balanced scorecard, a management tool that matches business objectives with metrics that can measure performance.

Take the financial dimension, for instance. DuPont's objective is to improve shareholder value. To determine the extent to which its security efforts are achieving that goal, the Wilmington, Del., company measures the percentage of IT costs associated with security. It seeks to be within the top quartile of peer companies in security spending as a percentage of IT spending, George said. Similarly, in the learning dimension, George tracks the percentage of employees who undergo training on security issues through the DuPont Information Security Organization University, with a goal of 100 percent.
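A scorecard of this kind can be represented as metrics grouped under the four dimensions, each with a target and a measured value. The sketch below is a minimal, hypothetical encoding using only the two metrics mentioned in the article; the structure and the sample values are assumptions for illustration, not DuPont's actual scorecard.

# Hypothetical balanced-scorecard structure; sample values are invented.
scorecard = {
    "financial": [
        {"metric": "security spending as % of IT spending",
         "target": "top quartile of peer companies",
         "actual": 4.2},
    ],
    "customer": [],   # dimensions named in the article; metrics not detailed
    "internal": [],
    "learning": [
        {"metric": "% of employees trained through the security university",
         "target": 100.0,
         "actual": 88.0},
    ],
}

for dimension, metrics in scorecard.items():
    for m in metrics:
        print(f"{dimension}: {m['metric']} = {m['actual']} (target: {m['target']})")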

But NASA and DuPont are among the few organizations taking comprehensive internal steps to formally measure the effectiveness of IT security, experts say. Most, instead, are only beginning to take initial steps to better validate their IT security by seeking certifications of their security practices from outside service providers and consultants, implementing evolving best practices or tracking compliance with policy.

One such company is managed hosting provider Cervalis. The company is attempting to validate its security measures for its peace of mind and to assure its customers. Cervalis, which opened its first data center in May, had TruSecure audit its security practices by judging its policies and attempting to penetrate its infrastructure. In November, Cervalis received the TruSecure Service Provider certificate.

Cervalis will continue to undergo reviews by TruSecure to maintain its certification, said Eddie Rabinovitch, vice president of network engineering for Cervalis, in Wappinger Falls, N.Y. Cervalis is also following the standards set forth by the CIS. Rabinovitch has used the center's benchmarks for securely implementing Solaris and Windows 2000 and plans to follow other guidelines as they come out.

While such certifications provide validation that certain security processes exist at a company, they shouldn't be seen as a substitute for a rigorous program of tracking security metrics, experts say. For one thing, certifications and audits performed by consultants, unlike internally created metrics programs, are often one-time events and can't be used to improve performance over time. For another, they're often very subjective.

"There are no units of measure associated with [an audit]," said Ron Knode, global director of managed security services at Computer Sciences Corp., in El Segundo, Calif. "It is reassuring for auditors to say [you pass], but what is the standard? The standard is in the eye of the beholder."

At both NASA and DuPont, the metrics programs have helped communicate IT security needs and readiness to upper management. At NASA, Nelson collects metrics every quarter from the CIOs and security managers at the agency's centers and reviews them with the agency's overall CIO. Every year, a report is created that NASA's upper management reviews; it is presented to the agency's inspector general and shared with the GAO to satisfy federal requirements.

At DuPont, George shares the results of his balanced scorecard, as well as a report on security incidents, losses and expenses, with the corporation's senior executive committee, including the CEO, the CIO, key IT partners and the company's audit committee.

Such sharing of hard security metrics can even help boost spending on IT security. At NASA, even though security spending is determined by managers of individual programs rather than by IT managers, spending on IT security has almost doubled since NASA began the metrics effort, reaching 5 percent of overall IT spending this year, Nelson said.

DuPont has also seen its IT security budget increase but wouldn't say by how much.

Metrics alone won't solve all security woes, but NASA, for example, has made substantial progress. In the most recent computer security report card, released last month by the U.S. House of Representatives Subcommittee on Government Efficiency, Financial Management and Intergovernmental Relations, NASA jumped a full letter grade from a year ago and recorded the third-highest grade among the federal agencies rated. The bad news: It barely received a passing mark, a C-, leaving plenty of room for improvement.

Nelson is confident that the agency can keep improving. Just as every mission in space teaches scientists and astronauts something new, each report on security metrics helps NASA refine its plans for improvement. "Good metrics demonstrate your progress, and they push you toward [more] progress," Nelson said.

Matthew Hicks is an online reporter covering fast-changing developments in Internet technologies, including the growing field of Web conferencing software and services. With eight years as a business and technology journalist, Matt has gained insight into the market strategies of IT vendors as well as the needs of enterprise IT managers. He joined Ziff Davis in 1999 as a staff writer for the former Strategies section of eWEEK, where he wrote in-depth features about corporate strategies for e-business and enterprise software. In 2002, he moved to the News department at the magazine as a senior writer specializing in coverage of database software and enterprise networking. Later that year, Matt began a yearlong fellowship in Washington, D.C., after being awarded an American Political Science Association Congressional Fellowship for Journalists. As a fellow, he spent nine months working on policy issues, including technology policy, for a Member of the U.S. House of Representatives. He rejoined Ziff Davis in August 2003 as a reporter dedicated to online coverage. Along with Web conferencing, he follows search engines, Web browsers, speech technology and the Internet domain-naming system.
