In the controversy over IBM's sale of its PC business to a company in China, we find an important message about the changing nature of enterprise information security. We used to get along with simple hierarchies of trust, providing a straightforward increase in privilege—from 0 to 100 percent—as we moved from the outside in. What's needed now are much more complex structures supporting multilateral partnerships, with different parties having access to overlapping sets of information assets.
Most security technologies, and almost all actual security deployments, fail to provide the needed precision and flexibility for assigning the right privileges to the right parties. Some old but obscure ideas need new implementations. It needs to be much easier for people to match the architecture of security to the activities that take place within that framework.
The view of security as an either-or proposition can manifest itself in many ways. The first time we saw the proposed floor plans for our old testing lab near the San Francisco airport, the architect knowingly said, "And here's the beta room." He knew enough to include an area with limited access, not visible to staff who were not party to nondisclosure agreements but who had access to the general testing area.
Even so, he apparently didn't realize that we might have prerelease, confidential testing in progress with more than one company at a time and that those companies' representatives would need to be kept from seeing one another's equipment.
We wound up with three beta rooms, rather than one. But many IT security plans are like the original floor plan: They do just enough to be worse than doing nothing at all.
By that I mean that when a system is clearly insecure, people are careful. If you knew that your office area was wide open to any visitor, you'd make sure your whiteboard wasn't visible from the hall. You'd avoid loud conversations about sensitive subjects. For that matter, if you knew that your e-mail was being scrolled across the news ticker in New York's Times Square, you'd write only in indirect terms about anything you didn't want to share with the world.
But when systems seem secure, guarded by much-ballyhooed technology with lots of nuisance rules about signing for keys and updating passwords and the like, people's behavior may become much more careless. If you're working in a special-projects area, why not leave that document on your desk? If your e-mail is encrypted, why not mention the product's actual name and ship date in a message?
A system that might otherwise be defended in depth, by the awareness and caution of the people in it, will become vulnerable to single points of failure. One lost master key, or one easily guessed administrator password, and the whole system is blown wide open.
It doesn't require encryption breakthroughs to let overlapping groups hold secrets in common while keeping those secrets carefully guarded from everyone else. We just celebrated the 25th anniversary of MIT professor Adi Shamir's 1979 paper "How to Share a Secret." In that paper, Shamir—the "S" in "RSA"—explained the ease of devising encryption schemes that let any sufficiently large subset of a larger group unlock content or exercise other privileges.
In financial operations, any three of a group of five corporate officers might be enabled to digitally sign a check for payment. An IT department might require two members of a group of administrators to authorize certain major operations instead of just using a single administrative password.
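A threshold scheme of the kind those examples describe can be sketched in a few lines. The following is a minimal, illustrative implementation of Shamir's (k, n) secret sharing over a prime field—the prime, the function names and the toy secret are my own choices, not anything from Shamir's paper—showing how any k of n shares reconstruct the secret while fewer reveal nothing:

```python
import random

# Arithmetic is done modulo a prime larger than any secret we will split.
# This Mersenne prime is an arbitrary illustrative choice.
PRIME = 2**127 - 1

def make_shares(secret, k, n, prime=PRIME):
    """Split `secret` into n shares; any k of them suffice to recover it."""
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def eval_poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner's method
            acc = (acc * x + c) % prime
        return acc
    # Each share is a point (x, p(x)) on the polynomial, x = 1..n.
    return [(x, eval_poly(x)) for x in range(1, n + 1)]

def recover_secret(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        # pow(den, -1, prime) is the modular inverse (Python 3.8+).
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

# Example: the "any three of five corporate officers" case.
shares = make_shares(987654321, k=3, n=5)
assert recover_secret(shares[:3]) == 987654321
assert recover_secret([shares[0], shares[2], shares[4]]) == 987654321
```

Any three shares, in any combination, reconstruct the secret; two or fewer are consistent with every possible secret, which is what makes the scheme information-theoretically safe rather than merely hard to break.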
But it's not enough to be able to lock things up. Bill Reinsch, president of the Washington-based National Foreign Trade Council, observed in comments to internetnews.com last month that "companies doing business overseas like IBM have extensive experience in blocking access to their property." Absolutely. It's not hard to put stuff out of reach—but in the process, cost and complexity arise that get in the way of doing useful things.
The need now is to grant specific access rights to many different partners, confident that you're showing them only what you mean to share.
Technology Editor Peter Coffee can be reached at firstname.lastname@example.org.