Peter Coffee: National Academy's recommendation that vendors face more liability for security flaws is wrong-headed.
Perhaps you've heard the story about the physicist, the engineer, the economist and the can of beans. The three find themselves stranded on an island with no other food and with limited tools, and they're discussing their options. "We could heat the can until it explodes," suggests the physicist. "We'd need some kind of enclosure, or we'd lose the beans," observes the engineer. The physicist is applying a useful principle but overlooking the point that the goal is to get the beans, not merely to open the can; the engineer keeps the essential purpose in mind.
I thought of this story when I read the National Academy of Sciences report, "Cybersecurity Today and Tomorrow: Pay Now or Pay Later."
You won't find new technical information on exploits or countermeasures in this concise document; rather, it's a 48-page treatise that uses recent events (including the 9/11 attacks) as context for an overview of past reports that are anywhere from two to 10 years old.
Isn't it depressing that such well-aged observations are still points well worth making in security discussions?
The document provides a useful strategic overview of security issues, in terms that non-IT management can readily understand. Particularly telling is its comment on the failure of the federal government's "Orange Book" criteria to elevate the security of federal IT practices: "The government demanded secure systems, industry produced them, and then government agencies refused to buy them because they were slower and less functional than other nonsecure systems available on the open market."
If the document makes a single most important point, it's that rational IT buyers generally pass over secure systems because those systems are, almost invariably, later to market with less capability than competing systems that are built from off-the-shelf components and configured for maximum function rather than minimum risk.
I part company with the study, though, when it consequently recommends that "policy makers should...increase the exposure of software and system vendors and system operators to liability for system breaches..." This is a dangerous path. Existing product liability law recognizes three different standards to which a seller of goods may be held: breach of warranty, negligence and strict liability. I don't believe we want to move in the direction of narrowing buyer choice on any of these grounds.
It's easy to say, for example, that a Web server software product carries an implicit warranty of fitness for use in the threat environment of the Internet, but I don't believe we want all Web server products to be built to the highest achievable level of security. There are applications, such as physically isolated intranets or internal document servers, in which the administrative costs and performance overheads of state-of-the-art security are out of proportion to the business risk. People should be able to buy products that suit their needs.
Likewise, strict liability has long been defined as the appropriate standard for products whose use involves unacceptable downside risks, especially when the buyer's own testing or modification can't reasonably mitigate those risks.
But as the report itself observes, "operational security can only be maintained by systematic and independently conducted red team attacks and correction of the defects that they reveal" (emphasis in the original). IT products are configured in so many different ways and used in so many different environments that I would argue it's inappropriate to apply strict liability.
That leaves negligence as the only practical basis for making a claim against an IT vendor. But negligence is a moving target, defined by the pace of technical development and the speed with which a competent practitioner can stay abreast of new threats. I don't want to see this threshold defined by legislators. Do you?
It's time for the punch line of my opening story. The physicist and the engineer turn to the economist for any additional suggestions. After a pause, the economist says, "Suppose we assume a can opener."
Let's not wave our hands and say, "IT buyers do stupid things, so let's assume we can create legal or supra-market incentives that force them to make better decisions." If legislators or insurance companies knew how to make IT secure without destroying competitive advantage, they'd be funding startup companies.
Our only real-world choice is to learn together.
E-mail eWEEK Technology Editor Peter Coffee