Chicken Little Law

 
 
By Peter Coffee  |  Posted 2003-07-14

Disclosure legislation overwarns, underprotects.

As of July 1, California's Security Breach Information Act mandates disclosure of data security incidents. By promising more than it delivers, by inducing more customer concern than confidence and by ultimately making people less sensitive than they are now to warnings of data insecurity, the act is likely to demonstrate the law of unintended consequences—unless IT professionals add their judgment to its inadequate mandates.

At a minimum, the prudent IT administrator will take the necessary steps to avoid embarrassing and costly violations of this law—and also of its possible imitations in other states or at the federal level.

The act requires any entity doing business in California "that owns or licenses computerized data that includes personal information" to disclose "any breach of the security of the data ... to any resident of California whose unencrypted personal information was, or is reasonably believed to have been, acquired by an unauthorized person." Use of encrypted data storage confers complete protection from penalties under this law, in itself not a bad thing: It's high time that data custodians took responsibility for defense in depth of the sensitive information in their charge. But it would be a setback for IT security if this exemption became the codification of good and sufficient care.

To be sure, Internet users have for too long been led to believe that online transaction security is adequately protected by link encryption. People have been taught to look for the padlock icon or other indicator that a browser displays when engaged with an e-commerce site as their assurance of safety. About 10 years ago, a Purdue University professor, Eugene Spafford, put that misconception in its place with his comment that "using encryption on the Internet is the equivalent of using an armored car to deliver credit card information from someone living in a cardboard box to someone living on a park bench."

It's clearly a waste of time to secure the link between two points if the end points are themselves unprotected. Opportunistic theft of customer or citizen data, by someone with simple physical access to a server with a 3.5-inch floppy drive, is the kind of thing that data-store encryption is likely to discourage—and that, to be fair, this law may therefore actually reduce.

Despite a search of the legislation's text, though, I can find no statement of the quality of crypto that's needed to earn exemption from the law. Weak algorithms, poor implementations of good algorithms and poorly administered deployments of even robust crypto products are equally hollow in their promises of protection. If the password is the administrator's mother's maiden name, the key length doesn't much matter—but under this California law, at least until case law says differently, it looks to me as if the data is still considered "encrypted."
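The point about key length is easy to demonstrate. In this minimal sketch (the passphrase "smith", salt and iteration count are all illustrative assumptions, not a recommended configuration), a 256-bit key is derived from a guessable secret, and a four-word dictionary recovers it regardless of the key's length:

```python
import hashlib

# Illustrative values only: a real deployment needs a random per-record
# salt and a tuned iteration count.
SALT = b"example-salt"
ITERATIONS = 100_000

def derive_key(passphrase: str) -> bytes:
    """Derive a 32-byte (256-bit) key from a passphrase via PBKDF2-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), SALT, ITERATIONS)

# The data custodian "encrypts" with a key derived from a weak secret --
# here, a stand-in for the administrator's mother's maiden name.
actual_key = derive_key("smith")

# An attacker with a short list of likely secrets wins anyway: the
# 256 bits of key material protect nothing if the passphrase is guessable.
dictionary = ["jones", "smith", "garcia", "nguyen"]
recovered = next(
    (word for word in dictionary if derive_key(word) == actual_key), None
)
print(recovered)
```

The key derivation is deterministic, so trying each dictionary word and comparing derived keys finds the weak passphrase immediately—which is exactly why a statute that asks only "was it encrypted?" measures the wrong thing.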

Don't let yourself become the test case that defines lack of reasonable care. Passwords of reasonable length, meeting reasonable tests of complexity and frequency of change, aren't the leading edge of security practice. Don't let their lack become the cutting edge that decapitates your credibility with customers and supply chain partners.
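What "reasonable length" and "reasonable tests of complexity" mean in code might look like the sketch below. The 10-character minimum and the four character classes are assumptions for illustration, not thresholds drawn from the statute or from any named standard:

```python
import re

# Assumed baseline threshold, not a statutory requirement.
MIN_LENGTH = 10

def meets_baseline(password: str) -> bool:
    """Check a hypothetical baseline: reasonable length plus a mix of
    character classes. This is table stakes, not leading-edge practice."""
    if len(password) < MIN_LENGTH:
        return False
    required_classes = [
        r"[a-z]",          # lowercase letter
        r"[A-Z]",          # uppercase letter
        r"[0-9]",          # digit
        r"[^A-Za-z0-9]",   # punctuation or other symbol
    ]
    return all(re.search(pattern, password) for pattern in required_classes)

print(meets_baseline("smith"))           # too short, one character class
print(meets_baseline("Br34ch-N0tice!"))  # long enough, all four classes
```

A check this simple costs almost nothing to enforce at account creation, which is the point: its absence, not its presence, is what would look like lack of reasonable care.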

At the same time, I'm concerned that mandated disclosures of possible data theft will become a constant assault on user confidence. In California, this is already the norm: When you walk into a restaurant, you're greeted by an appetite-suppressing warning that "Chemicals Known To The State Of California To Cause Cancer, or Birth Defects or Other Reproductive Harm May Be Present In Foods or Beverages Sold or Served Here." Similar warnings appear at any number of public establishments where carcinogenic chemicals are used—even if only in nonthreatening amounts. Everyone sees these signs, and no one notices them anymore. It looks as if the state is headed down that road again.

Warning people too often, without a context for balancing risks and benefits, is as bad as failing to warn them at all—unless your goal is to pass a law, the purpose of which is to make it look like you've done something. IT professionals must provide the sober perspective that no useful system can be totally secure.

Peter Coffee can be reached at peter_coffee@ziffdavis.com.

 
 
 
 
Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA and Chapman College.
 
 
 
 
 
 
 
