I feel as if I could get an entire year's worth of columns, or perhaps even build my next career, out of the material in a Task Force report issued at the beginning of this month by the National Cyber Security Partnership.
The NCSP formed in response to last year's White House National Strategy to Secure Cyberspace, creating five task forces, including one on "Security Across the Software Development Life Cycle." It is that group's 123-page report that could keep my next several dozen weekly columns occupied.
Coincidentally, the Task Force report emerges on the heels of Richard Clarke's reappearance on the national scene as a voice for the urgent need to treat computer systems security as a national priority. Some have dismissed Clarke's concern with this subject toward the end of his time in the present Bush administration as a sign that he was unaware of worse threats to the nation, but he may wind up looking smarter than his critics on this point. The situation is not merely serious; it is one that will take a long time to correct.
Especially interesting in the Task Force report, and something I've not seen before, is a sober and objective appraisal of just how poorly we train software developers in the discipline of writing software to be secure. "Few people would accept medical treatment," the report opines, "from practitioners who were originally economics graduates, operated on people in their spare time, and went through a rapid training program to become doctors. But as a nation, the United States has taken exactly this position with regard to engineering software systems that run critical infrastructures upon which many lives depend."
The report does acknowledge the multidisciplinary nature of software security, noting that "While a doctoral level mathematician with expertise in number theory can do some limited work in theoretical cryptography and protocol analysis, and a person with a doctorate in computer engineering with a specialization in computer architecture can design new structures to support operating system enhancements, these and many other sub-specialties are required for the systems level understanding required to meet requirements for high surety systems."
This breadth, the report argues, means that "there is a need for a national community of professors with the combined understanding of these issues and the collaborative structure required to apply these experts in concert." A commitment to have just one such research professional in every U.S. state, with support for graduate students, conference participation and related costs, would represent a national initiative costing some $50 million per year, estimates the report; that's cheap compared to the cost of just one significant, large-scale security breach.
To those who argue that software as a discipline evolves too quickly to be tracked by a formal educational infrastructure, the report points out that "for the last 30 years, India has put forth a concerted effort to provide high quality university education in software design to their young people. As a result, India produces programmers that make fewer errors per line of software code than programmers trained in the United States." High-quality software can certainly be insecure, but it seems to me that low-quality software cannot possibly be secure no matter what mechanisms it attempts to use.
Some observers address the Task Force report at much briefer length than my imagined year's worth of columns. Ian Grigg is the Australian co-founder of Systemics Inc., described by its Web site as "a technology company specialising in e-payments and financial cryptography." The company appears to have a deliberately obscured location, with its "Contact Us" Web page offering no physical address and advising interested parties that questions "should be sent to admin at the normal place." The company also has a colorful history that sounds more drawn from paperback fiction than from paperless banking, with business disputes involving lurid reports of seduction as a tool of a partner firm's business development practices.
Be that as it may, Grigg's "Financial Cryptography" Weblog calls the Task Force report on life-cycle software security "a scary document." He calls the report's recommendations a collection of "calls to certify this, verify that, and measure those"; in particular, he takes the report to task for its dismissive statement (on page 6, which is the eighth page of the Task Force PDF document hyperlinked near the top of this column) that "No processes or practices have currently been shown to consistently produce secure software."
One might note that the Task Force is co-chaired by one Microsoft staffer and includes two others as members; the fact that much of the world's most popular software is obviously being produced by insecure practices does not prove that no secure practices exist. Research conducted by @stake Inc. has documented the effects of design-time choices, showing their potential for a fourfold reduction in application vulnerability.
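Neither the @stake research nor the Task Force report is summarized here in technical detail, but a canonical example of the kind of design-time choice at issue is the decision to use parameterized database queries rather than string-built SQL. The sketch below is illustrative only, assuming a trivial SQLite table; it is not drawn from the @stake study itself.

```python
import sqlite3

# Set up an in-memory database with one sample record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Vulnerable design: concatenating input into the SQL string
    # lets a crafted value rewrite the query (SQL injection).
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name):
    # Safer design-time choice: a parameterized query keeps code and
    # data separate, so the input is always treated as a literal.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # injection matches every row
print(find_user_safe(payload))    # matches nothing
```

The point is that the vulnerability is eliminated by an architectural decision made before any testing occurs, which is exactly the class of practice whose existence the report's dismissive statement seems to overlook.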
But Grigg is up against credible experts like Gary McGraw, CTO at Cigital Inc., who assisted in writing the Software Process Subgroup portion of the Task Force report that includes the statement to which Grigg takes exception. While McGraw has certainly suggested approaches to elevate the security of the software development process, it is not clear that any organization has yet succeeded in building its process on those foundations.
Is your enterprise the existence proof, or can it become one soon?
Tell me what you wish we could learn about secure software processes at firstname.lastname@example.org.