There's More Than One Monoculture

Opinion: The pursuit of platform diversity could lead to increased security problems if the human factor remains unchanged.

Risks due to software monoculture were accurately identified in 2003, when computer security pioneer Dan Geer and his co-authors issued their controversial report on the risks of pervasive deployment of Windows.

There's more than one kind of monoculture, though, and the prevalence of any single operating system—insecure or otherwise—may be the least dangerous kind.

In November 1988, four weeks after Robert Tappan Morris launched the Internet worm, Purdue University computer science professor Eugene Spafford published his analysis of the attack—offering, as he said in the abstract of that paper, "a review of the security flaws exploited by the worm program." Note: "flaws" plural.


What Spafford's paper described was a monoculture, not of software but of pervasive carelessness among application developers, system administrators and users—carelessness that persists today.

The human factors of computer security would at best be unaffected, and most likely would be worsened, in an IT environment made deliberately more complex by a well-intentioned campaign for platform diversity.

Their cup overflowed

Spafford's 1988 paper observed, for example, that the Morris worm relied on a buffer overflow attack against a common utility used to respond to routine inquiries about the log-in status of a user. By sending a carefully crafted string of bytes in the form of an inquiry to that utility program, the worm was able to overwrite code on the target machine with a different series of instructions. These granted the worm a connection to a remote command shell on the target machine.

This mode of attack had already been known for two decades at the time that Morris used it, but despite nearly two more decades of experience, buffer overflows continue to appear.

Overflow attacks are hindered in modern systems, to be sure—by the hardware-enforced distinction between data and code in the newest generation of microprocessors and by the software-enforced injection of security signatures that detect unexpected alteration of code at the time that a program runs.

Even so, developers who are under pressure to achieve peak performance will continue to be tempted to use highly efficient but unsafe coding mechanisms—including some that disable automatic buffer-overflow detection—that are still explicitly and intentionally provided in languages such as C++.

Sending a simple message

Developers may create other vulnerabilities through lack of concern for usability during software design. The Morris worm was able, for example, to use the vital and normally benign sendmail utility as a vehicle for attack because that utility was so often installed with its debugging capabilities enabled. This was done, Spafford found, because the intended means of configuring and controlling sendmail were so frustrating and awkward to use.


"The worm would issue the DEBUG command to sendmail and then specify a set of commands instead of a user address as the recipient of the message," Spafford's analysis explained. "Normally, this is not allowed, but it is present in the debugging code to allow testers to verify that mail is arriving at a particular site without the need to activate the address resolution routines. The debug option of sendmail is often used because of the complexity of configuring the mailer for local conditions."

There are two messages here. First, too much software is too complex to understand and use. Second, both system administrators and end users have many reasons to leave things alone once a system is working well enough to do its job. The security mindset of preventing what's not wanted, as well as enabling what's wanted, is both difficult to teach and time-consuming to enforce in the field.

When code is released without adequate attention to making it understandable after deployment, administrators and users are likely to steer clear of doing anything that might backfire—for example, adjusting site-specific parameters that developers thought would enable precise control of the trade-offs among capability, convenience and security. The result, more often than not, will be that security settings are relaxed until things work, at which point the administrator or user will declare victory and move on.

In passing

Finally, the simplest enabler of the Morris worm's attack was the ease with which the worm was able to crack password protections due to combinations of obvious password choice and inadequate security infrastructure. As long as systems rely on inexpensive but user-dependent measures such as passwords, rather than more reliable means such as security tokens and biometrics, even the most secure operating system will be a 2-inch steel door secured by a $2 padlock.

Making the IT environment more complex will merely make the human side of the security problem worse.

Peter Coffee can be reached at
