Kill Pests, Don't Spread Them

By Larry Seltzer  |  Posted 2005-10-05

Opinion: Why does this idea of "good worms" stay alive? Why not just do things the right way?

The idea of a "good worm" is obviously a tempting one. You'd think that white-hatted researchers would want to be responsible with their clients' systems, but someone always wants to do it the sneaky way. Dave Aitel is a well-known researcher at Immunity Inc., a vendor of security products and services. His recent presentation on "Nematodes", basically good worms, doesn't change my mind much, although he does seem to have put some thought into the matter.

Aitel's use of the term nematode refers to "a controlled worm that can be used for beneficial purposes." The analogy, as he puts it, is to "a phylum of primitive worm-like organisms often used to get rid of other pests." He's right that nematodes are often used for such purposes, but my few minutes of Googling turned up more about the harmful effects of nematodes (countered with "nematicides"): for instance, "Nematode infestation on a potato crop results in tuber yield decline and/or reduction in quality, thereby contributing economic loss to the industry." But I don't want to get into an analogy war.

There's no arguing with much of the presentation. Organizations need to secure the systems on their networks, and it's a hard thing to do. Large networks aren't documented or organized well enough. Many tools are too slow-moving. Incidentally, Aitel is basically talking about closed, controlled networks, such as those in corporations, but at one point the presentation uses ISP networks as an example that could benefit from this approach.

Instead, Aitel envisions worms that circulate through the network, testing whether systems are vulnerable to exploits and patching them if they are. He sees slow propagation, with human verification at many steps to control the spread. He asserts that mistakes would be uncommon.
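To make the concept concrete, here is a minimal sketch of that propagation model, not Aitel's actual design: the names, the version check and the approval callback are all invented for illustration. The point is the gating, where nothing gets patched without an explicit sign-off at each step.

```python
# Hypothetical "nematode"-style controller (illustrative only).
# It walks a host list one host at a time, checks a stand-in
# vulnerability condition, and applies a fix only after the
# approval callback (a human, in Aitel's vision) says yes.

def is_vulnerable(host):
    # Stand-in check; a real tool would probe the running service.
    return host["version"] < 2

def run_nematode(hosts, approve):
    """Patch vulnerable hosts sequentially, gated by human approval."""
    patched = []
    for host in hosts:
        if is_vulnerable(host) and approve(host):
            host["version"] = 2          # apply the fix in place
            patched.append(host["name"])
    return patched

hosts = [{"name": "web1", "version": 1},
         {"name": "db1", "version": 2},
         {"name": "mail1", "version": 1}]

# Auto-approve here for brevity; the whole premise is that a
# person reviews each step before it happens.
print(run_nematode(hosts, approve=lambda h: True))  # ['web1', 'mail1']
```

A second pass over the same list patches nothing, since every host is already at the fixed version, which is the "controlled" part of the idea.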

I have one question: Why not do it the obvious, right way? Install a network management and patch management system. Such systems help you document and organize your network, and then you can block or quarantine anything that's not in the management system. These systems are expensive, but there's every reason to believe they should work, and there's a competitive market for them.
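The quarantine logic is simple enough to sketch. This is an invented example, not the interface of any real product: compare what you see on the wire against the management system's inventory, and anything unknown gets flagged for blocking rather than worm-patched.

```python
# Illustrative sketch (all names invented): flag any host observed
# on the network that is absent from the management system's
# inventory, so it can be blocked or quarantined.

managed = {"web1", "db1", "mail1"}                    # enrolled hosts
seen_on_network = {"web1", "db1", "mail1", "rogue-laptop"}

def unmanaged_hosts(seen, inventory):
    """Return hosts on the wire that the management system doesn't know."""
    return sorted(seen - inventory)

print(unmanaged_hosts(seen_on_network, managed))  # ['rogue-laptop']
```

The set difference is the whole trick: once the inventory is authoritative, anything outside it is by definition suspect.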

As I wrote about a year ago, exploiting vulnerabilities is not the way to protect against them. There the issue was a network scanner that exploited the vulnerabilities it discovered, and I'm sure Aitel would consider Nematodes to be the next generation of that idea. Actually, Aitel takes pains to say that a nematode doesn't need an exploit to run. But it does need to be trusted.

It's not like you can keep some back door open for the nematode to come along and run through. If a system is secured, there's (hopefully) no obvious way for the security guy to deploy nematode.exe to it and execute the file, unless he has sufficient rights to do so. At that point, what's the difference between nematode.exe and a management agent?

I don't see much of a difference, and I don't like the idea of the management agent spreading itself around the network, and potentially outside it, beyond the control of some sort of management system. The potential for a FrankenNematode that chews up bandwidth, locks up certain systems for reasons that are not obvious, and causes other unforeseeable problems is altogether too foreseeable.

If your network is a jungle, it's time to pull out the machete and start clearing. Nematodes are a surrender to the law of the jungle, not an attempt to take control.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983. Check out eWEEK.com's Security Center for the latest security news, reviews and analysis. And for insights on security coverage around the Web, take a look at Security Center Editor Larry Seltzer's Weblog.
Larry Seltzer has been writing software for, and English about, computers ever since (much to his own amazement) he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at Desktop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs), developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.
