Some Windows systems can stay up and running for years. But Security Center Editor Larry Seltzer wonders if it's really such a good idea for a server to stay up so long.
Conventional wisdom would have it that a Windows system needs to be rebooted every time you play a tough game, every time the kids hit a new Web site, and every time Law and Order is on the tube. It's really an exaggerated reputation.
I know of Windows servers, especially Windows 2000 servers, that have been up for months and even years. When you have a server that performs a small number of tasks and you don't mess with it (for example, forgoing installations of random shareware), it's not unreasonable to think it will go on forever. Most of the long-running servers I hear about are backup domain controllers (BDCs in NT-speak) or print servers or simple file servers. At the same time, I know of an Internet cafe that uses a Windows NT4 server as the main domain controller, print server and file server, and the thing never goes down.
Of course, one of the most common reasons to have to reboot a Windows system is to install a security patch. So if you find a Windows system that has been up and running for years, you have to wonder how many gaping, famous, unpatched security holes are sitting there on the system.
Baltimore Technologies is a security company, long a player in the enterprise public key infrastructure (PKI) business. One would think that a security company would be conscious enough of security issues to apply a patch every now and then.
Not only does it appear that Baltimore Technologies hasn't applied these patches (quite a few of them do require reboots), but it appears that they have gotten away with it. As the Netcraft article says, in two years even a not-so-famous company like this would have had a serious amount of traffic on its Web site. And they're still running. I suppose it's possible that the site was hacked and Baltimore doesn't know it, but that's very doubtful.
When I asked Netcraft about it, I think they shared my sense of incredulity at the situation. Still, they pointed out that not every function of Windows and IIS is open to attack. If a server doesn't use any of the usually affected technologies (offering WebDAV services, for example, or running SQL Server, or leaving Port 35 open to the Internet), there's no really critical hole through which a two-year-old installation of Windows 2000 Server could be attacked. Baltimore's site appears to be a testament to Microsoft's work to remove memory leaks from the basic Windows kernel and IIS.
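The exposure Netcraft describes can be checked empirically. Below is a minimal sketch of such a check: it probes a handful of ports historically associated with remotely exploitable Windows services and reports which ones accept connections. The port list and the localhost target are my own illustrative assumptions, not anything from Netcraft; run it only against hosts you are authorized to scan.

```python
import socket

# Illustrative list of ports tied to frequently attacked Windows-era
# services; this selection is an assumption, not Netcraft's list.
PORTS = {135: "MSRPC", 139: "NetBIOS", 445: "SMB", 1433: "SQL Server"}

def open_ports(host: str, timeout: float = 1.0) -> list:
    """Return the ports from PORTS that accept a TCP connection on host."""
    found = []
    for port in PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            try:
                # connect_ex returns 0 on a successful TCP handshake.
                if s.connect_ex((host, port)) == 0:
                    found.append(port)
            except OSError:
                pass
    return found

if __name__ == "__main__":
    for p in open_ports("127.0.0.1"):
        print("%d (%s) is open" % (p, PORTS[p]))
```

A server that answers on none of these is, in rough terms, the kind of box Netcraft was describing: one whose unpatched holes simply aren't reachable from the outside.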
Baltimore Technologies agreed, according to the Netcraft article. "The Web site's reliability is enabled by a stable power source, good physical security, a webmaster who cooperates with the networks team and a proper screening firewall," said Keith O'Byrne, a network engineer with Baltimore Technologies. So it's not just Windows 2000, but the good work of Baltimore staff that's responsible.
Stories like this make you wonder. Is it worth applying security patches as a matter of course, or better to scrutinize each patch to make sure that it's relevant to a function you're actually running? Both approaches could be called "conservative," because examining patches follows the "don't fix it if it ain't broke" maxim. But my own preference is "better safe than sorry," also a conservative track.
Yet a better question to ask would be: "Who cares about uptime, per se?"
As long as the server isn't being rebooted because of a memory leak or because some malfunctioning app refuses to quit, a reboot is just a minor hassle. Most sites can afford to be down, or serve content from external caches, for the few minutes it takes to reboot. Besides, if a site is so busy that it can't afford such downtime, it probably needs some redundancy anyway.
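For the record, the uptime figures at issue are easy to read locally. Here is a small sketch, assuming either a Windows box (milliseconds since boot via GetTickCount64) or a Linux one (first field of /proc/uptime); it is an illustration, not how Netcraft collects its data, which relies on remote TCP timestamp sampling.

```python
import ctypes
import sys

def uptime_seconds() -> float:
    """Return seconds since the system last booted."""
    if sys.platform == "win32":
        # GetTickCount64 reports milliseconds since boot.
        return ctypes.windll.kernel32.GetTickCount64() / 1000.0
    # On Linux, the first field of /proc/uptime is seconds since boot.
    with open("/proc/uptime") as f:
        return float(f.read().split()[0])

if __name__ == "__main__":
    days, rem = divmod(uptime_seconds(), 86400)
    print("Up %d days, %.1f hours" % (days, rem / 3600))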
So it's best to skip Baltimore Technologies as an example of usual practice. For most of us, running a two-year-old Windows isn't worth the risk.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.
Larry Seltzer has been writing software for and English about computers ever since, much to his own amazement, he graduated from the University of Pennsylvania in 1983.
He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at DeskTop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.
For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs) developing product tests and managing contract testing for the computer industry, governments and publications.
In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.
Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.