Jan. 1 was the five-year anniversary of the Y2K “event.” It seems so long ago and so supplanted by more recent events as to be irrelevant. But it's not. Y2K taught us lessons that will always be applicable: Don't believe everything the experts tell you, and be especially skeptical of worst-case predictions for technology.
This particular set of predictions had a bullying effect. With so much purportedly at stake, how could you not prepare for the worst? But there's always a remote possibility of something horrible happening. If we took every such possibility seriously, we'd spend all our days in our bunkers surrounded by bottled water, dried food and guns.
Why was there no disaster? It's really simple: Y2K wasn't anywhere near as much of a problem as many experts suggested. Looking back at the scale of the exaggeration, I have to think that there was an awful lot of lying going on. The motivation—mostly consulting fees—was all too obvious. But there were also a lot of experienced people with no financial interest who deeply believed in “the problem.” Take, for example, the reader quote in this excellent piece in USA Today from early 1999.
The reader assures the author—and I heard no end of this myself when I wrote about it at the time—that there was no way the problem would be addressed sufficiently in time. And that was just in the U.S.A. and our great Western allies. Of course, the problem in Uruguay and Nigeria and India and other such places was dire, and they hadn't spent the great sums we had on remediation, so unmitigated chaos was a sure thing in the third world.
Of course, we all know now that no such thing happened. I actually did see one small Y2K problem in a custom fundraising program that I consulted on, but it didn't stop the program from basically working, and this report was typical of what we heard after the ball dropped that night.
One of the most famous Y2K alarmists was Gary North, who at least has kept all his Y2K work online for people to see. (Go to his home page to see what business he's in these days.) North was recognized even back in the '90s as a “Y2Krackpot,” but he still got on the news.
There were more famous and credible people who argued that TEOTWAWKI (The End Of The World As We Know It) was coming. Consider Ed Yourdon, whose book “Time Bomb 2000” popularized the notion that all our systems were interconnected, and so if this one had a 10 percent chance of failing and that one had a 15 percent chance of failing (all exaggerated numbers), the failure percentages were cumulative and widespread failures were inevitable. Yourdon has removed all the Y2K materials from his own site, telling us that Y2K is simply over: “Now that Y2K has come and gone, we've removed the material that was previously posted here about our Time Bomb 2000 book.” Bear all this in mind if you're thinking of consulting Yourdon's expertise on other matters.
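For what it's worth, even the arithmetic behind that “cumulative” framing was loose. A minimal sketch, using the article's illustrative 10 and 15 percent figures and assuming independent failures (both numbers and the independence assumption are hypothetical, not Yourdon's actual model):

```python
def p_any_failure(probs):
    """Probability that at least one system fails,
    assuming independent failures: 1 - product of survival probabilities."""
    p_all_survive = 1.0
    for p in probs:
        p_all_survive *= (1.0 - p)
    return 1.0 - p_all_survive

# Two systems at 10% and 15%: naively "cumulative" would suggest 25%,
# but the independent-failure probability is slightly lower.
print(round(p_any_failure([0.10, 0.15]), 3))  # 0.235
```

Simply summing the percentages overstates the risk, and real systems with redundancy fare better still, which is the point the doomsayers missed.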
It was only in the wake of Jan. 1, 2000, when we all saw the lights still on, the planes still aloft and the computers still running, that the rationalization went into high gear. The Y2K scare motivated people to improve their emergency preparedness. It awakened healthy skepticism in institutions like banks, governments and the computer industry. It created a new level of scrutiny in the development of software. (That last one is especially funny.) I've seen people claim that it was Richard Clarke and his Y2K coordination work in the NSC that saved the world. It was clear that the emperor had no clothes, so the only thing left was to praise nudity.
You certainly couldn't justify all the money spent on the straightforward merits. Did all that remediation actually solve serious Y2K problems? If it did, why didn't the problems manifest themselves in serious ways in the unremediated third world? Either the problems were less frequent or less malignant than advertised. In either case, some famous consultants misled their clients.
What we all should have argued for at the time was perspective: A focus on worst-case planning is usually unwarranted. Those who argue for it hysterically and who belittle those who oppose them aren't serving anyone's interests well. Ironically, Y2K also taught us the exact opposite of the lesson the extremists were pushing: Far from being vulnerable because of their interconnectedness, our systems are robust because of their redundancy and interconnectedness.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.