Maybe there will be a year when we will be able to say, “You know, only good things happened in technology this year. There was only progress–no dirty tricks, no security breaches that could easily have been avoided, no making life harder for the end user.” Well, 2005 was not such a year. In fact, the hard part about selecting the top technology flops of the year was figuring out what to leave on the cutting-room floor. And there was one technology application that went so horribly wrong, in so many different ways, that the analysts at eWEEK Labs argued over who would get to pick it. In the end, we decided it was worthy of “special honors” in this category.
Herewith, eWEEK Labs' picks for the technology gaffes we hope never to see again, led by the most stupid technology trick of the year.
The Sony rootkit fiasco
Any of us could have seized upon the Sony BMG rootkit fiasco (aided and abetted by DRM vendors) as the stupid tech trick of the year.
It was a security gaffe, creating a major vulnerability. It was a storage screw-up, corrupting users' file systems. It was an offense against developers, misappropriating open-source code. It was an abuse of networks, covertly installing phone-home code.
A multimedia-mogul villain straddles the worlds of content and code? Oh, right.
That's the premise of 1997's James Bond movie, “Tomorrow Never Dies.” That sort of thing belongs on the movie screen, not on home or enterprise PCs.
The Sony brand name was already in trouble—it lost 16 percent of its value between 2004 and 2005, according to the annual ranking released Aug. 1 by Interbrand.
Now it has taken a body blow among tech-product opinion leaders. We've never done it before, and we hope we'll never have occasion to do it again, but, for 2005, eWEEK Labs awards a stupid tech trick grand prize to Sony.—Peter Coffee
Data dohs!
Without a doubt, the most outdated security practice in the world of storage is the shipment of unencrypted media from data centers to secure data vaults. Yet it still happens.
A rash of tape thefts during the last two years has brought unwanted attention to several companies—including Bank of America, which jeopardized the personal information of 1.2 million federal workers in a highly publicized tape-theft scandal.
It defies all logic to spend hundreds of thousands of dollars on firewalls, intrusion detection systems and other security technologies when all the information you are really trying to protect can be stolen out of a delivery vehicle or at a loading dock.
Several tape encryption systems are available that will protect backup data before it leaves the secure confines of the data center.
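The principle is simple: if the data on the cartridge is ciphertext and the key never travels with it, a stolen tape is just plastic. Below is a minimal Python sketch of that idea, assuming the third-party cryptography package; the file names and key handling are illustrative stand-ins, not a substitute for a real tape-encryption appliance or key-management system.

# Illustrative sketch only: encrypt backup data before it is written to
# removable media, so a stolen tape yields nothing readable.
from cryptography.fernet import Fernet  # assumes the "cryptography" package is installed

def encrypt_backup(plaintext_path, encrypted_path, key):
    """Read a backup file and write an encrypted copy destined for tape."""
    fernet = Fernet(key)
    with open(plaintext_path, "rb") as src:
        ciphertext = fernet.encrypt(src.read())
    with open(encrypted_path, "wb") as dst:
        dst.write(ciphertext)

if __name__ == "__main__":
    # In practice the key lives in a key-management system, far away from
    # the media; it is generated here only to keep the example runnable.
    key = Fernet.generate_key()
    encrypt_backup("customer_backup.dump", "customer_backup.dump.enc", key)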
Furthermore, a host of remote replication solutions eliminates the need for souped-up sneakernets altogether.
With so many solutions available—and with so much at stake—this is a problem I shouldn't be writing about. I hope that next year I won't be. —Henry Baltazar
Broadband boondoggle
The Bush administration—and the FCC, which does its bidding—has cited increased access to faster, cheaper broadband for the United States as a top policy priority moving forward.
That's a smart goal. However, the government has placed its love for (or perhaps debt to) incumbent monopolist providers ahead of actually broadening Internet access.
Rather than encourage competition and innovation by enforcing and building on the line-sharing requirements laid out in the Telecommunications Act of 1996, the government this year dismantled that regulation, reducing broadband choices in many communities to, at most, two: cable and DSL.
Despite the pricing mirages of bundled service deals and three-month-long introductory rates, the cost of broadband isn't dropping anywhere near as quickly as it should be, especially if we expect to encourage more Americans to log on and help expand our buying, selling and ad-clicking economy.
Here in San Francisco, where we're lucky enough that a near-fully reconstituted Ma Bell hasn't yet managed to outlaw municipal area wireless, I'm waiting impatiently to see what becomes of the citywide wireless network our mayor has proposed.
With our kindly Google overlords offering to pick up the tab (and figuring out again how to profit from free), the initiative could just as easily be next year's stupid technology trick as it could be a top product. Or it could fail to emerge from the ether at all. —Jason Brooks
Sins of omission
We've said it before, and it looks like we'll have to say it again: Applying patches for known vulnerabilities is among the most important security best practices. It's easy and often appropriate to blame vendors for security problems. But in many cases this year, IT administrators should have been pointing the finger at themselves.
The most noteworthy example came in August, when many companies failed to apply the MS05-039 patch, which fixed a Plug and Play vulnerability in Windows systems.
Several factors played a role in the eventual worm outbreaks, but a major one was the failure of some companies to patch notebook systems, which were then exposed and infected when they connected at Wi-Fi hot spots.
There's plenty of blame to go around, but administrators need to continue to assess risk in the context of the full spectrum of user behavior and available infrastructure.—Michael Caton
Google's boo-boo
Google had a great 2005, but not everything the company released was golden.
Just days after launching a beta of the Google Web Accelerator application in May, Google abruptly put the brakes on the test following a spate of privacy and security concerns.
Google Web Accelerator was an effort to speed the pace of Web browsing by using a combination of local- and server-based caching and preloading of Web pages to more quickly serve pages to a user's browser.
Google, naturally, cited the application's popularity as a reason for the beta being pulled and even posted a message on its site that said Web Accelerator had reached its “maximum capacity of users, and [we] are actively working to increase the number of users we can support.”
It was hard to swallow that lame “beta release limit” excuse. After all, Matt Hicks reported on eWEEK.com that users were concerned about security as early as Day 1. And Google officials even confirmed that they were aware that the Google Web Accelerator was returning users' cached pages under other people's user names.
Oops.
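To see how an accelerator can hand one person another's page, consider a toy model of the general caching pitfall (an illustration only, not Google's actual design): if a shared cache is keyed by URL alone, the first user's personalized response gets replayed to everyone who requests that URL afterward.

# Toy illustration of a shared cache keyed only by URL -- not Google's code.
cache = {}

def render_page(url, user):
    # Stand-in for the origin server, which personalizes every response.
    return "<html><body>Logged in as %s</body></html>" % user

def accelerated_fetch(url, user):
    # The flaw: the cache ignores who is asking, so the first visitor's
    # personalized page is served to every later visitor of the same URL.
    if url not in cache:
        cache[url] = render_page(url, user)
    return cache[url]

print(accelerated_fetch("https://example.com/account", "alice"))
print(accelerated_fetch("https://example.com/account", "bob"))  # bob sees alice's page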
Google says the issue has been fixed and has since reopened the Google Web Accelerator beta. Hopefully, Google engineers tested it themselves this time around.—Anne Chen
Gone but not forgotten
Multi-National Force-Iraq officials this May, intending to make a selective disclosure of sensitive information, found out too late that digital masking of redacted text can be readily reversed by anyone who can grasp the concepts of copy and paste.
At some point, it ought to become common knowledge that merely blacking out text with a markup tool leaves the unaltered text still present in that file.
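The failure mode is easy to demonstrate. A black rectangle drawn over text is just another object layered on top; the text object underneath survives, and any ordinary text extractor will return it. Here is a minimal sketch in Python, assuming the pdfminer.six package and a hypothetical file name.

# If the "redaction" was a drawn-over black box, ordinary text extraction
# still returns the words underneath. (Requires pdfminer.six; the file name
# is hypothetical.)
from pdfminer.high_level import extract_text

text = extract_text("redacted_report.pdf")
print(text)  # the blacked-out passages appear along with everything else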
In mid-April 2000, The New York Times tried to obscure portions of scanned documents by adding black masking rectangles. Users with slow dial-up connections actually had time to read the underlying text before those masking overlays appeared; other users with full-strength authoring tools found those masks easy to remove.
The Washington Post made the same mistake in October 2002, when readers' growing digital sophistication resulted in even more rapid discovery.
In 2003, a similar error nullified a redaction attempt at the U.S. Department of Justice; last March, unconsolidated revisions to a file embarrassed The SCO Group by revealing background information on preparation of a lawsuit.
This is not about any competitive advantage of Adobe PDF, Microsoft .doc or any other document format, since this vulnerability is common to many formats and associated tools.
This is about the need to elevate awareness of the fundamentals of digital content—before people get hurt or vital secrets are revealed or poorly informed legislators create costly and impractical digital-rights mandates.—Peter Coffee
Spy vs. … everyone?
Adware and spyware vendors are famous for playing fanciful games with end-user license agreements—from burying clauses about bundled software deep in a 5,000-word document written in a 6-point font to skipping the pesky document altogether.
But RetroCoder upped the ante by explicitly stating in its use policy that anti-spyware researchers are forbidden to use or examine its SpyMon surveillance product.
The tactic isn't exactly new, but RetroCoder took it a step further.
The SpyMon use policy states, “The owner of the copyright expressly forbids any use, disassembly, examination and/or modification by anyone who works for or has any relationship or link to an AntiSpy or AntiVirus software house or related company.”
I infer this to mean that anyone who purchases or uses anti-spyware software (that is a relationship, after all) or even knows someone who works at such a company can't install or use SpyMon.
Never mind the dubious legal standing of such a claim; the possibilities for RetroCoder to incriminate its own customers are almost endless.—Andrew Garcia
Storm clouds over FEMA
While technology blunders can be amusing, they are rarely disastrous in the larger scheme of things.
But there was a blunder this year that left people in the most dire of situations, unable to request the aid they needed.
During a year that saw some of the most horrific storms in U.S. history—most notably, Hurricane Katrina—many people who needed help from the government agency tasked with providing that help were unable to get it, simply because they were using the wrong Web browser.
For most of this year, the online form used to request assistance from FEMA could be accessed only with Internet Explorer—meaning that storm victims who were using a Macintosh or a Linux-based system were unable to use the Web to ask for assistance.
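We don't know exactly how FEMA's form enforced the restriction, but browser lock-ins like this usually come down to brittle user-agent sniffing along these hypothetical lines, which turns away any browser that doesn't announce itself as Internet Explorer instead of testing for the features the form actually needs.

# Hypothetical illustration of the anti-pattern -- not FEMA's actual code.
def browser_allowed(user_agent):
    # Anything that doesn't claim to be Internet Explorer is rejected,
    # including Safari on a Mac or Firefox on Linux.
    return "MSIE" in user_agent

print(browser_allowed("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))           # True
print(browser_allowed("Mozilla/5.0 (X11; Linux i686) Gecko/20050920 Firefox/1.0.7"))   # False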
To FEMA's credit, the form has now been fixed to work with multiple Web browsers, but this was one technology blunder that never should have happened.—Jim Rapoza
Fishing for phishers
Popular fraud targets (insert the name of your favorite bank or payment service here) that make good-Samaritan reporting of phishing scams difficult—as measured in the number of minutes and steps needed to successfully report a suspected fraudster—should revamp reporting tools tout de suite.
At the end of the day, phishing targets must understand that the Internet technology that makes their businesses so useful and, presumably, profitable also makes them very big targets.
To protect themselves from future injury, phishing targets need to consider more and better ways to protect their reputations.
In part, that means making it easy for good Samaritans to report suspected fraud with as little muss and fuss as possible. —Cameron Sturdevant