Throw Away the Internet; Start All Over

By Larry Seltzer  |  Posted 2003-04-21

The original designers of the Internet had no reason to consider security issues, writes Security Supersite Editor Larry Seltzer—so they didn't. Could we throw away the old Net and do it right?

Sometimes I look at the Internet and see so many different ways being used to compromise security that I wonder whether we'd be better off trashing a lot of the existing infrastructure. After all, the Internet was designed to be secure from nuclear attack, not from its own users. The whole idea of network security probably never occurred to the designers of the Internet and the main applications that run on it.

In my mind, the biggest failure in this regard is SMTP, the dominant mail protocol of the Net. Spam is as pervasive as it is because of weaknesses in SMTP. We know how to fix these problems; the trouble is that doing so would break existing applications, which in this case means e-mail in general. That is always a bad thing, but it's not always a deal-killer. I think this is one area where, in the long term, it may make sense to move away from a protocol that has allowed e-mail to get out of control.

I asked a few people involved in solving the problems of e-mail what would be involved in fixing it. This put them in an awkward position; after all, spam-filtering vendors and other security companies make their living because these problems exist. But I think these problems are likely to get worse before they get better, and real solutions are something for our children more than for us. You'll be able to make a decent living in the security industry for a long time.

Tonny Yu, founder and CEO of Mailshell, says that any new and better replacement for SMTP would have to have some sort of certification system to guarantee that senders are who they say they are. The obvious candidates would be certificate services like Verisign, but if demand shot up perhaps there would be more competition. Mail servers would also have to be certified, or mail sent to them would not be trustworthy.
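To make Yu's idea concrete, here is a minimal sketch of certificate-based sender identity, simulated with an HMAC standing in for a real certificate authority's signature. Everything here (the key, the function names, the addresses) is invented for illustration; a real system would use X.509 certificates and public-key signatures, not a shared secret.

```python
import hashlib
import hmac

# Hypothetical sketch: a certification service "signs" a sender's address,
# and a receiving server verifies that signature before trusting the sender.
# CA_SECRET, certify(), and verify_sender() are illustrative inventions.

CA_SECRET = b"demo-only-certification-key"  # held by the certifying authority

def certify(sender: str) -> str:
    """Issue a 'certificate' (an HMAC tag) binding the sender address."""
    return hmac.new(CA_SECRET, sender.encode(), hashlib.sha256).hexdigest()

def verify_sender(sender: str, cert: str) -> bool:
    """A receiving server checks the certificate against the claimed sender."""
    expected = hmac.new(CA_SECRET, sender.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cert)

cert = certify("alice@example.com")
assert verify_sender("alice@example.com", cert)      # genuine sender passes
assert not verify_sender("spoof@example.com", cert)  # spoofed sender fails
```

The key property is the second assertion: a certificate issued for one address cannot be reused to vouch for another, which is exactly what the forgeable From: header lacks.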

The other important requirement, according to Yu, is a system for tracking resource usage per sender. Basically, this means establishing profiles for normal volumes of mail from different types of users. If you limited normal users to 100 messages per second and major companies to 10,000 messages per second, it would be hard for legitimate users to complain, but spamming would be much harder.
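A per-sender volume profile of this kind boils down to a counter with a cap. The sketch below simplifies the column's per-second limits to a single counting window; the profile names, limits, and `RateTracker` class are all invented for the example.

```python
# Illustrative sketch of Yu's per-sender usage profiles: each class of
# sender gets a message cap, and sends beyond the cap are refused.
# PROFILES and RateTracker are inventions for this example, not a spec.

PROFILES = {
    "individual": 100,   # messages allowed per tracking window
    "company": 10_000,
}

class RateTracker:
    def __init__(self):
        self.counts = {}  # sender address -> messages sent this window

    def allow(self, sender: str, profile: str) -> bool:
        """Record a send attempt; refuse once the profile's cap is reached."""
        n = self.counts.get(sender, 0)
        if n >= PROFILES[profile]:
            return False
        self.counts[sender] = n + 1
        return True

tracker = RateTracker()
results = [tracker.allow("alice@example.com", "individual") for _ in range(101)]
assert results[:100] == [True] * 100  # normal volume goes through
assert results[100] is False          # the 101st message is refused
```

A real deployment would reset or decay the counters over time (a token bucket is the usual shape), but the enforcement decision is this simple once senders are reliably identified.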

Once these systems were in place, and assuming they were implemented well, it would be simple to build tools to filter out mail that was uncertified or abusive in volume, and even to blacklist users and servers that facilitate it. Conversely, whitelisting would become easier, because you could whitelist users based on their certificates rather than on a From: address that is easily spoofed.
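The whitelisting half of that idea can be sketched in a few lines: key the whitelist on a fingerprint of the sender's certificate instead of the From: header. The `fingerprint` helper, the message dictionary shape, and the placeholder certificate string are all assumptions made for this example.

```python
import hashlib

# Illustrative sketch: whitelist on the cryptographic identity (a
# certificate fingerprint), not the forgeable From: line. The helper
# names and the fake certificate text are inventions for this example.

def fingerprint(cert_pem: str) -> str:
    return hashlib.sha256(cert_pem.encode()).hexdigest()

WHITELIST = {fingerprint("---ALICE CERT---")}

def accept(message: dict) -> bool:
    """Accept only mail carrying a whitelisted certificate."""
    cert = message.get("sender_cert")
    return cert is not None and fingerprint(cert) in WHITELIST

# A spoofed From: with no certificate is rejected...
assert not accept({"from": "alice@example.com", "sender_cert": None})
# ...while the certified sender is accepted whatever the From: line says.
assert accept({"from": "odd@example.com", "sender_cert": "---ALICE CERT---"})
```

The point of the two assertions is the inversion of trust: the display address becomes irrelevant to the accept/reject decision.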

You can't just order everyone to adopt a new system and throw the switch. Over some period of time, I think there would have to be SMTP gateways into the new system. It's fair to say that mail from those gateways should be treated as less trustworthy than mail originating within the new network, so it could be subjected to extra scrutiny for forged headers and the like. While any real effort at this would take a long time, I would hope that if a new network could demonstrate itself to be immune to enough significant problems, it would attract new users.

It's entirely possible that if this were done right, it would increase the costs of e-mail. But up to a point, that's just fine with me. Dirt-cheap e-mail is one of the things that made spam so appealing to marketers. I'd actually be glad if it were more expensive to send e-mail than to receive it. The cost increase would be trivial for normal users, but potentially crushing for spammers (and perhaps for "legitimate" direct marketers; c'est la vie). In the longer term, it would lower other costs, especially if it reduced spam significantly. Think of the diminished traffic load. I think it's also fair to say that it would tend to reduce the volume of Internet worms and viruses, because true authentication would make it easier to identify those who are infected and spreading such malware, much of which comes with its own embedded SMTP server.

Strictly speaking, certification means an end to anonymity in e-mail. Of course, e-mail was never really supposed to be anonymous, and real e-mail anonymity is only possible if you forge headers and your mail-server admin doesn't care. Speaking of not caring, I don't care about the anonymity problem. It's not the only problem out there, and it doesn't completely trump others, like anonymous pornographers e-mailing our kids.

The designers of Internet2, an academically based effort to develop and promote advanced networking applications, might have been the ones to take on such matters, but they have their sights set elsewhere. Perhaps it's time for someone to start Internet 1.5. (I'd go out and reserve the name myself if it were a legal one.) Everyone knows it's the .5 version that gets it right.

Security Supersite Editor Larry Seltzer has worked in and written about the computer industry since 1983.
Larry Seltzer has been writing software for, and English about, computers ever since (much to his own amazement) he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at DeskTop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs), developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.