A Brand New Internet?

By Larry Seltzer  |  Posted 2005-07-01
Opinion: Surely it's impossible at this point and it's better just to fix the broken parts, right? Maybe not.

The Internet began as an experiment. Did it succeed? Obviously, in many ways it did, but it also failed in important ways. So what do we do about that? I've thought in the past about the idea of starting all over with the Internet, although I was thinking mostly of SMTP rather than the whole network. It's still useful to think about replacing individual protocols and applications, but some problems are more fundamental than that.

Now reports indicate that one of the leading researchers who built the basics of the Internet wants to start over again and build a new one, getting it right this time. Security is a big part of the equation.

Instead of looking at this-and-that protocol, David Clark wants to look at everything.

Boy, is this tempting. One of the chief criticisms of such a plan, as Robert Kahn, another early Internet architect, puts it, is that all of the problems Clark wants to fix can be fixed incrementally rather than by building a whole new network.

In fact, I think Kahn misses an important point: You can't fix things incrementally on the Internet anymore.
Several recent examples indicate that vested interests can stop just about any effort to reform Internet protocols and standards in order to make them more secure. It may actually be easier to build a new network, informed by our sad experiences with the current one, and then see if it can compete in the market on merit.

I'm serious about the resistance to change. As I noted in my column about throwing away SMTP, applications have developed over the years to exploit many of the aspects of this and other Internet protocols that need to be changed, and thus change damages their interests. Unfortunately, the most common interest threatened is the ability to do whatever you want on "your Internet connection." To me, this sounds like wanting to drive however you want on "your road." A new, more secure Internet would be far more restrictive.

Another example of a change that would benefit everyone, but won't be happening any time soon, is DNSSEC, a secure version of DNS in which all responses from the server are digitally signed, eliminating a significant class of attacks.
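To make the mechanism concrete, here is a minimal Python sketch of the verification idea behind DNSSEC: the resolver checks a signature over each answer before trusting it. Real DNSSEC uses public-key signatures (RRSIG records validated against DNSKEY records in a chain of trust); for a self-contained stdlib example, an HMAC over a hypothetical shared zone key stands in for the asymmetric signature, so this illustrates the principle, not the actual protocol.

```python
import hashlib
import hmac

# Hypothetical zone-signing key, for illustration only. In real DNSSEC the
# signer holds a private key and resolvers validate with the public key.
ZONE_KEY = b"example-zone-signing-key"

def sign_record(name: str, rtype: str, rdata: str) -> str:
    """Produce a signature over a resource record, as the zone signer would."""
    msg = f"{name}|{rtype}|{rdata}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def verify_record(name: str, rtype: str, rdata: str, sig: str) -> bool:
    """Resolver-side check: reject any answer whose signature doesn't match."""
    expected = sign_record(name, rtype, rdata)
    return hmac.compare_digest(expected, sig)

# A legitimate, signed answer verifies...
sig = sign_record("www.example.com", "A", "192.0.2.10")
print(verify_record("www.example.com", "A", "192.0.2.10", sig))   # True

# ...but a spoofed answer, as in a cache-poisoning attack, fails the check.
print(verify_record("www.example.com", "A", "203.0.113.66", sig)) # False
```

The point is that a forged answer cannot be substituted without the signing key, which is exactly the class of attack signed responses eliminate.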

But if someone with a whole lot of money and patience were to build a new Internet, there would be no interests to threaten. Actually, there could be, depending on who does the design and what their priorities are. But I'd at least like to think that designers could be more forward-looking and less parochial in designing a new network than in protecting their position on the old one. Am I naive?

It would be tempting to make the new network coexist on top of the old one, but I think this, too, is a mistake. I can see it working well on the same physical network as the Internet, but I think it's worth avoiding TCP/IP just so it's that much harder for outside attacks to make their way onto the new network. Perhaps IPv6 is a good base for it, I couldn't say, but breaking with IPv4 is a good way to break from a lot of the old problems.

So why would anyone use it? Once it was established with some proof-of-concept sites on it, I'm sure some large companies would create a presence on the network, if only to appear forward-looking. Companies like Sun and Microsoft would want to sell tools and software for it and would want to show their own products off on it. Some ISPs could provide access to it along with their Internet v.1 access. I know I'm blue-skying here, but I could see it working out. What I can't see happening is substantial change to the existing Internet.

Some security problems are difficult to engineer around even if you start from scratch, including most that revolve around social engineering. Fundamentally, if people have rights to do something and can be tricked into doing it, it's hard for software to stop them. But there are improvements that can be made. It's not hard to imagine technical changes in e-mail and HTML that would make phishing attacks harder, if not impossible. It's not hard to imagine authentication methods that would make it easier to detect a Web site that is not what it appears to be.

I am discouraged, though, when I see, with Clark's proposal and with Internet2, a focus on new high-end applications and performance. Maybe I'm the one who's being parochial now, but I don't think the Internet is suffering for lack of performance. It's suffering for lack of security, and I'd happily consider a new network that addressed security at a core level.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983. He can be reached at larryseltzer@ziffdavis.com. Check out eWEEK.com's Security Center for the latest security news, reviews and analysis. And for insights on security coverage around the Web, take a look at eWEEK.com Security Center Editor Larry Seltzer's Weblog.
 
 
 
 
Larry Seltzer has been writing software for, and English about, computers ever since, much to his own amazement, he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at DeskTop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs) developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.
 
 
 
 
 
 
 
