A couple of years ago I wrote an April Fools' news story about how the Internet was going to be taken down over the weekend for maintenance. Wouldn't it be nice if things like that were possible?
Of course it was just a joke, and that's how we should treat the notion of building a new Internet to replace the existing one. This is one of those ideas that comes up periodically and always seems appealing. We're coming up on the fourth anniversary of the first time I proposed rebuilding the Internet. Now, once again, “scrapping the Internet” is a fashionable research idea in pursuit of funding.
It was a naive idea when I brought it up, and it's still naive. Even then I could see the problems that would result, and I'm a lot less optimistic now about solving big problems than I was then.
The people at Stanford who have set up the Clean Slate Design for the Internet at least recognize one important point: the network itself is not the extent of the problem, and “heterogeneous applications” are a major part of it. This means they will rethink things like HTTP, SMTP, audio and video.
They could work for many years building something truly great, and I suspect it won't matter. There are two basic problems with this approach, especially with pursuing it in an academic environment.
First, there are a lot of people and big organizations out there using these applications, and they don't want their investments of money and time threatened.
As the articles linked to above point out, any grand rethinking of the Internet that gains any momentum will bring in stakeholders with important and mutually incompatible interests.
Users will have an interest in compatibility with existing systems. Software and hardware companies will have an interest in forcing upgrades. Privacy advocates will want to influence every step of the way; law enforcement all over the world will work in opposition to them; and both sides will have valid points.
This brings us to the potential problems with academic groups: such groups may be insulated from outside pressures, but that insulation just means the stakeholders will be unhappy with the result. Good luck picking the right group to satisfy.
Second, private industry and mainstream standards bodies won't be sitting still. I do think the Internet really has gotten better and more secure over the years; it's just that the number of users and the sophistication of the attackers have increased dramatically. Private industry, open-source efforts and standards groups have all worked to improve things over that time, and it's still possible for a change to be made if the conditions are right.
IPSec is a good example. It was originally designed as part of IPv6, a spec that has a lot of weight behind it and yet is still going nowhere, except perhaps in China. But IPSec solved a specific problem that needed solving, and having a standard was desirable, so it became the standard for VPNs on IPv4, at least in many scenarios.
Even so, some applications, broken as they may be, are intractable. SMTP is the best example. Multiple large companies and standards bodies have attempted to fix a system that is conspicuously broken, but the investment in the existing system is so immense that stakeholders veto any meaningful change.
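To make the brokenness concrete: the SMTP protocol itself does nothing to verify that a message's claimed sender is genuine, which is a big part of why forged mail and spam are so hard to stamp out. Here is a minimal Python sketch of the problem; the server address (localhost, port 25) is a hypothetical test setup, and any real mail server may well refuse or flag such a message.

    # SMTP does not authenticate the sender: the client simply asserts
    # a From address and the protocol takes its word for it.
    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "ceo@example.com"    # forged; nothing in SMTP checks this
    msg["To"] = "victim@example.org"
    msg["Subject"] = "Forged-sender demonstration"
    msg.set_content("SMTP relays whatever sender the client claims.")

    # Hypothetical test server; real servers may reject unauthenticated relay.
    with smtplib.SMTP("localhost", 25) as server:
        server.send_message(msg)

Any protocol-level fix for this would break every existing mail client and server at once, which is exactly the kind of change stakeholders veto.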
And a ground-up rethink of the Internet still won't fix application-layer bugs in most applications. How could it possibly stop, for example, the latest Skype worm?
Changes to SMTP will come only in slow motion, over a long period of time, so that users aren't substantially inconvenienced by them. Only in this way can changes to the major Internet applications be effected.
So if there's any future for research such as this, it's going to have to be as “live” upgrades to the existing Internet. Put them on some parallel network beloved by snooty technical overlords, and they will go unused. Let's hope the directors of New Internet research have conservative instincts.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.