Air Time: The Internet as a Legacy System

The private sector alone is not capable of addressing the Internet's architectural flaws. Government and quasi-governmental organizations must play a bigger role.

Dave Molta

July 29, 2005


Global Legacy

Today's global Internet is the world's most problematic legacy system. It has been around for a generation, and it surely belongs on the shortlist of the most significant innovations in human history. But its architecture is woefully obsolete, and perhaps even dangerous.

The roots of the Internet and TCP/IP date to the early days of packet switching, when circuit-switched and hierarchical networks were considered best practice. Remember that it was a relatively obscure Boston company called Bolt Beranek and Newman, rather than a major computer or telecom industry player, that won the Department of Defense bid. Legend suggests that the brightest minds at AT&T and IBM concluded that packet switching would never work. The Internet pioneers may have proved them wrong, but their underlying system design has required frequent tactical fixes--"architectural barnacles," as some critics call them, unsightly outcroppings that address short-term problems while degrading other key system attributes.

Two notable deficiencies are security and mobility. The Internet's trusted-system architecture makes it an easy target for hackers, and potentially for cyber-terrorists. On the mobility front, an IP addressing design that seemed adequate in the 1970s won't meet the needs of millions, perhaps billions, of wireless devices. Fixes like SNMPv3, IPsec and Mobile IP have short-term appeal, but they really are just ugly hacks. As the complexity of the underlying problems increases, the potential for incremental solutions diminishes.
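The arithmetic behind the mobility concern is easy to sketch. The following back-of-the-envelope calculation is mine, not the column's; the reserved-block sizes reflect well-known IPv4 allocations (RFC 1918 private ranges, multicast, the Class E reserved block and loopback), and the device count is a hypothetical round number chosen only for scale.

```python
# Rough illustration of why a 32-bit address space strains under
# billions of networked devices. Figures are order-of-magnitude only.

IPV4_BITS = 32
total_ipv4 = 2 ** IPV4_BITS  # 4,294,967,296 addresses

# Large blocks never available for globally unique public hosts:
reserved = (
    2 ** 24      # 10.0.0.0/8       private (RFC 1918)
    + 2 ** 20    # 172.16.0.0/12    private (RFC 1918)
    + 2 ** 16    # 192.168.0.0/16   private (RFC 1918)
    + 2 ** 28    # 224.0.0.0/4      multicast
    + 2 ** 28    # 240.0.0.0/4      reserved ("Class E")
    + 2 ** 24    # 127.0.0.0/8      loopback
)

usable = total_ipv4 - reserved
print(f"Total IPv4 addresses: {total_ipv4:,}")
print(f"Roughly usable:       {usable:,}")

# A hypothetical projection of roughly one device per person:
projected_devices = 6 * 10 ** 9
print(f"Shortfall at {projected_devices:,} devices: "
      f"{projected_devices - usable:,}")
```

Even before subnetting inefficiencies and allocation waste are counted, the usable pool falls well short of one address per person, which is why stopgaps like NAT and Mobile IP emerged.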

No Simple Solutions

The private sector alone is not capable of addressing the Internet's architectural flaws. Government and quasi-governmental organizations must play a bigger role. With the DoD more concerned with weapons systems than data networks, responsibility in the United States falls to the National Science Foundation, which funds the basic research needed to address these monumental challenges.

Recently, the academic research community has been pressuring the NSF to step up to those challenges. Although the NSF's funding for Internet2 has led to some interesting work, that network's status as a "production test bed" has made it, for all practical purposes, a parallel Internet running the same limited protocols. Researchers at MIT, Princeton, the University of California and the University of Washington are pressing for a more radical approach, one that would overcome what Princeton computer scientist Larry Peterson calls an ossified and unalterable status quo. They've also challenged their colleagues to come to grips with what they call the "impact imperative": in essence, any new architecture must be deployable in the context of today's global Internet.

Although the complexity of this undertaking is immense, the stakes are clear: meeting the needs of the next generation of Internet users and sustaining the technical innovation that fuels societal progress.

Dave Molta is Network Computing's senior technology editor. Write to him at [email protected]
