% fortune -ae paul murphy

On re-inventing the internet

Last week a story by Anick Jesdanun made the rounds of, according to Google, 211 nominally separate news sources under the general title Reinventing the Internet. The opening paragraphs, quoted here from the Kansas City Star:

Although it has taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap it and start over.

The idea may seem unthinkable, even absurd, but many contend that a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

The Internet "works well in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."

No longer constrained by slow connections and computer processors and high costs for storage, researchers said the time had come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

The story contains four pointers to additional internet-based information sources:

  1. More information on rebuilding the Internet: http://cleanslate.stanford.edu
  2. Carnegie Mellon program: http://100x100network.org
  3. Rutgers program: http://orbit-lab.org
  4. National Science Foundation GENI: http://geni.net

The Stanford site offers a white paper which, among other things, suggests that the Internet's current peer-to-peer communications model be replaced with a more hierarchical system - meaning, in practice, that someone gets to control choke points on the network, much the way PC LANs still often use proxy servers to connect to the internet, but expanded across the entire system.
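
To make the choke-point idea concrete, here's a minimal sketch - in Python, with invented addresses, owing nothing to the Stanford paper itself - of the kind of relay such a hierarchy implies: every client connection has to traverse one process that can log, filter, or refuse traffic.

    # Minimal sketch of a network "choke point": a TCP relay that all
    # client traffic must traverse. Addresses are illustrative only.
    import socket
    import threading

    LISTEN_ADDR = ("0.0.0.0", 8080)      # where local clients connect
    UPSTREAM_ADDR = ("example.com", 80)  # the network beyond the choke point

    def pipe(src, dst):
        # Copy bytes one way until either side closes.
        try:
            while data := src.recv(4096):
                dst.sendall(data)
        except OSError:
            pass
        finally:
            src.close()
            dst.close()

    def handle(client):
        # The relay could log, filter, or refuse the connection here -
        # that is the control point a hierarchical model creates.
        upstream = socket.create_connection(UPSTREAM_ADDR)
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(LISTEN_ADDR)
    server.listen()
    while True:
        conn, _ = server.accept()
        handle(conn)

The point isn't the two dozen lines of Python; it's that once everything flows through the relay, whoever runs it decides what gets through.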

The Carnegie Mellon site has little of interest to others: an FAQ consisting of one question (why 100Mb/s?), pointers to the funding proposal and the research papers they've reviewed, and a bunch of more or less related material including a project web link inaccessible to outsiders.

The Orbit site is very interesting. Here's the "bullet" from their FAQ:

The ORBIT radio grid emulator is an indoor wireless network testbed. It supports experimental research on a broad range of wireless networking issues and application concepts with various network topologies and network layer protocol options. It also supports virtual mobility for mobile network protocol and application research. The ORBIT radio grid emulator currently uses 802.11a/b/g based radio cards.

And the NSF site is pretty much what any funding authority puts up for any semi-public project: both open and superbly uninformative. Here's a particularly interesting example of their text:

Communications systems such as the Internet and the telephone system (which is morphing into the Internet) are perhaps the largest and most complex distributed systems ever built. The degrees of interconnection and interaction, the fine-grain timing of these interactions, the decentralised control, and the lack of trust among the parts raise fundamental questions about stability and predictability of behaviour. There is beginning to emerge some relevant theories of highly distributed complex systems, some of which have roots in control theory and some of which draw on analogies with biological systems. We should take advantage of this work in this redesign, to improve our chances that we come as close as possible to the best levels of availability and resilience. There may be other important contributions from the theory community, for example, the use of game theory to explore issues of incentives in design of protocols for interconnection among competing Internet Service Providers. This is a chance for CISE to engage members of the theory community in this program.
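
That game theory aside is the most concrete thing in the passage, and it's easy to illustrate. Here's a toy payoff table - the numbers are invented for illustration, not taken from any NSF work - for two competing ISPs deciding whether to carry each other's traffic honestly ("peer") or dump it cheaply on the other ("free-ride"):

    # Toy payoff table for the NSF text's game-theory point: two competing
    # ISPs choose whether to carry each other's traffic ("peer") or dump
    # it cheaply on the other ("free-ride"). Payoffs are invented numbers.
    PAYOFFS = {
        # (ISP A's move, ISP B's move): (A's payoff, B's payoff)
        ("peer",      "peer"):      (3, 3),  # both invest, both benefit
        ("peer",      "free-ride"): (0, 5),  # A pays, B exploits
        ("free-ride", "peer"):      (5, 0),
        ("free-ride", "free-ride"): (1, 1),  # the network degrades for both
    }

    for b_move in ("peer", "free-ride"):
        best = max(("peer", "free-ride"), key=lambda a: PAYOFFS[(a, b_move)][0])
        print(f"If B plays {b_move!r}, A's best response is {best!r}")

Both lines print "free-ride": defection dominates whatever the other ISP does, so without incentives built into the interconnection protocols the stable outcome is the worst one - which is presumably why the NSF wants the theory community involved.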

My review of this stuff is colored by an assumption about the way research and large-scale agency funding interact in the real world of academic politics. Specifically, I believe that, in most cases, the monies go to funding managers and others with established reputations but no contributions to make beyond doing more of whatever they did to get established.

When push comes to shove, i.e. when the research papers come due, the work gets turned over to graduate students and untenured faculty, some of whom then make real breakthroughs.

Quite often those breakthroughs seem antithetical to the promises made in the funding proposals and therefore develop in opposition to the nominal purpose of the research - creating apparent conflicts that last until well after project sign-off, but ultimately form the basis for new directions in both practice and theory.

In this case the monies seem premised on the idea that improving the internet requires three things: bigger pipes, more content convergence, and tighter controls - including end-to-end cryptography and an end to the lawlessness enabled by the peer-to-peer communications model.

Of these, the first two are continuations of present trends - trends it's hard to see changing any time soon.

It's the third one - the end of the open communications model - that combines threat with promise and is therefore worth talking about. Basically, what the proponents seem to be saying is what any good corporate auditor would say: for communications to be meaningful, the receiver has to know who the sender is, what authority that person operates under, and what personal or organisational value to attach to that person's communication - and, implicitly, that it's perfectly ok to trust a choke-point authority to enforce the accuracy of this information.
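
Reduced to code, the auditor's model looks something like this - a hypothetical sketch, with made-up names and keys, in which a single registry binds every message to a registered sender and every verification routes through that same authority:

    # A hypothetical sketch of the auditor's model: one trusted authority
    # binds messages to registered sender identities. Names and keys are
    # invented for illustration; this is not from any cited proposal.
    import hmac, hashlib, secrets

    class Authority:
        """The choke point: only it can vouch for who sent what."""
        def __init__(self):
            self._keys = {}  # sender name -> secret key

        def register(self, sender):
            self._keys[sender] = secrets.token_bytes(32)

        def attest(self, sender, message):
            # Tag binding this sender to this exact message.
            return hmac.new(self._keys[sender], message, hashlib.sha256).digest()

        def verify(self, sender, message, tag):
            expected = hmac.new(self._keys[sender], message, hashlib.sha256).digest()
            return hmac.compare_digest(expected, tag)

    authority = Authority()
    authority.register("alice")
    tag = authority.attest("alice", b"quarterly numbers attached")
    assert authority.verify("alice", b"quarterly numbers attached", tag)
    assert not authority.verify("alice", b"tampered message", tag)

Every assertion of identity in that sketch depends on one party's key registry - which is precisely the kind of trust the open peer-to-peer model was built to avoid.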

My problem is that I think this is reasonable within a business hierarchy established to pursue a common purpose - but that the recent actions of the communist Chinese government in cracking down on seditious person-to-person communication among its subjects illustrate everything that's wrong with applying these ideas across the broader communications networks of the world.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.