% fortune -ae paul murphy

The PC vision was lost from the get-go

Note: this is a guest blog by Mark Miller. It started out as a comment on something I wrote, but I thought it deserved wider attention so Mark made a few minor changes and...here it is:


Alan Kay was one of the original visionaries of the personal computer concept. His idea was that it would be a personal medium, having the same impact on society as books did after the invention of the printing press. It would be a machine on which ideas could be tried out in a graphically rich environment, experimented with, and shared interactively with others in a peer-to-peer fashion. The environment would be dynamic, able to be altered in realtime--hence the name he gave it: the Dynabook. It would be a system that incorporated knowledge the user gained through research and experimentation, via a programming language simple enough for children to understand, yet powerful enough for adults to find useful. Parts of the system could be altered by the user to suit their needs. It would also be a device for information retrieval, and one with which to make transactions over the internet (called the ARPANet at the time he outlined this vision).

The vision of the PC as an information retrieval and transaction device has been realized on the web. But the first vision, the one he thought most important--the PC as a new medium--has not been fully realized. Instead what occurred was a reification of the minicomputer experience, and in some scenarios the mainframe experience. In fact, the only computer around today that comes close to resembling this vision is the XO Laptop, which is the reason Kay has been involved with it.

Some of the GUI ideas Kay's team developed at Xerox PARC were grafted onto the popularized PC platform. The closest any popular software platform has come to realizing his "medium" vision is MS Office, but it's a limited, and in some ways poor, example.

So in short, the vision was of something with the qualities of a portal to a wider world, but also the local qualities of a medium. Instead what we got were stovepipes of functionality and data, and limits on interaction and experimentation--unless you wanted to learn to program in an easy-to-learn but limited language, or in a difficult-to-learn language that exposed more of the system's and machine's power but didn't encourage building knowledge into the system. The closest you got was grafting functionality onto it.

In recent years this model has been pushed aside more and more, and the PC is increasingly being turned into a terminal, with occasional local work done on it. People haven't liked the "distribute by copy" model of GUI apps; they prefer the "distribute by access" model of the web. GUI apps are trying to become friendlier to this preference. It will take time for that to play out.

The culture that ended up bringing the PC to market was very different from the culture that created the original vision. The culture that popularized it saw the PC as a new "old thing"--it was a traditional computer, just smaller and more limited in capability, but one you could own and have some control over. No more centralized control, and no more asking permission to access computing power. That was the idea, anyway. The motivations had more to do with access to, and the "democratization" of, traditional notions of computing than with something that was really new. So what we have now are updated versions of notions that existed decades ago.

For example, what's virtualization today but a newer version of the software virtual machine technology that existed on mainframes decades ago? What's a web browser but a 3270 terminal with some features added, hooked up to a distributed, rather than a centralized, proprietary batch system? It's smaller. It's faster. It's cheaper in terms of equipment costs. But there's not much new here in terms of concept. As Kay has said even recently, "The computer revolution hasn't happened yet."

What's resulted is a cobbling together of technologies to try to get distributed systems to work together, often just barely. The old model of mainframes at least had a sense of cohesiveness, even though they were proprietary. I'm not endorsing mainframes, but I think it's worth looking at what we've done and asking whether what we've created is any better than what came before in terms of enhancing our ability to work productively.

I think some progress has been made, but it's not the large leap in productivity I expected. And sometimes I wonder whether the software systems we've created are actually more of a distraction than a help to the people who use them, and whether they're any better than the paper-and-file systems they replaced.

Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.