% fortune -ae paul murphy

From Chapter Four: The Unix and Open Source Culture

This is the 41st excerpt from the second book in the Defen series: BIT: Business Information Technology: Foundations, Infrastructure, and Culture

Roots (2)

This commitment to publication and peer review, originally derived from the core academic and scientific approach that produced the breakthroughs of the forties and fifties, has in its present form given shape to the open source movement.

Thus, as early as 1973, universities were given more or less open access to the Unix system code, and its ideas then became the nucleus around which significant communities of expertise developed at centers like Berkeley, MIT, and Carnegie-Mellon, among many others.

That early geographical clustering of expertise around machines and Computer Science or Engineering faculties led to an explosion in Unix variants and in the research that went into them. As a process this wasn't very different from what we see in open source development today. Time scales were longer, mainly because the people involved worked in smaller, more isolated clusters: pre-internet communication methods were simply too slow to let geographically dispersed teams form and succeed. The basic processes of open peer review and contribution, however, haven't changed.

In 1975-76, for example, Ken Thompson, one of the key originators of Unix, took a sabbatical from Bell Labs to teach at UCB (the University of California at Berkeley), where he laid the foundation for what became the Berkeley Software Distribution (BSD) series of Unix releases on DEC PDP and VAX equipment.

Thus the "release early, release often" strategy that supports the evolution of Linux today, applied equally to the evolution of BSD Unix in the late seventies and early eighties. The key difference is simply that a new Linux kernel release can now be made available in minutes to thousands of people without regard to physical distance but, in 1981, ground breaking Unix development work at Berkeley became immediately available to only a handful of local colleagues while people at places like Rutgers or The University of Alberta either waited three months for a new distribution tape or invented their own solutions.

The other phenomenon we see today, businesses like Red Hat trying to use open source for commercial advantage, existed then too. Ray Noorda's Novell, for example, started by commercializing network technology as a means of implementing device sharing. Here the difference in time scales is far less: it took about two years to get a business off the ground then, and it takes about that now.

This "trial and review" process was not understood in the seventies and eighties with the result that the traditional assumptions about code development and the power of externally imposed standardization often prevailed - particularly in textbooks and among the press.

Most of the business press therefore saw the proliferation of operating systems research ideas expressed in multiple Unix releases as evidence of fragmentation and a loss of direction. They referred to "the Unix wars" as if their assumptions reflected reality, and indeed a few of these mistaken beliefs were later made real when some of the more successful commercial Unix vendors started to make management decisions based on them.

For example, Digital Equipment Corporation signaled the beginning of its own end when, in 1988, it joined its dominant competitor, IBM, and seven smaller competitors to create the Open Software Foundation, ostensibly to establish a common Unix but in reality mainly to try to limit Sun's explosive market growth. (The OSF merged with X/Open in 1996 to form The Open Group and is not to be confused with the FSF, Richard Stallman's Free Software Foundation.)

By 1983, however, most major universities in the US and Canada had VAXes running Berkeley Unix and were contributing members of Usenet, an internet precursor built on an application called UUCP (Unix-to-Unix Copy) that moved email and other documents between systems. Thus geographically dispersed teams started to come together right about the time that TCP/IP and Sun's NFS (Network File System) started to both multiply and accelerate the connectivity links between people.
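For readers who never used it, UUCP addressing spelled out every hop: mail to a colleague at another site carried a "bang path" like decvax!utzoo!user, and each machine along the path called the next one and copied the message forward on its own schedule. The toy sketch below (plain Python, purely illustrative; the link topology is invented for the example, not taken from any real UUCP map) shows that hop-by-hop, store-and-forward idea.

    # Toy illustration of UUCP-style "bang path", store-and-forward mail routing.
    # Host names and links are invented for the example; real UUCP moved files
    # over scheduled dial-up connections, not through an in-memory dictionary.

    NEIGHBOURS = {               # which sites each machine could call directly
        "ucbvax": ["decvax"],
        "decvax": ["ucbvax", "utzoo"],
        "utzoo":  ["decvax"],
    }

    def deliver(origin, bang_path, message):
        """Walk a path like 'decvax!utzoo!rob' one hop at a time."""
        *route, user = bang_path.split("!")
        current = origin
        for next_host in route:
            if next_host not in NEIGHBOURS.get(current, []):
                raise RuntimeError(f"{current} has no UUCP link to {next_host}")
            # In real life this step waited for the next scheduled phone call.
            print(f"{current} -> {next_host}: forwarding message")
            current = next_host
        print(f"{current}: delivered to {user}: {message}")

    deliver("ucbvax", "decvax!utzoo!rob", "new distribution tape is on its way")

Every additional hop meant waiting for another scheduled connection, which is why the arrival of TCP/IP and NFS links mattered so much.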

By 1985 Unix users almost anywhere in Canada, the US, and parts of western Europe could communicate on-line using highly reliable circuits, and what we now think of as the open source movement had begun. Its most famous contributor, at least prior to Linus Torvalds, was Richard Stallman: primary author of the Emacs editor, founder of the Free Software Foundation, and originator, in 1984, of the GNU ("GNU's Not Unix") project.

The GNU project set out to duplicate the functionality of all major Unix tools in open source equivalents. "Information wants to be free" became the rallying cry as thousands of dedicated people dispersed across hundreds of sites contributed to the effort. Today GNU tools are core components of every significant non-commercial Unix release, including Linux, FreeBSD, and Darwin (the open source foundation of Mac OS X). They run, furthermore, on every proprietary Unix and are often the preferred tools on Solaris, AIX, IRIX, HP-UX, and OSF/1 (which, although Digital is long gone, lives on as Tru64).

The basic tenet behind the open source movement is that there is social and personal value in contributing to research and to the implementation of research results in software. The most widely known analysis of the resulting open source culture is Eric Raymond's The Cathedral and the Bazaar, an influential paper that can be found on dozens of internet sites.

Here's the abstract he gives on his website:

I anatomize a successful open-source project, fetchmail, that was run as a deliberate test of the surprising theories about software engineering suggested by the history of Linux. I discuss these theories in terms of two fundamentally different development styles, the "cathedral" model of most of the commercial world versus the "bazaar" model of the Linux world. I show that these models derive from opposing assumptions about the nature of the software-debugging task. I then make a sustained argument from the Linux experience for the proposition that "Given enough eyeballs, all bugs are shallow", suggest productive analogies with other self-correcting systems of selfish agents, and conclude with some exploration of the implications of this insight for the future of software.

From the beginning Unix has been about open computing, collaboration, and communication between people. It is a highly technical system, but its focus has never been on the technology. "The network is the computer" is a Sun marketing slogan, but it's also the truth about Unix systems development and use: it's the networking effect among people that counts, not the desktop or server hardware.

Thus its value as the pre-eminent secure computing platform for commercial and web-based services delivery is a spin-off from the research process, in much the same way that the microprocessor and the internet were spin-offs from defense research.


Some notes:

  1. These excerpts don't (usually) include footnotes, and most illustrations have been dropped as simply too hard to insert correctly. (The WordPress HTML "editor" used here allows only a limited HTML subset and imposes frustrations reminiscent of the CP/M line delimiters MS-DOS inherited.)

  2. The feedback I'm looking for is what you guys do best: call me on mistakes, add thoughts/corrections on stuff I've missed or gotten wrong, and generally help make the thing better.

    Notice that getting the facts right is particularly important for BIT - and that the length of the thing plus the complexity of the terminology and ideas introduced suggest that any explanatory anecdotes anyone may want to contribute could be valuable.

  3. When I make changes suggested in the comments, I make those changes only in the original, not in the excerpts reproduced here.