% fortune -ae paul murphy

The IT Commandments: #2 Thou Shalt Honor and Empower thy (Unix) Sysadmins

In data processing, your machine operators are nobodies - essentially semi-skilled labour invisible to anyone outside the glass room. In science-based computing (i.e. Unix), however, your sysadmins are the people who work with the user community to make and implement the day-to-day tactical decisions that characterize successful systems operations.

Do data processing right and what you get is the intensely hierarchical mainframe (or Windows) data center with no user control, inflexible budgets, locked-down desktops, and long-duration change management processes that turn a request to add or change a minor function into a formal project.

Get science-based computing right and what you get is centralised processing with decentralised management: combining the user control promised by the PC with the security and cost minimisation of Unix.

In the real world you don't see much sysadmin empowerment outside small businesses, because big companies tend to put data processing people in charge of computing - and data processing people see sysadmins as machine operators.

At the root of all this is a fundamental confusion: data processing is not computing.

In fact, about the only thing data processing has in common with computing is the use of computers. Unfortunately, many of the top executives ultimately responsible for computing infrastructure decisions in big organizations simply don't know this, and so rely on advice from data processing people to put data processing people in charge - and that ultimately produces the Unix data centers we generally see: metaphorical Boeing 787s towing railway cars, with users conditioned to expect, and therefore accept, defensive arrogance, missed deliveries, and inflexible routing.

Data processing started with mechanical tabulators in the nineteenth century, developed into a professional discipline during the 1920s, and became a cornerstone of business operations during the late 1920s and early 1930s. That was also the period in which fundamental organisational structures (like stringent role separation), fundamental controls (like the service level agreement), and fundamental functional assumptions (like the emphasis on reporting, the focus on processing efficiency, the reliance on expectations management, and the tendency to justify costs on the basis of layoffs) all became more or less cast in stone.

When computers entered this picture in the late 1940s they weren't used to replace tabulators; they were used to control tabulators - and were cost-justified on layoffs among the people who had previously controlled batch processes. Thus the first assemblers were physically just that: controls enabling the automated assembly of jobs from card decks and the transfer of information from the output of one batch run to the inputs of the next.

Out of that comes the modern data processing center: with JCL replicating much of that original control structure, COBOL directly implementing many mechanical card operations, rigid role separation now enforced in the software, and thousands of supposed applications replicating the original card deck processing and assembly routines but cumulatively forming only a small number of real applications.

Science-based computing had nothing to do with any of this and focused, from its origins in the mid-to-late 1930s, on problem solving and the extension, rather than the replacement, of human ability. Thus when Atanasoff and Zuse dreamt of solving computational problems, Shannon applied computing to communications, or Newman used a Colossus in raid planning, none of them had the slightest interest in financial reporting or other commercial tasks.

That science community focus continues with Unix today - particularly in the BSD and OpenSolaris communities - but it's been there from the beginning. Thus when Thompson and his colleagues first worked on Unix, they talked about forming communities:

From the point of view of the group that was to be most involved in the beginnings of Unix (K. Thompson, Ritchie, M. D. McIlroy, J. F. Ossanna), the decline and fall of Multics had a directly felt effect. We were among the last Bell Laboratories holdouts actually working on Multics, so we still felt some sort of stake in its success.

More important, the convenient interactive computing service that Multics had promised to the entire community was in fact available to our limited group, at first under the CTSS system used to develop Multics, and later under Multics itself. Even though Multics could not then support many users, it could support us, albeit at exorbitant cost. We didn't want to lose the pleasant niche we occupied, because no similar ones were available; even the time-sharing service that would later be offered under GE's operating system did not exist. What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.

(Emphasis added)

Today that's exactly what a good Unix sysadmin does: facilitate interaction among a community of users by continually adjusting services to meet their needs. As a result, that sysadmin has to be empowered to act on user requests - within limits, but immediately, unilaterally, and effectively.

In contrast, most machine operators have been replaced by automation, but the remaining few have no decision role, no access to users, and no control even over their own jobs.

Thus a CIO's job in a defenestrated data center is fundamentally to make sure that the sysadmins have the access, the skills, and the resources they need to respond to users - and that's the opposite of what's needed in data processing, where IT management focuses on husbanding a scarce resource and deploys "client-facing" people and processes mainly to buffer changes in resource demand.

Confuse one with the other, try to apply lessons learnt in a hundred years of data processing to computing, and what you get is what we mostly see in larger data centers: Dilbert's world of tightly locked-down desktops, long change processes, powerless systems people interacting with equally powerless users, and ever-escalating costs. To fix that: adopt Unix; put science people in charge; turn IT inside out to push services and resources at users instead of focusing inward on resource stewardship; and empower the sysadmins to work directly with users, figure out what the job is, and get it done.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the IT consulting industry, specializing in Unix and Unix-related management issues.