I have a cynical corollary to Moore's law as commonly understood to predict rapid, evolutionary change in microprocessors. It goes like this: "in computing, as elsewhere, expertise decays in the presence of technical change, leaving only outdated reflexes and organizational position in its place".
I've worked, for example, for a client who runs a Sun 6800 that's been partitioned to form three smaller machines - and each time I point out that a set of four 3800s would have saved him nearly $600,000 while giving him more redundancy, more processors, more RAM, and more disk per "virtual" machine, he gets less and less likely to call me again.
What's going on with him is a failure to adapt to technology change. That Sun 6800 was a pretty good deal when he bought it, but its real value lies in its ability to bring 24 processors and 128GB of memory to bear on a single job like business intelligence processing or job schedule optimization. Since he doesn't have jobs needing the resources the machine offers, he should have bought four 3800s - but he didn't. Instead he overpaid by about $600,000 in order to be able to continue doing what he knows is right - even though the numbers don't support him and the combination of cost and technology to which his certainties apply has been outdated for thirty years.
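To make the arithmetic concrete, here's a minimal sketch of the comparison. All the prices and per-machine specs below are hypothetical round numbers picked to illustrate the roughly $600,000 gap - they're not actual Sun quotes or list prices.

```python
# Hypothetical figures only: assumed round-number prices and specs,
# chosen to illustrate the argument, not taken from any Sun price list.
sun_6800 = {"price": 1_000_000, "cpus": 24, "ram_gb": 128}
sun_3800 = {"price": 100_000, "cpus": 8, "ram_gb": 64}

def fleet(model, count):
    # Scale one machine's price and capacity up to a fleet of `count`.
    return {k: v * count for k, v in model.items()}

one_big = fleet(sun_6800, 1)     # partitioned into three "virtual" machines
four_small = fleet(sun_3800, 4)  # four independent physical machines

savings = one_big["price"] - four_small["price"]
print(savings)                                   # 600000 under these assumptions
print(four_small["cpus"], one_big["cpus"])       # 32 24 - more processors
print(four_small["ram_gb"], one_big["ram_gb"])   # 256 128 - more RAM
```

The point the sketch makes is structural, not numerical: when no single workload needs the whole machine, several small boxes beat one big partitioned one on every axis at once - price, processor count, memory, and redundancy.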
Partitioning is my favorite example of this kind of thing because it's so clearly a hallmark of one particular technology sub-culture - but it's not by any means the most egregious.
Back in the nineteen twenties, accounting technologies were undergoing rapid change as electro-mechanical tabulators replaced hundreds of thousands of clerks in repetitive functions like payroll processing, journal entry, and billing. Those machines worked by sorting through, and counting, card decks according to the information encoded on them - and organizations struggled to develop the structures and procedures needed to use and control these new "data processing" technologies.
(For a fascinating look at the frenzy of change in this industry during the twenties and thirties, see: James W. Cortada, Before the Computer: IBM, NCR, Burroughs, Remington Rand and the Industry They Created, 1865-1956 (Princeton University Press, Princeton, NJ, 1993).)
Data processing became "automatic data processing" when advances in the thirties made it possible to "automatically" link the output of one batch job to both the control setup and input for the next. Flowmatic, developed for the US Navy in the forties and early fifties by Grace Hopper, codified that control process - and gave rise to COBOL when IBM, among others, eventually used public funding for that project to force an open specification.
Look at the typical mainframe organization and you'll see we've carried most of that forward: from the reporting relationships and the role of data processing within the organization to JCL and the use of job sequencing to achieve application integration, the lessons from the twenties and thirties are practically hardwired into the people managing that technology.
We're seeing that same mental ossification happening again today. The traumas of the nineties' technical and organizational adaptation to Microsoft's client-server architecture have produced their own crop of absolute certainties that are now being applied, willy-nilly, to Linux and other Unix implementations. Everywhere I go I now find myself talking to Linux experts - but most of them are doing what they learnt to do ten years ago: happily demonstrating their NT expertise by installing rackmounts of Linux "servers" as a kind of second-rate Windows clone.
Worse, they're generally not at all interested in learning about Unix. As a group they want specific how-to information, because they already know exactly what to do and the absolute rightness of that knowledge is beyond question.
That's really where Murphy's corollary - about the decay products of rapid technical change being stupidity and position - comes from: it's not that these people don't bring genuine expertise to their work, it's that most of what they do so well is unnecessary and inappropriate.