% fortune -ae paul murphy

Brief: 1

This is the first excerpt from the first book in the Defen series: The Board Member's IT Brief.

From the introduction:

This book is for board members and other senior managers who have to make decisions related to the organizational use of computers but are not themselves knowledgeable in the IT (Information Technology) field.

One hallmark of the strategic decisions boards are expected to make (those involving significant change in personnel, direction, scale, or acceptable organizational risk) is that getting them right generally requires the ability to weigh human, not technological, factors.

This has two related consequences: first, that computer expertise is not usually required to make good strategic computing decisions; and, second, that decisions requiring significant technical expertise are usually (but not always) tactical and thus not properly board business.

There is, furthermore, a corollary: a management team that repeatedly refers tactical decisions to the board needs to be re-evaluated.

For example, a choice between two IT vendors selling similar products or services requires considerable expertise to make, but it is merely tactical and thus a management responsibility no matter how large the dollar value.

There is, of course, a "gotcha" in this because ensuring that the vendors being seriously considered represent the full range of strategic options open to your organization is board business and does require you to understand both what the key issues are and the broad brush consequences of the technical choices available.

That "gotcha" is what this book is about: helping you decide when to refuse a decision because it is merely tactical; when to ask for more information; and how to evaluate project proposals, IT personnel, and your organization's overall use of the computing dollar.

There are four chapters:

  1. Chapter One has three sub-sections, each dedicated to a cluster of related ideas you'll need to understand both for the rest of this book and for any discussion with IT people.

  2. Chapter Two is on Evaluating IT Proposals.

    It looks at the major kinds of IT decisions board members face, and focuses on helping readers recognize and understand the "tells" that distinguish certain losers from possible winners.

  3. Chapter Three focuses on hiring systems management.

    It looks at the problem of whom to trust in the context of hiring a CIO and helps readers assess the impact a candidate's ideas and behaviors are likely to have if set loose in their organizations.

  4. Chapter Four discusses some key questions your IT management should be able to answer to your satisfaction.

From Chapter 1, section 1.1 "Information Architecture"

An information systems architecture consists of:

  1. management methods or processes;
  2. software; and,
  3. hardware.

In addition, each implementation has a business or organizational context.

Nobody orders Slime Head for dinner - but Orange Roughy is a restaurant favorite
A lot of the terminology IT people use to describe themselves is borrowed from other fields. There is no "architecture" in information architecture - and very few systems or network engineers have any engineering credentials.

These elements are interdependent; they co-evolve, with change in any one acting as both a cause and a consequence of change in the other three.

From Chapter 1, section 1.2 "Technology and heritage: understanding the cast of characters"

Most people believe that the age of computing started somewhere in the late 1940s and early 50s and has carried forward in one accelerating wave ever since, but that's not true. There are actually two computing traditions: data processing started in the 1880s and became commercially important during the 1920s, while science based computing started in the late 1930s and became commercially important in the 1960s.

IBM's 1921 Hollerith Type III electro-mechanical Tabulator, for example, dramatically lowered the unit cost of recording certain kinds of transactions and became widely used because it allowed IBM's customers to lay off hundreds of clerks for each machine leased.

The widespread introduction of data processing machines in the 1920s brought organizational change - in particular it led to the creation of a new class of data processing professional and the addition of a formal data processing function within Finance.

Those 1920s and 30s data processing groups had three main roles: to record transactions after they occurred; to generate standardized reports on those transactions; and to produce custom reports on request.

Data for use in accounting machines was coded on punch cards (although, oddly, the standard 80 column IBM punch card used in the 1960s wasn't patented until 1937) and the machines were mainly good at counting, totalling, and sorting cards according to the values coded on them.

Each card processing step consisted of running a batch of cards through a machine and required both its own job control (machine set-up) and its own begin and end cards. Applications like general journal maintenance translated fairly directly to card processing, while others, like general ledger maintenance, could naturally be structured as a series of sequential batch processes.

As a result large data processing operations developed libraries of hundreds, or even thousands, of standardized routines in their action inventories - most of them designed to produce the results needed for the next step in some overall process, thereby forming working applications like a general ledger if (and only if) run in exactly the right order.
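
For the technically curious, here's a minimal modern sketch of the same idea - purely illustrative, with invented step and account names - showing a batch pipeline that produces a working result only if its steps run in exactly the right order:

    # Purely illustrative: a fixed-order batch pipeline, loosely analogous
    # to chaining card-processing steps. All names here are invented.

    def post_journal(batch):
        # Step 1: accept the raw transactions (the "card deck").
        return [dict(t, posted=True) for t in batch]

    def sort_by_account(batch):
        # Step 2: sort, as a card sorter would, by the coded account value.
        return sorted(batch, key=lambda t: t["account"])

    def total_by_account(batch):
        # Step 3: total the sorted cards to produce ledger balances.
        totals = {}
        for t in batch:
            totals[t["account"]] = totals.get(t["account"], 0) + t["amount"]
        return totals

    # The "general ledger" exists only if these run in exactly this order;
    # each step plays the part of one machine pass with its own set-up.
    PIPELINE = [post_journal, sort_by_account, total_by_account]

    def run(batch):
        result = batch
        for step in PIPELINE:
            result = step(result)
        return result

    if __name__ == "__main__":
        deck = [
            {"account": "4000", "amount": 125.00},
            {"account": "1010", "amount": -125.00},
            {"account": "4000", "amount": 75.00},
        ]
        print(run(deck))   # -> {'1010': -125.0, '4000': 200.0}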

In data processing, not much has changed since. In the late forties engineers at Hughes Aircraft used an early digital computer to automate job control parameter passing between paired data processing machines. Later Grace Hopper and her staff extended that idea to create Flowmatic, a set of pre-defined card batches aimed at making it easy to pass parameters between job control decks, and thus to assemble jobs going beyond the scale of individual machine steps.

Flowmatic eventually became COBOL - and even today replicates the head and roller movement controls needed for IBM's electro-mechanical punch gear during the nineteen twenties and thirties. The IBM System 360, introduced in 1964, completed the transition from mechanical card processing to electronic card image processing. The result, known as electronic data processing (EDP), changed the means by which processing was carried out but affected neither data processing's role nor its management methods.

As a result much of data processing continues essentially unchanged today: the emphasis on discipline and control; the focus on after the fact processing; the assumption that data processing is ultimately a cost center within Finance, existing to reduce clerical cost and error; the batch job control structure; the focus on report generation; and the emotional separation between the cost driven focus in data processing and the revenue driven focus among the data producers and consumers.

The other computing revolution, in many senses the "real" one, started with men like Atanasoff and Shannon in the United States, Newman in England, and Zuse in Germany during the nineteen thirties and forties. These people knew almost nothing about data processing and were interested only in finding ways to extend human abilities to communicate accurately, to store information, and to handle complex computation. As a result their earliest efforts were applied mainly to problems in telecommunications and pure research, with military applications in logistics and communications driving initial commercialization during the fifties.

By the time the sixties arrived, science based computing had made huge inroads in the scientific community but had exercised little, if any, impact on businesses - largely because IBM had a fifty year head start on earning corporate loyalty; because the average chief financial officer easily understood what data processing products and services did for him; and, most importantly, because almost nobody noticed that scientific computing and data processing had essentially nothing in common beyond the use of computers, which left data processing people in charge of corporate computing.

Thus when dozens of new companies sprang up to commercialize scientific computing they were left to look for bridgehead markets from which to expand. Digital Equipment Corporation (DEC), for example, sold its PDP mini-computers into academia (and thus into the research mainstream) by programming them to bridge the gap between the real need for time shared terminal services and the typical university administrator's assumption that all computers came from IBM and belonged behind protective walls in corporate data processing.

Similarly CDC, whose gear out-performed IBM's System 360 by nearly an order of magnitude throughout the 60s and 70s, sold mainly to the military and into special purpose functions like airline crew scheduling that IBM simply couldn't handle at any cost, while a host of smaller players like Data General, Wang, and Honeywell sold science applications in areas like shop floor optimization, production scheduling, text processing, or medical pharmacy and laboratory support.

By the late seventies these intruders into IBM's corporate markets were growing rapidly and many end user managers, who were the people bringing this stuff in and running it despite objections from data processing, were not only getting real productivity benefits but also starting to question data processing's continuing corporate role.

As a result a power struggle developed in which data processing managers ultimately won control of enterprise computing by claiming that they, and only they, had the right to set enterprise computing standards and therefore to run science based enterprise applications like materials resource planning. Unfortunately they had the corporate clout to get the responsibility but neither the skills nor the technology to succeed, and so another round of multi-million dollar boondoggles developed.

The resulting user and business management frustrations then created the opportunity for the PC: user managers could, and did, take enterprise standards requiring IBM computers and operating software as a license to use data processing's own controls against it - buying first IBM PCs, and then cheaper clones running the same operating system software.

Two choices from 1985
IBM sold about 4 million PC/AT units during 1985 - taking in roughly 97% of corporate dollars spent on PC hardware.

At a list price of $5,500, it offered BASIC and PC-DOS with no GUI and no applications, running on a 16-bit, 5.7MHz Intel 80286 with 256K of memory, a 10MB disk, and dual 360K 5.25" floppies. Adding 512K to its RAM cost $1,250.

Apple sold about 50,000 MacXL units in that same year. At a list price of $5,495 the MacXL offered 1MB of RAM on a 16/32-bit MC68000 CPU at 7.54MHz (5MHz on early models), with a 10MB disk, a 720K 3.5 inch floppy, the MacOS GUI, and a full suite of graphical applications on a high contrast, black on white, 640 by 480 screen.

The Macintosh offered the home and professional user a complete set of graphical applications, with all the advantages of PostScript printing, for less money than the "bare metal" PC/AT, but drew less than 1% of the market dollars.

There was a characteristic difference between companies buying the PC/AT and those choosing the Mac: essentially 100% of the companies buying the Mac had no formal systems or data processing groups in place prior to the decision - and essentially all of IBM's customers did.

Unfortunately there wasn't any business software for these new PCs, so a gold rush developed among third party developers - who mostly started by porting products and ideas from the well established Apple, CP/M, and Unix communities: thus VisiCalc for the Apple became Lotus for the PC, WordStar for CP/M begat WordPerfect for the PC, and the Unix inverted database became the basis for dBase for the PC.

The PC was initially what it largely still is: a single user device. To make it more useful companies like Novell introduced first shared devices like disk drives and then PC network tools enabling users to sequentially share some documents. That works for small groups engaged mainly in text processing, but does not work for things like enterprise resource management software where hundreds or thousands of users need concurrent access to the same information.

In response people developed what is now known as the Microsoft client-server architecture - an approach in which the user PC runs an application known as a "client" whose sole job is providing the user interaction needed to access business applications running on data center machines known as "servers" - in exactly the same way that mainframe terminals on desktops accessed mainframe applications during the 1970s and early 80s.
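
For the technically curious, here's a toy sketch of that split - purely illustrative, with invented names, and not anything Microsoft actually ships - in which the server owns the business logic and data while the client does nothing but relay the user's request and display the answer, much as a terminal once did:

    # Purely illustrative: a toy client-server split with invented names.
    # The "server" owns the business logic; the "client" only handles I/O.
    import json
    import threading
    import time
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BALANCES = {"acme": 1250.00}   # stand-in for data center business data

    class AppServer(BaseHTTPRequestHandler):
        def do_GET(self):
            # All of the real work happens here, on the "server".
            account = self.path.strip("/")
            body = json.dumps({"account": account,
                               "balance": BALANCES.get(account, 0.0)}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    def thin_client(account):
        # The "client" merely asks and displays; it holds no logic or data.
        with urllib.request.urlopen("http://127.0.0.1:8080/" + account) as resp:
            print(json.loads(resp.read()))

    if __name__ == "__main__":
        server = HTTPServer(("127.0.0.1", 8080), AppServer)
        threading.Thread(target=server.serve_forever, daemon=True).start()
        time.sleep(0.5)            # give the server a moment to start
        thin_client("acme")        # -> {'account': 'acme', 'balance': 1250.0}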

As this architecture evolved it became more and more centralized - to the point that the typical big organization PC is now fully "locked down", meaning that all local functions are disabled, and the IT organization supporting it is functionally indistinguishable from its 1980s counterpart.

Science based processing didn't just go away while all this happened. When science applications first became important in the mid sixties they ran on mini-computers (a term which simply meant "not an IBM mainframe" and has since been superseded by the word "server"). The focus for those users was the application, not the computer, and as a result they spawned an appliance computing culture: user managed, job focused, using computers simply as a means of getting the job done, not as valuable or interesting in themselves.

During that same period, from the mid sixties to the late eighties, virtually all research, embedded processing, networking, and tele-communications work coalesced around the interactive technologies envisaged for MIT's Multics project in the early sixties and implemented, largely in response to the demise of Multics at the hands of the development contractors, as Unix.

Thus the PC got the headlines and data processing got the money, but the science based hardware and applications dominant in business today - including the web, e-commerce, CAD/CAM, ERP/SCM, e-mail, identity management, and data visualization - were all pioneered in the Unix subset of the appliance computing culture.

Today most of the earlier proprietary and competitive systems, other than those from Microsoft, are either long gone or on "death watch" with vendor support cancelled or ending. As a result three major Unix variants now dominate research, telecommunications, and the embedded device markets worldwide. These are Linux, the BSDs, and Solaris. All three are available in open source, all three run pretty much the same applications, and all three operate on all of the most common hardware.

Next week: from 1.2.1 - the consequences of culture

Some notes:

  1. These excerpts don't include footnotes, and most illustrations have been dropped as simply too hard to insert correctly. (The WordPress HTML "editor" as used here enables only a limited HTML subset and is implemented in ways that force frustrations like the CP/M line delimiters from MS-DOS.)

  2. The feedback I'm looking for is what you guys do best: call me on mistakes, add thoughts/corrections I've missed or gotten wrong, and generally help make the thing better.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.