% fortune -ae paul murphy

A programming proposition

Here are two phrasings of the same proposition - both applicable, as is everything in this discussion, only to larger-scale interactive business applications.

Any application model that is not the application omits important detail and is necessarily incomplete or misleading with respect to critical internal relationships.

and:

The smallest application model that is both sufficiently precise and sufficiently complete to fully define the application is equivalent to the application.

Historically lots of people, many of them with diagramming agendas, have bought into the notion that a fully defined model is equivalent to the application. As a result there have been many, probably at least hundreds, of efforts to make and sell tools capable of automatically generating executable code from models - initially, I believe, mostly COBOL and IMS code generated from E-R diagrams and, most recently, Java/DB2 code generated from UML. All of these have been failures in the strict sense that none has gone beyond more or less working for some parts of some applications - but to my knowledge no one has publicly drawn the obvious conclusion: diagrams are simply too imprecise for use as programming notation.

Thus developers may want to draw some swim lanes or other diagrams as aids to clear thinking, but spending time and effort trying to model business applications as part of the specification process is a waste of resources, because equivalence means that by the time you get an adequate model in place you've duplicated the application development effort - but without producing a working application.

Since you have to show users something concrete to get design feedback, it follows, I think, that the right way to develop a business application is to prototype it with users, test the prototype with real data, and then, when you get something that works, just relabel it as production and start working on the next generation.

If that's true, it has interesting consequences, including:

  1. no application is ever finished: an application that's been running the business for five years is just a prototype for the application needed next week.

  2. corporate data management is a critical means of communicating shared values across multiple user communities - and in the context of available technologies this functionally requires organizationally centralized data management, and thus centralized processing.

  3. for a prototyping strategy to work over the long haul IT has to be user-owned - with centralized data storage, procurement, and processing, but run by people who work in user groups, report to user management, and together form the organizational interface between users and the technologies used.

    Logically, therefore, the boring stuff - running the gear, backing up the database - belongs in a central IT function, while the fun stuff - using, choosing, and changing applications - belongs in the user communities.

Think about this a bit and you should see lots of other organizational implications - but there's one that may be both a little less obvious and a source of significant discomfort: if the lemma is true and the organizational consequences are approximately what I say they are, then one cause of IT's perennial failure to meet user needs and expectations may simply be that our own organizational structures - structures which evolved to fit 1920s data processing technologies - are completely counter-productive in today's technical and business context.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.