% fortune -ae paul murphy

The client-server experience

This government organization has about 2,400 PCs. All are under an out-sourced evergreen agreement that includes a two-year refresh cycle, so about a third, mostly in managerial offices, are now on Windows 7, with the remainder scheduled to go that way by the middle of the next fiscal year. In addition they have a highly virtualized data center with about 160 real x86 boxes (most running multiple Windows Server and application instances under VMware), along with the usual shared data stores and a completely centralized network connection to the rest of the world.

Look at them in terms of industry norms and they look pretty good: professionally run; significant system-wide redundancy; an active PC security group; near-parallel off-site backups; solid evergreen, communication, and usage policies in place; and even some decent third-party career management services for IT employees.

The history here is that this organization launched, in about 1985, an effort to update its systems (then built around the 327X/3096 architecture), experimented briefly with both Wang and DEC as suppliers, and had, by the time the 91/92 fiscal year rolled around, settled on the client-server architecture recommended by their current out-sourced services supplier. Since then they've gone from 11 IT staff to about 80, from betting on OS/2 to Windows 7, from megabytes to terabytes, and from rigidly limited systems and data access to a claimed information ubiquity.

In reality, however, the IT implementation has changed from 327X/3096 to Wintel client-server, but the organizational structure, access controls, and underlying work processes have survived with little, if any, significant change since the 1960s. Thus information is still rigidly compartmentalized: clerks entering one kind of data cannot see other relevant data; front-line people cannot get access to either the detailed institution reports or the operational cockpit data available to a handful of senior people; the people screening applications exchange paper with the people doing enforcement monitoring; and so on.

I'm told that IT people working with senior management have tried to address some or all of this at various times, through initiatives like a MOM project and an organization-wide Notes implementation, but the claim is that work-to-rule responses from middle management effectively gutted each such project - leaving the "amazing technical success" to live on only in the resumes of those involved.

Thus the bottom line here is, I think, that what they have now is a high-cost implementation of the 327X/3096 architecture they sought to replace in 1985 - and that raises the question: had they lived in some alternative universe and gone to SunOS with NCD X-terms and some Apple laptops in 1988 or 89, where would they be now?

The answer, I think, is that they'd now have evolved to Solaris, Sun Rays, and iDevices; they'd still have about 11 staff and no out-sourcer; and their history in between would have included far easier access to more software with fewer failures and security risks - meaning that the impact of IT limitations, costs, and assumptions on their organizational evolution would have been both vastly smaller in absolute terms and generally more positive than negative.

How those differences would be expressed is obviously a matter for speculation, since the only organization I'm familiar with that actually made that choice more than twenty years ago is a high-security operation subject to radically different organizational pressures - but I believe the direction and rough extent of those differences can be assessed by asking one question: where, in the 20+ years of system evolution between that 1989/90 choice and today, is the client-server advantage to the organization? What, in other words, does or did the existing choice make possible that otherwise would not have been?

The most important place it's not is in the devolution of IT control to user management: in more than twenty years this not only hasn't happened, but centralized control has actually been strengthened over the period, with the reaction to server growth and document loss during the NT years leading to centralized document management, centralized standards enforcement, and a complete loss of communications autonomy by unit management.

In contrast, the benefits to IT are obvious: more staff, more money, and the ability to play to common perceptions about Wintel to absolve themselves of responsibility for performance, security, and continuity.

So what, bottom line, did the organization get for making the client-server decision? More IT costs, more leaks, tighter central control, less user-accessible software, and reduced performance.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.