Windows, Open Source, and National Security

by Paul Murphy

Two weeks ago I wondered out loud about the top ten worst business IT decisions ever made and nominated HP's decision to follow DEC down the road to oblivion for top spot. Today I'd like to suggest that the US Defence Department's continued use of Microsoft's software is likely to top a future list of this kind.

The equation here is simple. First, recognise that the security of Microsoft's software depends crucially on keeping its source secret. That's not a comment from an anti-Microsoft bigot, either; that's testimony given under oath by Microsoft vice president Jim Allchin. Even limited release of Microsoft's code, he told Judge Kollar-Kotelly's federal court in May 2002, would threaten national security because the code is both seriously flawed and widely used in the Defence Department.

Second, only nine months later (in February 2003), Microsoft announced an agreement giving the communist Chinese full access to the source for Windows and related tools, thus lifting the veil of obscurity for this foreign power.

You don't negotiate any kind of agreement with the communist Chinese in a few days or weeks; it usually takes months or years to get even simple agreements approved - remember, theirs is a command economy in which nothing happens without government approval - and this particular agreement included a personal briefing given to the chairman of the Chinese Communist Party by Bill Gates himself.

Think about that for a moment. Here we have a senior Microsoft vice president telling an American court that releasing the code to American companies would threaten national security at just about the same time some of his colleagues were negotiating a hand-over of that same code to the communist Chinese - the people who support North Korea, actively threaten both Taiwan and India, maintain the largest standing army in the world, and continue to publicise their ideological commitment to the replacement of American democracy with a socialist dictatorship.

The question, of course, is what the communists are going to do with their access to Microsoft's source. Most people would agree, I think, that a few thousand really bright programmers with lots of time and full access to Microsoft's source could accumulate enough information about weaknesses in the code to develop viruses and other exploits for use as economic weapons against the United States and key democratic allies like Taiwan. The question, therefore, isn't whether this could happen but whether it will happen.

Business, like law enforcement, reacts in arrears - i.e., after the event. Thus no American businessman is going to face criminal charges for failing to react to a threat that may or may not materialise. The military, however, have a proactive mandate and are required to treat potential threats as if they were real, however paranoid that may seem to those who choose to ignore history. Thus any officer now in a decision-making role who fails to react effectively to the threat posed by the combination of Microsoft's reliance on obscurity for its operating systems' security and communist China's access to the code could eventually be charged with dereliction of duty.

To make such a charge stick, two elements would have to be proven: first, that the officers responsible for the decision to continue using Microsoft's products were aware of the security problem; and second, that they had a better alternative open to them.

It's impossible to believe that anyone now working in military IT could reasonably claim competence while denying knowledge of either the general vulnerability of Microsoft's software or communist China's access to the source code. What that future congressional inquiry is going to focus on, therefore, is whether or not there was a reasonable basis, in the 2003/4 time frame, for believing that open source offered a better alternative.

In other words, the question they will be asking is whether there was compelling reason to believe, today, that open source can be as secure as, or more secure than, proprietary software whose source code is too flawed to be revealed to the public but is available to the enemy.

Consider, on this, what Bruce Schneier says in the introduction to the second edition of his Applied Cryptography about the difference between security and obscurity:

If I take a letter, lock it in a safe, hide the safe somewhere in New York, then tell you to read the letter, that's not security. That's obscurity. On the other hand, if I take a letter and lock it in a safe, and then give you the safe along with the design specifications of the safe and a hundred identical safes with their combinations so that you and the world's best safecrackers can study the locking mechanism - and you still can't open the safe and read the letter - that's security.

There's no possibility of obscurity in open source; that's one of its great values and part of what Eric Raymond meant by the comment that "given enough eyeballs, all bugs are shallow." In this sense open source is a continuation of the academic process of peer review, in which the feedback loop between those who originate new ideas and the colleagues who review the work generates a Darwinian competition of ideas in which the fittest survive. That's the difference: Microsoft relies on obscurity - but sells the safe to the communist Chinese - while open source subjects both the code and the design ideas behind it to intensive peer review, and so evolves increasingly secure systems.
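
Schneier's safe is a restatement of Kerckhoffs's principle: a system should remain secure even when everything about it except the key is public knowledge. As a minimal sketch of that idea - the message and keys below are invented for illustration, and this is not code from any vendor or military system - the following Python fragment authenticates a message with HMAC-SHA256, an algorithm whose full specification is open to friend and foe alike, with all of the secrecy concentrated in the key:

    import hashlib
    import hmac
    import secrets

    # The "locking mechanism" is HMAC-SHA256: its design is completely public
    # and has been studied by the world's best cryptographers. The only secret
    # is the key, which plays the role of the safe's combination.
    key = secrets.token_bytes(32)            # 256-bit secret key
    message = b"example order: hold position"

    # Anyone may read this code and the HMAC specification, yet without the
    # key they cannot produce a valid tag for a message of their choosing.
    tag = hmac.new(key, message, hashlib.sha256).digest()

    # Verification succeeds only with the correct key and message.
    assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())

    # An attacker guessing at the key produces a tag that fails verification.
    forged_tag = hmac.new(secrets.token_bytes(32), message, hashlib.sha256).digest()
    assert not hmac.compare_digest(tag, forged_tag)

The point of the sketch is Schneier's: hiding the algorithm adds nothing, because the security has to survive full disclosure of everything except the key - which is exactly the property an obscurity-based system cannot claim once its source is in an adversary's hands.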

As choices go, this pretty much defines the no-brainer category, with open source winning every time - and it establishes the consequence that some future congressional inquiry may nominate the Pentagon's current failure to replace every Microsoft product - whether application, tool, or operating system - with its open source equivalent as the worst IT decision ever taken.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 20-year veteran of the IT consulting industry.