by Paul Murphy
Heroes are for boy scouts, but I'm writing this using vi on a SPARC machine running Solaris, so when Fortune Magazine ran an extended interview with Bill Joy under the headline "Joy after Sun", I paid attention.
Fortune mentions the famous "grey goo" article from Wired (April, 2000) "which concluded that robotics, nanotech, and genetic engineering were emerging so quickly that, if we weren't careful, they could endanger the human species" only as part of the introduction to the interview, but the rest of the report makes it clear that Mr. Joy continues to be deeply concerned about the risks associated with unplanned and uncontrolled interactions among multiple computing objects such as nanomachines.
In expressing those concerns, both then and now, he uses a lot of biological references and evolutionary analogies. That is, of course, quite common -- Microsoft even had a product suite called "Microsoft Windows DNA 2000" -- but there's a very fundamental problem with these biological parallels that goes well beyond their simply being wrong.
In fact they're right, just not with respect to any form of biology based on DNA, and therefore not to biological evolution. Instead, things like computer viruses now, and possibly autonomous nanomachines later, exhibit something much more dangerous: Lamarckianism.
The key to Lamarckian biology is the perpetuation of externally induced change in one generation in succeeding generations. In other words, Lamarckian parents who each lost a leg to surgery would naturally have one-legged children. The problem with this, as science fiction writer C.J. Cherryh so ably showed in the Foreigner series, is that the resulting ability to direct evolution quickly and reliably -- slicing and splicing an existing organism in the knowledge that it will reproduce in whatever form you leave it -- is almost guaranteed to have uncontrollable, and disastrous, consequences.
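The distinction is easy to sketch in code (a toy illustration only, not a biological model; the function and variable names here are made up for the example): a Darwinian offspring inherits only the parent's genome, while a Lamarckian offspring inherits the genome as modified during the parent's lifetime.

```python
import random

def darwinian_child(parent_genome, acquired_changes, mutation_rate=0.01):
    # Offspring inherit only the genome; changes acquired during the
    # parent's life are discarded. Only random mutation alters the copy.
    return [g if random.random() > mutation_rate else 1 - g
            for g in parent_genome]

def lamarckian_child(parent_genome, acquired_changes):
    # Offspring inherit the genome *as modified during the parent's
    # life*: every externally induced edit is perpetuated exactly.
    child = list(parent_genome)
    for position, new_value in acquired_changes:
        child[position] = new_value
    return child

genome = [0] * 8
edits = [(2, 1), (5, 1)]   # changes made to the "parent" after birth

print(lamarckian_child(genome, edits))  # -> [0, 0, 1, 0, 0, 1, 0, 0]
print(darwinian_child(genome, edits))   # edits gone; rare random flips only
```

The Lamarckian case is exactly the situation with a modified computer virus: whatever you splice in is what every descendant carries.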
That's a very fundamental difference: think about nanomachines and trustworthiness issues in computing in terms of analogies to normal biology, and your thinking will be influenced by evolution's image as a slow but generally benign and natural process. Think in terms of Lamarckianism, however, and the potential for malignant outcomes becomes much more obvious.
Lamarckian biology cannot work with respect to DNA-based life, but it does characterize computer viruses today and will, almost certainly, apply to nanomachines in the future. The parallels that should influence your thinking on the subject aren't, therefore, to any form of natural biology but to an imaginary form right out of Mary Shelley's worst nightmares.
In biology, success is defined in terms of long-term reproductive growth or stability, with genetic change controlled through a feedback loop that perpetuates adaptations leading to species expansion and damps out failures. The keys to that control process are the time it takes for the effects of a specific adaptation to work themselves out, and the role the geographic containment of new adaptations plays in protecting genetic diversity across the whole of a species.
In nature, a failure of either or both controls leads to species failure: when the Ebola virus migrates to a human host, it enters a phase of uncontrolled reproduction that kills the hosts it can reach so quickly that it prevents itself from spreading further. In contrast, a Lamarckian creation on the Internet faces no such controls. SQL-Slammer exploded out of South Korea to spread worldwide in less than thirty minutes, and can be obtained, studied, and modified more or less at will by anyone interested in slicing and splicing together its successor.
It is the absence of these natural controls that is most frightening about Lamarckian biology. A disastrous mutation, whether malicious or accidental, is easy to create and spreads so quickly that a vulnerable population is denied the time and information needed to defend itself.
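That missing brake can be shown with a toy spread model (purely illustrative; the parameters and numbers are assumptions for the sketch, not epidemiology): give the replicator a chance of killing its host each step, as Ebola effectively does, and it burns out before reaching the population; set that chance to zero, as for a network worm, and nothing stops it short of saturation.

```python
def spread(population, r, lethality, steps):
    """Toy discrete-time spread model (illustrative assumptions only).

    Each step, every infected host tries to infect r susceptible ones,
    then dies with probability `lethality` -- and dead hosts can't
    transmit. A worm on a network is the lethality = 0 case: nothing
    removes the spreaders, so the natural brake is absent.
    """
    susceptible, infected, total_infected = population - 1, 1, 1
    for _ in range(steps):
        new = min(susceptible, int(infected * r))
        susceptible -= new
        total_infected += new
        infected = int((infected + new) * (1 - lethality))
        if infected == 0:        # the replicator killed off its carriers
            break
    return total_infected

hosts = 1_000_000
print(spread(hosts, r=2, lethality=0.9, steps=30))  # -> 3 (burns out at once)
print(spread(hosts, r=2, lethality=0.0, steps=30))  # -> 1000000 (everyone)
```

The point of the sketch is the asymmetry: the self-limiting case never threatens the population, while the uncontrolled case reaches all of it within a handful of doubling steps, which is the Slammer pattern.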
We tend to think of the plagues of worms, viruses, and other exploits affecting the Wintel world as significant, but this is only true from the limited perspective of those directly inconvenienced. In reality the impact of an extended worldwide computer shutdown, even one that went beyond the PC, would be minuscule relative to the effect of a nanotech disaster affecting a comparable worldwide monoculture.
Imagine, for example, a world in which municipal garbage is no longer stored in landfills or incinerated but is, instead, processed by nanomachines to recover essentially all of the raw materials in purer, more accessible, forms. Now imagine someone re-programming some of those machines to escape into the materials, lie dormant for some time, and then start reproducing using whatever they find locally for energy input. Sure, that's a doomsday scenario, but it's also Lamarckian biology at work and a simple, logical extension of what we know now.
It's possible, therefore, to think of the present epidemic of Wintel security violations and its worldwide economic consequences as a relatively benign demonstration, or model, of what happens when a Lamarckian biology is allowed to work itself out in the absence of intelligent direction and control.
That mess didn't arise because no one understood that creating a Lamarckian biology without also putting appropriate controls in place was a bad idea. Some, as Mr. Joy says of Sun in the Fortune interview, even offered partial solutions:
We did provide people with tools like Java to build more safe and reliable services on the network. But Java has been under appreciated because, once again, it was a solution to a problem people had heard about but had not felt viscerally, whereas the perceived cost of not choosing Microsoft or IBM was felt much more measurably and emotionally.
Thus the Wintel model came about because we chose, as a culture, to ignore the issues and let the uncontrolled and undirected evolution of the Wintel monoculture loose a Lamarckian biology on the Internet. He doesn't compare the result to grey goo, but if I understand his fundamental point at all, it is that we must not let short-sighted self-interest lead us to repeat that mistake with respect to nanotechnology.