% fortune -ae paul murphy

ZFS/Flash and the climate debacle

Here's something I wrote back in 2005:

It should be possible, for example, to build or modify coal-based power plants to intentionally inject "tuned particulates" into the upper atmosphere, thereby reducing insolation and causing global cooling.

The downside, of course, would be that the effect would be uncontrollable at any but the most aggregate level - and that we couldn't easily turn it off if we noticed the other planets, a year or two later, starting to cool. In other words, there are obvious solutions, but grabbing one without understanding the problem might turn out to be even dumber than the anti-nuclear protests of the seventies that led to the carbon economy that's supposedly the cause of global warming.

Fortunately, there is a far better technical solution on the horizon, one that's unambiguously desirable but nicely illustrates how closely technology and politics are linked.

The earth has about 509,600,000 square kilometers of surface area, roughly 71% of it water. Extend our area of interest 3 km down and 12 km up, and we have about 7.644 billion cubic kilometers within which climate is of direct importance to us. Think of that volume as a grid of one-kilometer cubes: heat and material transfer functions for most of the materials found in it are well understood, so there's no theoretical impediment to modeling the effect that an isolated increase or decrease in solar energy input to a cone cut across those cubes will have on the cubes themselves and thence on their neighbors.
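
As a quick sanity check on that arithmetic, here's a Python sketch of my own (not part of the original piece); the flat-shell approximation is fine because 15 km is tiny against the earth's roughly 6,371 km radius:

    # Back-of-envelope check on the volume quoted above.
    surface_km2 = 509_600_000   # earth's surface area, km^2 (rounded)
    depth_km = 3                # extent of interest below the surface
    height_km = 12              # extent of interest above it

    shell_km3 = surface_km2 * (depth_km + height_km)
    print(f"{shell_km3:,} km^3")   # 7,644,000,000 - about 7.644 billion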

There are a few practical impediments to extending that model to cover the globe, but the theory's all there. What's missing is both the data and the computing capacity needed. Make those available, however, and it should be possible to fully predict the effect in San Francisco next year of man-made cloud cover in Beijing this year.
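
To see what "on the cubes themselves and thence on their neighbors" means in practice, here's a deliberately toy sketch of mine: a one-dimensional explicit heat-diffusion step over a row of cells. A real model would be three-dimensional, would handle material transfer as well as heat, and would use far better numerics; the point is only that a local perturbation propagates to its neighbors in a fully computable way.

    # Toy illustration: neighbor-to-neighbor heat propagation over a
    # 1-D row of cells via an explicit finite-difference step.
    def diffuse(temps, alpha=0.1, steps=100):
        """alpha <= 0.5 keeps the explicit scheme stable."""
        t = list(temps)
        for _ in range(steps):
            nxt = t[:]
            for i in range(1, len(t) - 1):
                nxt[i] = t[i] + alpha * (t[i-1] - 2*t[i] + t[i+1])
            t = nxt
        return t

    row = [0.0] * 21
    row[10] = 100.0   # an isolated change in solar energy input
    print([round(x, 1) for x in diffuse(row)])   # spreads to neighbors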

Getting the data is a matter of being willing to spend the money - the lack of surface differentiation across much of the globe, coupled with the availability of space-based sensors, makes that much less challenging or expensive than it might appear. The problem has been that the combination of processing power and storage needed has not been available at any price - but it soon will be. Both IBM's grid on a chip and Sun's SMP on a chip offer the potential to do this: directly, and in terms of the computation needed to reduce the data to the point that the required storage becomes practical.
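
For a sense of why storage was the sticking point, here's an illustrative estimate of my own - the resolution and per-cell variable count are assumptions chosen for scale, not figures from anywhere:

    # Illustrative only: storage for one state snapshot of the grid.
    cells = 7_644_000_000    # one-km^3 cells in the 15 km shell
    vars_per_cell = 8        # assumed: temperature, pressure, etc.
    bytes_per_var = 4        # 32-bit floats

    snapshot = cells * vars_per_cell * bytes_per_var
    print(snapshot / 1e12, "TB per snapshot")              # ~0.24 TB
    print(8760 * snapshot / 1e15, "PB per year, hourly")   # ~2.1 PB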

That threshold of feasibility probably won't be crossed this year, but it almost certainly will happen within three or four years.

(Note: I was wrong at the time to trust the data enough to conclude that some warming was occurring, and also wrong about the size of the problem: recent research on energy transfer from the solar wind suggests we need to go up another forty kilometers or so.)

At the time, however, my big concern in terms of actually doing this was that the technology needed to efficiently compress, store, and retrieve the data didn't exist.

But did you catch some of the numbers the Oracle team was bruiting about last Wednesday? 60 terabyte data warehouses? Single tables with three billion rows? Sub-second performance on Oracle database machines with multi-terabyte tables?
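
Set those figures against the toy estimate above (my numbers, remember, not theirs): a 60 TB warehouse would hold a couple of hundred uncompressed hourly snapshots, and three billion rows comes within a factor of three of one row per cell - the same order of magnitude, not fantasy.

    # Comparing the quoted Oracle figures to the toy estimate above.
    snapshot_tb = 0.245            # from the earlier sketch
    print(60 / snapshot_tb)        # ~245 snapshots per 60 TB warehouse
    print(3e9 / 7_644_000_000)     # ~0.39 rows per cell at 3e9 rows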

The bottom line on what they were saying seems simple: between CMT computing, flash, and ZFS, this job no longer seems out of reach on either practicality or cost grounds.

So why do it? Well, there's an obvious reason: the meltdown taking place in the global warming advocacy business, as one alarmist after another proves to have been lying and cheating his way to personal fame and fortune, tells us a lot about the people involved - but nothing at all about the reality or otherwise of the threats humanity faces from climate change.

Personally I'd bet on an incoming Maunder minimum (mini ice age) over another MWP (Medieval Warm Period) - but that's a bet, not a certainty, because the combination of theory and data needed to be sure simply doesn't exist.

The best bet, of course, is that nothing significant is happening: that the next hundred years won't be significantly hotter or colder than the last hundred. But, again, discrediting the people phonying up data - and if someone had told me even late last year that the IPCC would be caught selling WWF opinion pieces as refereed science, I'd never have believed them - tells us they have no evidence for their warnings, but says nothing about whether those warnings, in either direction, are right or wrong.

What we need, obviously, is good data on which to build, and against which to test, the theories we need to develop before we can understand the problem and make informed predictions.

This project will spin out all the data we need while making it possible to do next-hour, next-day, next-month weather prediction with no more theory than what's already known about energy transfer and material reaction to it. And that, as Martha would say, would be a very good thing - not to mention, of course, one fantastic opportunity for Oracle to do its stuff.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specializing in Unix and Unix-related management issues.