I've heard a lot of commentary recently about the lack of innovation in IT in general, and in database and OS development in particular. I disagree: I think we're seeing more innovation now than ever before.
This isn't true at the greeding edge, where salesmen meet managers: "SOA" (service-oriented architecture), for example, is about as real and innovative as reality television. But it is true where the lab meets production.
Looking at what's going on across information technology I see three great forces driving enormous innovation:
Sun's first "Niagara" hardware offers a 15x throughput improvement over their US3i, IBM's first cell based blades offer similar gains in floating point performance over their Power5. Both technologies call for revolutions in software, both rely on Unix, and both will ultimately change everything we think we know about deploying and managing organisational computing.
Imagine this big picture as a spinning globe and stab at a spot for a closer look. Pretty much anything you hit will show the effects of these forces. Look at database technology, for example, and what you'll see is enormous change. At one end, hundreds of groups are nibbling away at applications of existing technologies: bridging the gap, for example, between storage and retrieval technologies like those embodied in MySQL and presentation technologies like OpenGL and PostScript. At the same time, research ideas like UCB's backup-free temporal Postgres implementation of the early nineties are coming into commercial focus as big companies like IBM, Oracle, and Sun explore the cost and management implications of multi-terabyte storage needs, a transition that SQL, as we know it today, won't survive.
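The Berkeley Postgres research mentioned above was built around a no-overwrite storage manager: updates append new tuple versions instead of destroying old ones, so any past state can be queried "as of" an earlier time, and conventional backups become largely unnecessary. A toy Python sketch of that idea (the class and its API are illustrative inventions, not Postgres code):

```python
import bisect

class TemporalStore:
    """Toy append-only key/value store in the spirit of Berkeley
    Postgres's no-overwrite storage: writes never destroy old
    versions, so past states remain queryable."""

    def __init__(self):
        # key -> list of (timestamp, value), appended in time order
        self._history = {}

    def put(self, key, value, ts):
        """Record a new version of key at time ts (never overwrites)."""
        self._history.setdefault(key, []).append((ts, value))

    def get(self, key, as_of=None):
        """Return the current value, or the value as of an earlier time."""
        versions = self._history.get(key, [])
        if not versions:
            return None
        if as_of is None:
            return versions[-1][1]
        # Latest version written at or before as_of.
        i = bisect.bisect_right([t for t, _ in versions], as_of)
        return versions[i - 1][1] if i else None

store = TemporalStore()
store.put("balance", 100, ts=10)
store.put("balance", 250, ts=20)
store.get("balance")            # current value: 250
store.get("balance", as_of=15)  # time-travel query: 100
```

Because nothing is ever overwritten, "restoring" an old state is just a read with an earlier timestamp, which is why the design is described as backup-free.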
Very little else will survive unscathed either. Personally, I see the good guys winning, in technology and across the globe, but however you value these changes, it's probably safe to say that the next ten years will make the last twenty-five seem like a period of great stability.