Last week's post on the increasing futility of fighting the PC security wars on current terms drew a comment from a discussion contributor I don't remember hearing from before. Here's part of what "Gary" had to say:
The problem is that we're fighting software with software. We're missing out on a huge advantage as defenders. Unlike the remote attackers, we have physical access to our computers. So include a small switch which prevents ALL writes to the code area. And when legitimate updates come along (ones that give us new functionality, not just plug buffer overruns), we boot into a form of safe mode by holding the switch down. This would (a) allow writes to the code area and (b) boot off a utility that does nothing more than apply the update.
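As a software analogy only - the class and method names below are invented for illustration, and a real implementation would of course live in hardware and firmware, not Python - Gary's switch amounts to a write gate on the code area that no remote input can open:

```python
class CodeStore:
    """Sketch of a code area guarded by a physical write-enable switch.

    The switch is modeled as a flag that only a physically present
    operator can flip; remote attackers have no path to it.
    """

    def __init__(self):
        self._switch_held = False       # the physical switch, normally off
        self._code = {"kernel": "v1"}   # the protected code area

    def hold_switch(self, held: bool):
        # In Gary's scheme this is the operator holding the switch down
        # while booting into the minimal update-only safe mode.
        self._switch_held = held

    def write(self, name: str, image: str):
        # ALL writes to the code area are refused unless the switch is held.
        if not self._switch_held:
            raise PermissionError("code area is read-only: switch not held")
        self._code[name] = image

    def read(self, name: str) -> str:
        return self._code[name]


if __name__ == "__main__":
    store = CodeStore()
    try:
        store.write("kernel", "v2")     # a "remote" write attempt
    except PermissionError as e:
        print(e)                        # refused: switch not held
    store.hold_switch(True)             # operator applies a legitimate update
    store.write("kernel", "v2")
    print(store.read("kernel"))
```

The point of the sketch is only that the gate is opened by physical presence rather than by any credential a remote attacker could steal or guess.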
While I think such an approach might have value in reducing both user carelessness and susceptibility to drive-by internet attacks, it's really only psychologically different from the standard Unix differentiation between the root and other users.
That psychological difference is, however, important. Remember what happened with the nineties version of appliance computing? People bought all kinds of single purpose devices whose reliability was essentially guaranteed by the vendor's refusal to give up the root passwords for the Linux or BSD variant running the application. That, of course, caused all kinds of angst among IT professionals who wanted to admin the boxes - and I can testify from personal experience that telling them that the secret to Unix reliability is not to let anyone change anything on a working system wasn't generally considered either tactful or career enhancing.
That it was true was shown, of course, when the people who wanted access brought in their own Linux or BSD boxes, loaded the same applications - and promptly demonstrated both enormous satisfaction with the result and reliability levels on par with everything else they were responsible for running.
This wasn't the first time that had happened - much of data processing's 70s and early 80s horror and outrage at seeing appliance computing succeed on the revenue-generating side of the businesses they were supposed to serve arose directly or indirectly because those things were beyond their control. They wanted to do their jobs - which they saw as select, customise, control, and direct IT - and a departmental manager who bought a job shop scheduling application from Gould or a word processing system from Wang just wasn't playing by their rules. Rules that, in the end, they were able to enforce - briefly making IBM's DisplayWrite/370 mainframe word processing package a corporate best seller.
In between various people tried other appliance computing constructs - here's a bit from Tom Clancy's Patriot Games describing one such offering:
"I came over to look at some signal digests. Admiralty signals between London and Admiral Sir James Somerville. He was commander of your Indian Ocean fleet in the early months of 1942, and that's one of the things I'm writing about. So I spend the next three hours reading over faded carbon copies of naval dispatches and taking notes."
"On this?" Ashley held up Ryan's clipboard. Jack snatched it from his hands.
"Thank God!" Ryan exclaimed. "I was sure it got lost." He opened it and set it up on the bedstand, then typed in some instructions. "Ha! It still works!"
"What exactly is that thing?" Ashley wanted to know. All three got out of their chairs to look at it.
"This is my baby." Ryan grinned. On opening the clipboard he revealed a typewriter-style keyboard and a yellow Liquid Crystal Diode display. Outwardly it looked like an expensive clipboard, about an inch thick and bound in leather. "It's a Cambridge Datamaster Model-C Field Computer. A friend of mine makes them. It has an MC-68000 microprocessor, and two megabytes of bubble memory."
"Care to translate that?" Taylor asked.
"Sorry. It's a portable computer. The microprocessor is what does the actual work. Two megabytes means that the memory stores up to two million characters -- enough for a whole book -- and since it uses bubble memory, you don't lose the information when you switch it off. A guy I went to school with set up a company to make these little darlings. He hit on me for some start-up capital. I use an Apple at home, this one's just for carrying around."
"We knew it was some sort of computer, but our chaps couldn't make it work," Ashley said.
"Security device. The first time you use it, you input your user's code and activate the lockout. Afterward, unless you type in the code, it doesn't work -- period."
"Indeed?" Ashley observed. "How foolproof?"
"You'd have to ask Fred. Maybe you could read the data right off the bubble chips. I don't know how computers work. I just use 'em," Ryan explained. "Anyway, here are my notes."
A quick google didn't turn up an actual machine by that name, but even if Clancy's wasn't real, people have tried to make and sell similar products - usually with minimal success.
Appliance computing delivered during the seventies, but was killed in control battles won by data processing. During the mid to late eighties people tried to sell personal appliances like Clancy's clipboard - and got nowhere. A re-invented appliance computing paradigm again delivered value during the mid to late nineties, but was killed in control battles by the then merging Windows and data processing cultures.
So what happens next time? As playphones become more powerful and laptops go the way of the dodo, what will the battle for control over the iPhone and its successors look like? It's an old battle: people like me, and possibly "Gary", want things that work - things that can be trusted to do one thing, do it well, and do absolutely nothing else - but getting that in place contradicts a lot of established agendas and attempts to do it have a long record of short term success followed by conflict and failure.