Using Linux at Home: #6

Lots of people have told you that Linux is cheaper than Microsoft's Windows brand systems; this column explores a different question: what are the risks and consequences of using it at home?

In today's installment I'll look at how to convert an old PC, a few bucks, and the urge to experiment with a non-Microsoft solution into something you can show the neighbors.

If you happen to be Bill Gates, just blow $45K on a twentieth-anniversary Sun workstation (hey, he should have something in the house that works). If you're not, you may be looking at that two-year-old Windows Millennium Edition machine and wondering where to start.

Running Linux at home isn't so much about having more bits or megahertz than the next guy as it is about making the computer useful. To do that you need to go beyond the "ok, it works" stage, to think a bit about how you use it and whether you want it to work tomorrow too.

Before you start putting a lot of effort into resurrecting a PC that's too old to run Windows/XP effectively, think for a minute about what the continuing decline in hardware costs means to you. That clunker may have cost you thousands, but you can get a 1.1GHz machine with 128MB of memory, a 17-inch CRT, and Linux pre-installed for all of $349.95 at Walmart. Fixing up an older machine may cost more than that - and leave you with less.

In particular, if your older PC has a bunch of non-standard peripherals you may have to reconcile yourself to doing without them. Many low-end PCs and related products don't meet the official standard for things like USB or IEEE 1394 ports. Furthermore, older integrated modems and non-standard video cards are often dependent on idiosyncratic BIOS setup changes that will cause trouble for Linux.

If your system can run these using Windows 2000, Linux will probably run them too; but if your system can only use them under one of the Windows variants that runs as an MS-DOS application (Windows 95/98/ME), Linux may reasonably not run them.

Similarly, if you've got an extensively futzed-with master/slave disk setup or a motherboard that's had the benefit of some guru's attention, be prepared for trouble. This column is about using what you have, but bear in mind that just buying a ready-to-rumble package may simply make more sense.

On the other hand, when you run Microsoft Windows your dominant hardware concerns are megahertz and megabytes, but that's not true with Linux. Linux will run on an Intel 386 with 8MB of memory - it's not exactly recommended, but it will walk. In practice, once you get into the higher-end Pentium II from 1998 and later, the machine will spend most of its time waiting for you and prove more than fast enough for most of the things people do with home computers.

That changes, of course, if you decide to spend your time morphing video scenes - then you can't get a PC that's fast enough and you'll find yourself eyeing stuff like Apple's X-serve or Sun's new V880Z. For normal people, though, most of the time it's not about megahertz or megabytes.

The things to focus on when considering a Linux machine for home use are:

  1. screen size and quality;
  2. whether to buy a Linux CD and, if so, which one;
  3. whether things like more memory might be worthwhile;
  4. ensuring continuity through things like backup and a practiced recovery method; and,
  5. whether or not you need a UPS.

The Screen

Of these, screen size is by far the most important. Unix does something that's completely alien to The Windows Way: it truly encourages multi-tasking. This isn't a matter of hot-tabbing between applications; it's true concurrent use of several tools and reflects a very fundamental difference in philosophy. In the Microsoft world applications are monolithic: a function either exists in the application, or it doesn't. In Unix, users who want to learn how can combine applications to make them do, together, things that the individual designers never dreamt of.

Beginning Linux users often fail to discover this level of system functionality and value because they start with relatively small screens - the 15 to 17 inch stuff that works for Microsoft Windows.

People with Unix expertise can do these things on dumb terminals from the seventies, but the one thing that makes it easier for ordinary people to learn to use advanced facilities like shell pipelining is screen real estate. If you can see what all the pieces are doing, then combining them to get a specific result looks fairly natural; but, take away that ability to see, and learning to do it requires a degree of abstract thinking that most people are never going to be motivated to undertake.
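A small taste of what that combining looks like in practice. The pipeline below chains three standard tools, none of which knows anything about the others, to produce a word-frequency count; the fruit names are just sample input.

```shell
# Three standard tools chained with pipes: 'sort' groups identical
# lines together, 'uniq -c' counts each group, and 'sort -rn' puts
# the most frequent group first.
printf 'apple\nbanana\napple\ncherry\napple\n' | sort | uniq -c | sort -rn
```

None of the three programs was designed with this job in mind; the pipe is what makes the combination possible.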

If you think about computing in terms of your experience with Microsoft's Windows products this will contradict what you know to be true. Bear in mind, however, that the Microsoft one-size-fits-all application model isn't the only possibility. Consider, for a moment, what it might mean in terms of user interface development that Unix workstations from Sun, AT&T, and Apollo, among others, offered 1152 x 900 pixels on 21-inch color screens when MS-DOS was just learning to use 240 x 328 grey scale dots on a 13-inch screen.

The bottom line here is simple: you'll never learn to use what you can't see; so get the biggest, brightest, clearest monitor you can afford. Note that it is screen real estate that counts; bigger may not be better, but it is generally more useful. Having a graphics board that handles 32-bit color is nice, but 8-bit color on a 19-inch screen running at 1280 x 1024 pixels is much better than 32-bit color on a 15-inch screen at 800 x 600. If your budget is limited, get bigger first, better later.
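On the XFree86 servers that Linux distributions of this vintage ship with, the resolution list lives in the "Screen" section of the X configuration file - commonly /etc/X11/XF86Config-4, though the path and the identifier names vary by distribution. A sketch, with placeholder identifiers:

```
Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    Monitor    "Monitor0"
    DefaultDepth 8
    SubSection "Display"
        Depth  8
        Modes  "1280x1024" "1024x768"
    EndSubSection
EndSection
```

The first mode listed is what the server starts in; the installers discussed below normally write this file for you, so treat hand-editing as a fallback.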

If you don't already have the screen, this is the place to spend your money. The exact amount will depend on you and how much you're willing to shop around but remember two things:

  1. used monitors and cards, if they work at all in your machine, will probably run as well and as long as new stuff; and,

  2. Dell's upgrade cost from a 15" junker with an 8MB graphics card to a 19" Trinitron with a 32MB card is about $300, so that should be about the outside limit for this cost including installation and testing from a local dealer.

A boot CD

If your monitor is adequate, or replacing it is beyond your budget, then the next place to think about spending a few bucks is on the Linux CD. You don't need one of these - you can borrow one (no license issues!) or load Linux from the net if you want to - but it's a serious convenience that's usually well worth the money.

How much money depends on you. If you want the latest release from a company like SuSE or Red Hat, it'll run you around $50 including a manual and installation guide - or around $99 if you want StarOffice and the Microsoft compatibility suite pre-installed and configured ready for use.

On the other hand, each time a new release comes out, old ones usually go on sale in the "under $25" range; not only are they not that different, but you can update your system on-line after doing the installation.

Patches are patches, not upgrades
In the Unix world, including Linux, patches - program fixes - are issued as soon as they're tested and available. Most do not require system reboots; very few require any kind of post-installation application or change testing.

A Linux patch is not a Windows service pack! As a good administrator you need to get and install these on a monthly basis - and you can do it without worrying about them bringing your system down or your applications to a screaming halt.

Which distribution to get is a dicey issue. There are lots of reviewers with strong opinions; check out a few of the Linux resource sites to decide which you'd prefer. On a personal basis, I tend to prefer the default setup on Caldera (now SCO) Linux but recommend Red Hat to technically inclined friends - and Xandros to others - not because these are the best, but because the automatic installers seem to get them into the least trouble.

A big part of the issue here is how far you want to go with the conversion away from Windows. The Xandros distribution, for example, costs $99 but is carefully tailored to make it as easy as possible to continue using Microsoft applications like Office or Quicken in your new Linux environment. To an expert the reduction in set-up hassle isn't worth the extra bucks - but if you're not a Linux expert this kind of pre-packaging can make the difference between being happy with your Linux decision and going back to Microsoft.

Memory and CPU

The next place to look at spending bucks is memory. Linux will run in 32MB and rejoice in 64MB, but here too more is better. Depending on what your PC uses for memory this may be tricky; there are lots of different types, so check carefully before plugging in something that may not work right.

Your hard disk and CPU are the most expensive to change, but almost anything will do, including any Pentium II or later, any Mac desktop (and most PowerBooks) with a PowerPC chipset, or any SPARC machine with more than one 200MB disk. (You can strip Linux to about 60MB, but you'll lose the GUIs and most of the hotter tools and applications.) Again, bigger and faster are usually better, but from a learning perspective there's little to gain once you get above something like 128MB of RAM on an 85MHz SuperSPARC or a 450MHz PIII. Above that, most of the time the machine will be waiting for you, not the other way around.

That changes, of course, as you learn to use the system. By the time you're writing your own Java servlets you'll want your own Starfire - with 100 SMP CPUs and 200GB of RAM; meanwhile, however, a bit of older hardware will go a long way.

Backup and Recovery

Once your system is running you need to go the next step. A lot of people don't; they'll say that, for them, "running" is good enough, but it's not true. To have something you can be proud of you need to understand that hardware failure is inevitable and have plans in place to deal with it when it happens.

More editorializing (sorry, force of habit)
You hear quite a lot of people saying that things like backups aren't critical if you are just using your machine to learn about the technology. To them, the inevitable disaster is just part of the learning experience.

There's a word for that and it describes a heap of smelly stuff that comes out the back end of a bull: backup and recovery are key systems functions that need to be included in your learning experience, and in your budget.

That means making backups and practicing recovery. A lot of people, including many professionals, do backups but don't practice recovery, and so inevitably find they can't do it when they have to. Don't get caught; practice.

How you actually do backups, and so recoveries, will vary with the gear you have and the software you decide to use (several good choices come with the system). What you pick isn't a big issue; it all works. What does count is that you don't wimp out: at a minimum you should plan your backups to support two levels of recovery - a full system rebuild and the restoration of your personal files.

You probably don't need anything fancy; if this is a basic learn-by-doing system, you can keep a paper record of significant changes so you can rebuild from the boot CD and then back up your personal files to floppies. Dumb and slow, of course, but also dirt cheap and perfectly reasonable; remember, there are no licensing or machine ID issues to worry about, so you need only save your files, not applications.
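The floppy (or tape, or removable disk) is just a destination; the mechanics are the same whatever you write to. A minimal sketch using the standard tar tool - the paths here are examples, not requirements:

```shell
# Archive everything under a home directory into one compressed
# file; /home/you and /tmp/backup.tar.gz are example names.
tar -czf /tmp/backup.tar.gz -C /home/you .

# List the archive's contents to confirm the backup is readable -
# a backup you never read back is a backup you don't have.
tar -tzf /tmp/backup.tar.gz
```

From there the archive can be split onto floppies, written to tape, or copied to a remote host; the point is that one command captures everything you'd need after a rebuild.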

On the other hand you could decide how you want to do recovery and then blow the last of your budget on a tape drive, removable disk, or remote hosting service to accommodate it. Just make sure of three things:

  1. That you run drills.

    Practice recovery when it doesn't count, and it'll be duck soup when it does count.

  2. That you don't invest in something, like a tape drive, that works well on your system and not at all on a current generation replacement system.

  3. That you record major systems changes: passwords, account keys, etc., on paper, and keep that paper off site.
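A drill doesn't have to be elaborate: restore into a scratch directory and compare against the live copy. A sketch, again with example paths:

```shell
# Restore the backup archive into a scratch directory, then
# compare it against the live files; 'diff -r' reports any
# file that is missing or differs.
mkdir -p /tmp/restore-drill
tar -xzf /tmp/backup.tar.gz -C /tmp/restore-drill
diff -r /home/you /tmp/restore-drill && echo "drill passed"
```

If the diff is clean, you've proven both that the backup captured everything and that you know the restore procedure cold.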

A UPS

Whether you need a full uninterruptible power supply (UPS) or not depends on how stable the power is in your area and whether or not the machine is being used for something important. For the typical learn-by-playing machine, your basic $29.95 power filter will do nicely. Start using that machine to run the heater, water supply, and lights in your personal hydroponics garden, and a UPS with enough battery power to get past 95% or more of outages starts to look like a good idea.

One important consideration here arises because Linux, like any Unix, can support multiple file systems, all of which have different recovery characteristics. Since default Linux uses any free memory that happens to be lying around as file buffer space, a power failure can cause significant file damage: parts of a file still in memory are lost when the crash happens.

Newer file systems are often "journaled" - meaning that changes to files, even the parts in memory, are tracked to make boot-time file system recovery fast and easy. The problem is that journal files sometimes get damaged too, leaving you with lots of work to do. The older ext2 file system doesn't do journaling but comes with a repair utility called "fsck" that runs automatically at boot time to repair what it can. Generally this suffices, so my advice on this is simple: stick with the older file systems or buy a good quality UPS.
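To see which file systems your machine is actually using, read the kernel's mount table; the exact type names you'll see depend on your distribution and kernel.

```shell
# /proc/mounts lists, for each mount: device, mount point, and
# file system type. Journaled types show up with names like
# ext3, reiserfs, xfs, or jfs; plain ext2 is the older,
# fsck-repaired kind discussed above.
awk '{ print $2 "\t" $3 }' /proc/mounts
```

Checking this before a power failure, rather than after, is the whole point.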


So what will you have at the end of all this? A fully functioning system, not just a flaky word processor or games console, but a real computer center with applications, policies, procedures, and controls in place. And is that cool? Oh yeah!

back to number one (is Linux for geeks alone?)