
Stallman turned the world of software development upside down.

  Almost 25 years later, the free software movement is a major force in software. Many of the tools that run the Internet are free software.

  The GNU tools have also enabled a new operating system, Linux (or, as Stallman refers to it, GNU/Linux). The Linux kernel would have been both impossible and useless without GNU, because it was inspired by and developed with GNU free software tools. And, without the GNU tools to run, the Linux kernel would be no more than an academic curiosity. The Linux kernel is the core system code that makes Linux work, but the GNU tools are what make having a Linux system worthwhile.

  Why Free Software Really Works

  Stallman's vision, though it seemed completely crazy, has revolutionized software engineering.

  Free software projects, unlike their proprietary counterparts, continue to improve over generations of developers.

  The efforts of a proprietary software company necessarily stop with the end of that company, or that product line. No matter how many loyal fans there may be, no matter how many users might depend on a piece of software, if the corporation that owns it decides to drop it, or if it goes out of business, then that software package is dead. No one can maintain it. No one can expand it. No one can port it to new computer systems.

  With free software, good software becomes immortal.

  The result is that useful, free software products tend to get better over time, and there are many software packages that have had several generations of developers, each taking over from the one before, without starting from scratch or throwing away any useful work.

  Stallman has often pointed out that free software does not have to be developed for free, or given away for free. But the free software model does change the economic model of the software industry.

  With proprietary software, you are locked into one company and need to pay whatever that company asks, or else abandon the software. With free software, that blackmail can't take place.

  Plenty of companies can still make money.

  IBM is one of the biggest contributors of free software to Linux, obviously not from idealistic motives. Free software allows the company to generate more profits.

  IBM's free software success has demonstrated that free software is not anti-capitalist; it is yet another way in which the free market provides solutions.

  The Ways of Genius

  It's customary for geniuses to be eccentric, and Stallman is no exception. He has numerous quirks and causes, many of which can be found on his personal Web page (www.stallman.org).

  Other programmers report that Stallman is extremely difficult to work with. He has been called a "control freak" who doesn't play well with others.

  Although his reputation for eccentricity may be well-deserved, it does not detract from his accomplishments and contributions to the field of software engineering. As one of the most famous programmers alive today, he is entitled to his quirks.

  Is Stallman a genius or a nutcase? Perhaps both. But he has changed the world forever.

  Hang 'Em High With TeX

  In 1962, the Beatles released their first record, and a young computer scientist named Donald Knuth started writing the first in a series of books called The Art of Computer Programming.

  Knuth's books had an immediate and widespread impact. They became some of the most important texts in computer science, taking a rigorous, mathematical approach to describing computer algorithms.

  What Knuth originally intended to be one book has expanded into at least four, and the series continues, 45 years later. Volume two was released in 1969, volume three in 1973, and the first installment of volume four in 2005.

  Knuth labors on at Stanford University, and the entire computer science community eagerly awaits his releases of new material. He's a sort of J. K. Rowling of the computer science world. This article, however, is not about Knuth's famous books, but about something that happened to Knuth along the way.

  An Historical Inconvenience

  In 1976, when Ford was president and the Viking 2 spacecraft landed on Mars, Knuth was trying to get out a new edition of volume two of The Art of Computer Programming.

  At the time, most word processing was done with paper and pencil, or, for the more advanced, typewriters. If a document needed to be created for distribution, it was done so by a specialist typing these manuscripts into a typesetting machine, a giant computer system about the size of a small car, which cost thousands of dollars a month for the organization to maintain.

  The typesetter would type in the manuscript and then print the output, called the proofs, and take it back to the author of the document. The author would then edit the document (by writing on it with a pen), giving the changes back to the typesetter, who would then type them back into the typesetting system.

  Naturally, this took time, and mistakes crept into the process. It was a painful process.

  When he planned his reprint, Knuth was horrified to learn that the typesetter used for volume two was no longer around. It had been a proprietary system, as were all typesetting systems, and the company had folded, leaving Knuth with the prospect of redoing the whole process from scratch.

  He was frustrated.

  The Personality of the Programmer

  Are you, or is any member of your family, a computer programmer? If so, you will notice certain personality characteristics.

  Programmers will often obsess about minutiae. Lots of other people were confronted with an illogical and inconvenient system for printing books. Every author right up until Knuth had to face these problems. No doubt with some grousing and complaining, they managed to deal with whatever they had to deal with to get their book in print.

  Any other author, confronted with the problem of redoing the typesetting of a book, would either get to work or abandon the whole idea. Not Knuth. He has the soul of a programmer, and he took the usual programmer path: writing software that can do it better.

  He developed a truly modern word processing and typesetting system that is still in use today. It's called TeX, usually spelled with both the first and last letters capitalized. And it's not pronounced like the first syllable in "Texas." It's pronounced like the first syllable in "technology."

  TeX Today

  TeX lives on, and has got to be one of the most stable and bug-free software tools in widespread use today. Knuth no longer changes TeX, except to fix any reported bugs. These are few, however, because of Knuth's programming skill, and because he has long offered a reward of $2.56 (one "hexadecimal dollar," or 2^8 cents) for each verified bug.

  Other than bug fixes, TeX is frozen. No new features will be added.

  However, that doesn't mean TeX development is at an end. It is freeware, and Knuth encourages the development of other variants and versions of the core technology. One prominent example of this is LaTeX, a layer of software that sits atop TeX and makes producing mathematical, engineering and scientific documents easier.
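
  To give a flavor of the layer LaTeX adds, here is a minimal, hypothetical document. The author writes markup describing structure and mathematics, and the TeX engine underneath handles the actual typesetting:

```latex
\documentclass{article}
\begin{document}
\section{A Sample Theorem}
The roots of $ax^2 + bx + c = 0$ are given by
\[
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.
\]
\end{document}
```

  Run through a tool such as pdflatex, this produces a fully typeset page; the author never positions a single glyph by hand.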

  He asks only that, whatever you do, you don't call it "TeX." As with his seminal series of computer science texts, TeX will always be associated with him alone.

  TeX to Texinfo

  The TeX engine is used by the Free Software Foundation—influential developers, maintainers, and distributors of the most important tools in Linux: the compilers and system tools that put the UNIX in Linux.

  Texinfo (texinfo.org) is a program that offers a simple way of encoding typesetting information in a text file. It allows you to produce a document in a variety of formats from the same text source, so that you can write one manual and have it available as a Web page, PDF file, plain text file, and PostScript file, as well as the info file used by GNU Emacs.
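
  A minimal Texinfo source file, sketched here for illustration, shows how little markup this takes:

```texinfo
\input texinfo
@settitle Sample Manual

@node Top
@top Sample Manual

One source file can become an Info page, a Web page, a PDF,
or a plain text file.

@bye
```

  From a single file like this, the makeinfo command generates the info and HTML versions, while texi2pdf runs the TeX engine to produce the printable output.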

  Since it's the official typesetter of the GNU project, it is well supported on all Linux systems, and it is used for all GNU documentation. A quick Web search for your favorite GNU tool will show the output of the texinfo system, and if you download a printable copy you can see the typesetting done by TeX.

  The Triumph of TeX

  Knuth wrote TeX because of his own personal frustrations dealing with typesetters in the production of his series of books. Instead of just doing the smart thing and accepting the limitations of text processing systems, he invented his own system—and it was great enough that it still forms the basis of many typesetting systems today.

  In typical programmer fashion, Knuth estimated that one year would be required to solve the typesetting problem. In fact, it took him 10. It's nice to know that even a genius can get the schedule wrong.

  While his schedule may have been inaccurate, his software is nearly perfect, and its influence has been profound over the last several decades.

  Curses, Foiled Again!

  In January of 1984, two things happened that would change the course of computing forever.

  One of them was the release of the Macintosh, the first real consumer computer with a graphical user interface (GUI), a mouse, and bitmapped graphics as a standard. Each of these technologies had existed in other computers, but the Macintosh united them in a single consumer package, and made computing history.

  It was the bitmapped graphics that made the Mac so pretty. With a screen resolution that was far greater than any PC of that era, it presented stunning graphics, which rapidly made the Mac the computer of choice for publishers, graphic artists, and those who found the command-line interface to be more than a little confusing.

  When Apple released the Macintosh, it launched the Age of the GUI.

  Apple’s GUI did more than provide cute little icons and windows: It changed the way computers are used forever. Today, there is not one general-purpose computer that comes without a GUI. Even UNIX, always the stronghold of software conservatism, is now so GUI oriented that many Linux users never even learn to use the command line.

  But like every important technology, the GUI didn’t just create opportunity—it also destroyed technologies.

  The invention of the automobile destroyed a thriving horse-and-buggy industry, and the invention of the electric light bulb destroyed a huge gas-lamp business. Similarly, the dominance of the GUI came at the expense of another fledgling technology.

  It was a technology that could draw a screen for interaction with the user, and allow the user to move the cursor around on the screen, select text and, in short, do much of what the GUI can now do.

  These days we call it a textual user interface, but back then it was known as “curses.”

  A Terminal for Every Season

  Long before the GUI, in the era of disco music and 8-track tapes, the so-called “smart” terminals had replaced the clanking teletypes that output everything on sheets of computer paper. The computer user interface was just being born.

  At first, all that was sought was a way to reproduce what had previously been done with punch cards—that is, the input and display of 80-character text strings, with no formatting. The first modern terminals had little chunky green numbers and letters. Nobody complained about the font; we were all happy to read the screen instead of looking at holes punched in a card.

  Back then, no one even knew what a font was. If you’d asked, I would have guessed some sort of snack food.

  Soon, however, the interactive nature of the so-called glass teletypes started to assert itself. Programmers started playing with menu and help text that popped up, and then went away. The text editor vi appeared, which allowed the user to drive the cursor around the screen, and modern word processing began in earnest.

  Some terminal manufacturers even started experimenting with colored text, underlined text, and even blinking text. We’d come a long way from the plain old punch card!

  In those heady days there were dozens of different terminal types, with different capabilities and qualities. Soon, programmers had more terminal options than they knew how to make use of.

  Unfortunately, these different types of terminals all spoke different languages. While it was possible to make software work on any one or two terminals, it was extremely difficult to manage more than that. So software written by one batch of scientists couldn’t be shared with any other batch of scientists, because they were using different terminals.

  The answer, of course, was to write a software library that would present a common interface to everybody—one that would know the details of every different type of computer terminal.

  With such a library (which was called “curses”), programmers could write their program to talk to the common interface, and it would work on any terminal that curses knew about. There was even a way for users to add information about a brand-new smart terminal, so that they could start using it, perhaps before the curses programming team had even heard of it. Thus, a brand-new terminal could be put into almost instant use.

  Curses was written as part of the Berkeley version of UNIX in the early ’80s. It was wildly successful, and can still be found on every UNIX system, as part of the archaeological strata of the UNIX operating system.

  The GUI doomed curses to obscurity, but that doesn’t diminish the technical achievement of a team of early programmers, struggling against a hostile universe of hardware incompatibility.

  The Other Thing That Happened in January 1984

  January of 1984 was a fruitful month for computing technology. Not only did Apple introduce the Macintosh to much publicity and fanfare, but (with no publicity and zero fanfare) Richard Stallman announced the GNU project.

  GNU sought to produce a free software version of UNIX. The GNU project launched the free software movement, which led to Linux, and so much more.

  More than 20 years later, we can look back on the influence of the Macintosh and GNU, and wonder what current technology is about to be made obsolete.

  Introduction to Linux

  The ComputorEdge readership were early adopters of fun new technology. But Linux has a somewhat scary reputation as being hard to install and configure. While this was certainly true historically, many new tools make installing Linux a breeze these days. Some ComputorEdge columns were aimed at computer users thinking of taking the plunge. As I'm a huge Linux fan, I had no trouble coming up with reasons to switch.

  Helping Windows users switch to Linux is greatly assisted by Cygwin, a terrific port of many Linux tools to the Windows platform. It’s easy to install, and co-exists harmlessly with Windows on your system. But all the Linux tools and programs are there for you. It’s a great way to learn Linux without getting rid of Windows.

  ComputorEdge was certainly the most light-hearted magazine I wrote articles for. Other technical magazines would need to look up humor on the Internet, but Jack Dunning, the editor of ComputorEdge, let me try to have some fun. But the article I thought most funny got a cold reception from the readership. A parody of a very popular TV relationship drama, it examines the emotional aspects of switching to Linux.

  To be a successful computer user, you must be constantly learning. Some columns were about the Linux documentation tools and how to use them effectively to learn more about Linux. Once ready to install Linux (other than Cygwin), a new user faces the choice of which Linux distribution to use, and there are a plethora of choices.

  Getting Started With Linux

  Linux is really one of the best deals ever. You get millions of dollars worth of software, absolutely free. How can you beat it?

  The hard part, sometimes, is just getting started. How does an ordinary user begin?

  In this series of columns, I will take you from zero to 60 with Linux. By the time we are done, you won't be an expert, but you will be well on your way to using the best operating system ever developed.

  The first task is to get Linux on one of your machines. You can install it on an old machine, install it on top of Windows with Cygwin, or just buy yourself a Linux computer.

  Put Linux on an Old Machine

  Give me your tired CPUs, your poorly upgraded,

  Your small-disked systems yearning to breathe free,

  The wretched refuse of your Windows network.

  Send these, the RAMless, documents lost, to me,

  I lift my lamp beside the golden door!

  These days there are hundreds of thousands of old PCs around that are no longer capable of running the latest versions of the operating system from Redmond, Washington. These machines were hot in their time, but like an aging Hollywood starlet, they languish while everyone chases the younger and newer talent. Today's fickle and demanding Windows user has no interest in these machines!

  Lots of people have old computers just sitting in the basement (I've got at least three). They are fully functional computers, with five- or 10-year-old processors, hundreds of megabytes of RAM, and tens of gigabytes of disk space. If you could send one of these computers back in time about 20 years, it would be the most powerful computer in the world. If you could send it back to 1940, they would give it to the atom-bomb scientists, who would have kept it in use 24 hours a day, 7 days a week, running calculations that would vastly contribute toward the war effort.

  These days, we can't even be bothered to keep it powered up. Such are the casualties of Moore's Law!

  These computers just need an operating system upgrade to be turned into useful machines, suitable for Web browsing, office tasks (such as word processing and spreadsheets), and the other tasks of your average Windows computer. The documents you get will be compatible with Windows; your Web browser will look and work just the way it does on Windows.

  Unless you use Internet Explorer. In which case, stop now! Please.

  The great thing about one of these castoff machines is that they are completely expendable. Just cheerfully reformat the disks, and away you go. (But do check with the wife to make sure the computer isn't the only one holding photos of the kids for the year 2002. Don't just assume that because the computer is in the back of the garage, she has nothing on the disk. Don't just blithely reformat the disk without backing it up somewhere.)

  There are lots of different distributions of Linux, but you won't go wrong with any of the big favorites, such as Ubuntu or Fedora. You can download CD images, or pay a few bucks to get a set of Linux install CDs, or ask around at work and someone will have a set.

  Old laptops also make great Linux machines. I recently converted my old Windows laptop, a Toshiba Satellite. With Windows it was slow and clunky; with Linux it is fast and wonderful!

  Use Cygwin on a Windows Machine

  For those who can't bear to boot a machine without Windows, there is the Cygwin option. Cygwin is a port of Linux to Windows. That is, Linux runs right on top of Windows. Just go to the Cygwin Web site (www.cygwin.com) and click the "Install Cygwin now" button to begin the Cygwin installation.

  This is not the most efficient way to run your hardware but, in general, any machine that can run Windows will be able to handle Cygwin.

  With Cygwin, you get a full set of Linux tools, with the same old Windows desktop and tools, side-by-side.

  This is a great choice if you have recently invested in new Windows hardware, or need to run some Windows applications as part of your work, or if you want a game system that can do Linux on the side.

  The Cygwin distribution is not a cut-down, limited version of Linux—it is the whole enchilada, ported to Windows. There's (almost) nothing that a full Linux box can do that can't also be done on a Cygwin box. You can even allow others to log into the machine remotely, and get a full Linux command line and windowing system.

  And with a Cygwin-based machine, Windows is still in the driver's seat when you need it to be.

  I don't like Cygwin machines, but that is not Cygwin's fault. The problem is that they still are running Windows, and that is taking a lot of resources. Any machine that can run Cygwin at a reasonable speed would be a real screamer with just Linux installed. So why not just cut out the middleman and take all the advantages of Linux?

  Alas, sometimes you can't. If you're imprisoned in Windows, Cygwin at least allows you to use decent tools.

  Buy a Linux Machine

  I'm a big fan of doing things the easy way, especially when it comes to computers. And you can't get any easier than just buying a Linux machine off the shelf. In many ways, this is the best and easiest alternative.

  Are you looking for a new machine? Many Windows users find themselves doing this every few years.

  If so, give serious consideration to buying a Linux box.

  Since it does much more with much less, a new Linux machine can be a lot cheaper than a new Windows machine, yet still do the same job just as well. Buying well behind the curve, Linux users can reap the benefit of mass-produced, commodity hardware, and let Windows users bear the cost of getting the cutting-edge stuff.

  And next year, when that cutting-edge stuff has dropped to half the price, you can always add it if you need it. But you will probably find that a Linux machine remains useful just as it is for years and years. After all, why should a machine not keep working year after year, with reasonable performance? This is not much to ask, but computer users have been conditioned to think it's impossible.

  Not only is it possible, it is now even easy.

  More and more desktops and laptops are being offered with Linux installed. Just take it out of the box, plug it in and turn it on.

  This doesn't just save you the $100 that you would otherwise send to Bill Gates every time you buy a computer—it also ensures that you won't have any trouble using any of the hardware devices on your machine, which can happen when you switch a Windows machine to Linux. (This has not happened to me in a number of years, with modern Linux distros being so good.)

  And believe me, even the chintziest and most economical new machine simply flies with Linux instead of Windows. No need to lay out the cash for top-of-the-line hardware.

  What You Get With Linux

  All operating systems are layers of software upon software. Each builds on the past, and the nifty new features of today depend on the nifty new features of yesterday.

  This can be done poorly, or it can be done well. When it is done poorly, you get a huge, bloated, poorly organized mass of software. Like a skyscraper built of jello, it can't get too high without spreading out over a huge area. It becomes a giant, cherry-flavored, quivering blob.

  When done well, the system software is like the Eiffel Tower, tall and straight, and occupying a minimum amount of horizontal space.

  In Linux, things are done well, not poorly. That's why Linux users don't have to upgrade their machines every three years just to keep running the same software. And that's why old Windows boxes can be readily converted into useful Linux machines.

  Whichever way you get to a Linux machine, you will be treated to a wealth of high-quality, stable and efficient software. With Linux you can get off the merry-go-round of hardware and software upgrades, and get a system that just works—and keeps on working.

  How to Make the Switch to Linux

  Across the land, from a host of Windows users, the question echoes from the teeming coastal cities to the sweltering central plains: "How the heck do I switch to Linux?!?"

  As one who has reached the Promised Land, let me assure you it is worth the journey. Linux and Windows both do the same things—but Linux breaks much less often. Linux seems to work with you, while Windows seems to be working against you. Windows is the evil twin of Linux.

  Switching to Linux can be accomplished in many different ways, ranging from easy to pretty challenging. You decide how much time and energy you have to put into the process.

  Taking the Easy Way Out

  When I was a young child, my teachers frequently condemned my laziness. "You always take the easy way out," they would wail, as if that is somehow a bad thing. (Little did they know that my love for the low-hanging fruit is exactly the quality needed for effective engineering.)

  With a Windows-to-Linux switch, the easy way out is called Cygwin. Go to the Cygwin (www.cygwin.com) Web site and hit the button on the upper-right, which says, "Install Cygwin now." This will download a program called setup.exe. Run this program, and you will be directed to answer a few questions. (When asked to select a Cygwin repository, just make a random choice.)

  If you play around with setup, you will see that there is a huge list of available software, with some items already selected. The selected items are the minimum Cygwin installation, and you should just accept them and let setup install them for you. If you have only a dial-up connection, you will have to leave the machine connected for quite a while for this.

  Once the Cygwin setup has worked its magic, you will have Linux installed right on top of Windows. Now you may continue to use Windows, but gradually transition to Linux in the background. That is, you can start using the Linux flavors of your favorite tools. You will find an icon for Bash, a Linux command line, on your Windows desktop. Bash is like DOS (in the same way that the Space Shuttle is like a bottle rocket).
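
  Bash can be sampled right away from that desktop icon. A minimal first session might look like this (the filenames are made up for illustration):

```shell
# A first taste of Bash, with rough DOS counterparts in the comments.
pwd                                    # print the working directory (DOS: cd)
ls -l                                  # list files with details     (DOS: dir)
echo "remember the milk" > notes.txt   # create a small file
cp notes.txt backup.txt                # copy it                     (DOS: copy)
cat backup.txt                         # show its contents           (DOS: type)
```

  The commands look terse at first, but they compose: the output of any command can be fed into another, which is where Bash leaves DOS far behind.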

  After the basic install, you will want to run setup.exe again and get a bunch more software. (All free, of course.) You can also install X Windows, the Linux GUI, and run X-based Linux programs on your PC. (Or, you may choose to use the Windows-based versions of free software, like OpenOffice, and just use Cygwin for command-line tools.)

  The great thing about Windows boxes is their incredible computing power. With a beefy Windows box, Cygwin programs run at a screaming pace. The bad thing about them, of course, is that they have Windows on them. Which leads to the next path to Linux: buying a Linux computer.

  Buy It, Don't Build It

  Back in engineering school, they told me "never draw what you can copy, never copy what you can tear out and paste." (The expression reveals my age in an era where engineering students think of "drawing" as something done, like everything else, with a computer keyboard.)

  Despite its age, there is a valid principle at work. You can't do everything, after all, so why not let someone else install Linux on a computer, and just buy it?

  If you take this path, you must shell out some money, but you get a ready-made solution that works from day one, right out of the box. And with Linux's svelte runtime profile, much less hardware is needed for a Linux box than would be needed for a Windows box, so you can get something inexpensive, and still get the kind of capability Windows users chase with top-of-the-line hardware. (They chase it, but they don't get it, because it's about the software, not the hardware.)

  If you're looking for a Linux box without a high price tag, take a look at the Asus ultra-portable Eee line. For $400, you can get a nicely powered Linux laptop that weighs about two pounds. Plunk it down on your desktop, and plug in a VGA monitor and USB keyboard and mouse (and don't forget to plug in your speakers). You can use the screen, keyboard and mouse from your Windows computer, in fact. Now you have a very nice Linux desktop machine.

  And when you want to go on a business trip, you fold it up and slip it in your coat pocket or carry-on bag, and you can make it to your destination without an aching shoulder. Once there, you can work on its tiny (but adequate) keyboard and screen until you get back home.

  It comes with wireless networking, but no hard drive. There are a few gigabytes of flash storage, plus a microSD slot where I have another 2GB of storage that I bought for about $25. With the SD slot and the USB ports, you can get a lot of extra storage without a hard drive. The upside is that the Eee is very durable, as it has no moving parts other than the keys.

  Wipe a Computer, and Install Linux

  Lots of Windows users have old machines lying around that are no longer powerful enough for the software from Redmond. Rest assured that they are more than powerful enough for Linux. In particular, old laptops can be used as power-efficient network servers. If you run Linux without a GUI, you can make use of even very old machines.

  To install Linux, you'll need an install CD. You can make your own by downloading a giant file called a disk image file (or ISO file). With the ISO file and a writable CD, you can make your Linux install CD. You must select your Linux distribution and get its ISO. Ubuntu is a good choice for those new to Linux. It has everything you would expect in a home/office workstation.
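
  One step worth adding before you burn the CD: verify that the giant download arrived intact. A minimal sketch, assuming your distribution publishes a SHA-256 checksum file alongside the ISO (the URLs and filenames below are placeholders, and the stand-in file just demonstrates the mechanics):

```shell
# Hypothetical names: substitute the real ISO and checksum file
# published by your distribution.
#
#   wget https://example.org/linux-desktop.iso
#   wget https://example.org/SHA256SUMS
#   sha256sum -c SHA256SUMS --ignore-missing
#
# The verification mechanics, shown with a stand-in file:
echo "pretend this is an ISO image" > demo.iso
sha256sum demo.iso > SHA256SUMS.demo   # record the checksum
sha256sum -c SHA256SUMS.demo           # verify it still matches
```

  If the checksum matches, burn the ISO with a tool such as wodim or your desktop's disc burner, and you have your install CD.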

  The easy way to do it is to wipe out whatever is on that machine's hard drives. Copy anything you need onto some other media, and get ready to hose down the storage. If you have more than one disk, you can disconnect one and retain its contents (or leave it connected, but be very careful when you are at the installation step where you reformat hard drives).

  Then you boot with the install CD and follow instructions that pop up on the screen. It's as easy as installing Windows.

  Build a Computer, Install Linux

  Why, oh why, do I always skip all those easy ways and go right to the hardest? Perhaps because it's also the most fun.

  Pick out the parts, put together your own box, and install Linux yourself. You will get a real powerhouse for a very good price. And you will learn a lot. Afterward, your experience level in Linux (and hardware) will be much higher.

  Which reminds me of another lesson from engineering school: Experience is the thing you have just after you need it.

  Linux and the City

  Vespa is Denver's last, best refuge of hipness and original food, but last night someone made the mistake of telling the crowd at the Libertarian convention about it. As a result, the place was crowded with East Coasters, looking nervously about, visibly trying to find something comfortable to look at, like someone in a three-piece suit. Instead, all they could see were relaxed Denverites, in their jeans and sneakers.

  I was waiting for the bartender's attention in this crowd of leisure-suited freedom-lovers when my significant other turned to me and, giving me that look with her beautiful brown eyes that always caused my knees to go weak, asked the question I have been secretly dreading for years: "Will you help me switch to Linux?"

  As the part of my brain responsible for getting along with my sweetie mechanically caused my mouth to accept graciously, neurons in the part of my brain responsible for staying married to her started firing. The path to Linux may be fraught with difficulty, and we don't always love those who lead us along this path. In fact, frequently we get mighty cranky with them.

  Later, when I had had a chance to get used to the idea, I asked myself the question: Is it a good idea to help your significant other switch to Linux, or should you hand him or her off to a competent geek friend, knowing that, however rough the switch, at least you won't get the blame?

  The Brunch Bunch

  The next day I broached the question to my three closest friends at brunch. The first to answer was Sam, the oldest and most experienced member of our group, who regularly took on the task of teaching Linux to friends, acquaintances and people he barely knew.

  "Go ahead and help her switch! Sure, it'll be hard, and she'll do some whining and complaining about how everything was different under Windows, but soon she'll get the hang of things, and learn to love Linux. In the end, she'll thank you. And if she doesn't, probably you weren't right for each other anyway."

  My friend Charlie immediately objected. "No, Sam, you never stop to think about how people feel. The emotional attachment that people feel for their operating system is much more powerful than you think. Right or wrong, her relationship with Windows has lasted over a decade. And now she'll have to break it off and move on. It's never easy!" Charlie turned to me. "You need to take the time to understand her feelings, and validate them."

  Our other friend, Marty, offered the most cynical opinion, as usual. "Don't do it. The first time she hits something that Linux doesn't do well, she'll blame it on you. She'll be at one of her meetings and some dork with Birkenstocks and a ponytail will show her his new Vista laptop from HP, and she'll have a seg fault. Find someone else to take the rap."

  I couldn't help but think that Marty had a good point. Switching to a new operating system can be challenging, and even if the old operating system is annoying and stupid, it's what you're familiar with.

  After brunch, Charlie and Sam went shopping, as usual. A new CompUSA had opened in town, and Charlie was going to splurge on a new quad-core server, and Sam was explaining that he needed to max out his RAM right away. Marty, on his way back to his law office, offered one parting piece of advice.

  "Whatever happens, don't just reuse her old Windows laptop. She'll want something shiny and new, and having the old laptop around will be useful when she does hit something that she can't figure out on Linux, but that needs to get done right away. Also, she's bound to have a lot of files on the thing, and she may want some of those old files in the coming months. Good luck."

  Introspective Scene of Examining Our Relationship (to Operating Systems)

  As I walked down the 16th St. Mall, window-shopping, I reflected on my friends' advice, and our emotional attachment to operating systems. It reminded me of something I had learned from my children: People want to stick with what they know. A child might not like what he is used to, but he knows that he doesn't like what he is not used to. Doubt it? Then try feeding a brand-new recipe to a 6-year-old.

  Windows, with all its faults (and there are many), is what my wife has been used to for these past 12 or 13 years. During that time, it has been with her in good times and bad. (And, computers and software being what they are, that mostly means bad.) She knows its quirks and pitfalls, and she is used to the way things are done on Windows.

  I tried also to think how it would be for someone to try and convince me to give up Linux and Unix-like operating systems. I have been using them daily since the first time I sat down at one in 1986. That's 22 years of Unix knowledge in my brain and fingertips, and that is not something I would give up easily.

  (Thankfully, I use Emacs, so all operating systems look the same to me anyway.)

  Neat Wrap-Up

  In the end, I decided to help my wife switch from Windows to Linux. It all comes down to trust. But while the switch from Windows to Linux may not be wrapped up as quickly and conveniently as an episode of your favorite female relationship drama, it can help to make the computer, and the tasks you use it for, seem a little less threatening and annoying. It can make life better.

  And this, dear reader, is the secret to a good marriage: Try and make your spouse's life a little bit better in some way. Because you surely do cause your share of problems and difficulties. Like a good operating system, it's important to make sure the positives add up to more than the negatives, that the features outweigh the bugs.

  Cygwin: The Perfect Setup

  As a longtime fan of free software, I find little to admire in proprietary software. The software that is used with a commonplace operating system originating in Redmond, Washington, has little charm for me.

  Sitting around the lunch table with other free-software geeks, I am always loud and proud in my derision of that other operating system and the software that goes with it. We spend hours talking about software, and never is the software from Redmond given a kind word. (Perhaps we should talk about something other than software sometimes, but we are geeks, after all.)

  However, there is a dark, shameful secret that I have never revealed to my geek buddies. It is something I could never admit to, something they would laugh at.

  I admire the way Windows software installs.

  Easy Installation for Windows Users

  When you get a piece of Windows software, you put a CD in the drive, or run an executable file, and a friendly dialog box pops up. It explains everything in small pieces and small words. It gives reasonable default choices for everything, allowing lazy system admins such as me to just click "OK" again and again, without even reading the one sentence it has put up there for me.

  I can't even be bothered to spend five seconds reading and understanding the installation instructions and choices. How pathetic is that? I blame MTV.

  But on Windows—that operating system that gets so many things wrong—the installation program always works. Perfectly. Every single time.

  Doing Hard Time on UNIX Installs

  UNIX, which is a very capable and professional operating system, can do so much that Windows can't (continue working for long periods of time, for example). But one thing it does not do well is software installation.

  Instead of getting some nice pop-up box with instructions simple enough for a clever monkey, UNIX software is distributed in a wide array of strange ways, ranging from source code (which you build with your own compiler, kind of like building a radio from a kit), to binary distributions, which are almost, but not quite, as easy as Windows distributions.

  When installing UNIX software, you almost always have to read the instructions. We free software geeks tell each other that we prefer it this way. But, deep in my heart, I don't. MTV has made it very difficult for me to concentrate on anything for long periods of time.
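  To make the contrast concrete, here is a sketch of the classic source-install ritual. The package name and its one-line Makefile are invented for illustration; a real project arrives as a tarball from its Web site, and usually wants a ./configure step before make.

```shell
# Stand-in for a downloaded source distribution (invented for this sketch;
# a real project ships its own Makefile and build instructions)
mkdir -p hello-1.0
printf 'all:\n\techo built > hello\n' > hello-1.0/Makefile
tar -czf hello-1.0.tar.gz hello-1.0
rm -rf hello-1.0

# The classic ritual: unpack the source, read the instructions, build
tar -xzf hello-1.0.tar.gz        # unpack the source tree
cd hello-1.0
make                             # real packages usually want ./configure first
cd ..                            # back out of the source tree
```

  A binary distribution skips the make step entirely; the package is unpacked directly into place, which is why it is almost, but not quite, as easy as a Windows installer.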

  Come Home to UNIX With Cygwin

  That's why I am always delighted to work with the Cygwin setup program.

  Cygwin is a comprehensive port of the UNIX/Linux tool set to the Windows platform, and it is as easy to install as any Windows software.

  Unlike Windows software, Cygwin software is free, and the source code is also available, in case you want to modify it yourself, and it always, always, always works!

  Once you have Cygwin installed on your computer, you click on the Cygwin icon on your desktop, and a bash shell command-line window opens up.

  Immediately, my heartbeat slows and my palms stop sweating. I forget that I'm working on a Windows machine. Cygwin gives me a sane, powerful, and delightfully normal UNIX tool set. It's like coming home again.

  Even the X Window system—the UNIX windowing system that provides windows, menus and tool buttons, and all the other graphical gunk that some people seem to need on their screen—is supported. (As for me, I'm happy with a smart terminal emulator and emacs in ASCII mode. What more could anyone possibly need?)

  UNIX Therapy for Windows Computers

  To get this wonderful injection of quality software into your Windows laptop or desktop box, just go to the Cygwin Web site (www.cygwin.com). Click on the icon "Install Cygwin Now."

  This will download the wonderful Cygwin setup program setup.exe. Save this program somewhere easy to find, such as your desktop. You will run it whenever you want to update your Cygwin software, or to get new software packages.

  Run the setup.exe program. Go ahead and accept all the default settings, just like on any other Windows software installation. When asked to choose a download site, I just pick one at random. I try to choose a .edu site, for some reason. I don't know if they are faster than commercial sites or not.

  After you choose the download site, the setup program downloads a list of software packages. Now comes the fun part.

  Ali Baba's Cave of Software

  Maximize the setup window; you will need plenty of room. Here you will see listed the software wealth of our generation.

  You'll find bash, the Bourne Again Shell—the best command-line interface ever developed. Or, for those with different tastes, csh, sh, zsh, or several other varieties of command-line environment.

  You can get emacs, the first and still the best integrated-development environment. For the more primitive, there is pico, nano, joe, and even the old UNIX line editor ed. There is even a vi editor, for those who are seriously brain-damaged.

  Get such acclaimed scripting languages as Perl, Python, and Ruby. There is also the GNU Compiler Collection (GCC), which will handle Java, C, C++, Fortran, Pascal, Ada, and other programming languages. No matter what your preferred (non-Windows) programming environment is, you can find it here.

  There are too many to list—enough tools to keep you busy for a long, long time. There are more than enough to fill an undergraduate computer science curriculum, and add in a graduate-level curriculum as well. More than enough to do very serious work.

  These are not knock-off tools, Windowized in some way. They are the actual UNIX tools, built for you on a Windows box. The scripts that you write for Perl, the C programs you compile with GCC, are exactly the same as those you would produce on a UNIX system.

  These are not toys; they are the same tools being used in the cutting edge of industry and research labs.

  All free! Amazing.

  Who Should Get Cygwin?

  Get Cygwin if you are a UNIX programmer with a Windows box. You will be very happy to have your usual tools always at hand, and you will find that you can use the Windows box as a UNIX development platform.

  Get Cygwin if you are a computer science or engineering student, so that you have access to industry-standard tools. These tools will help you in the classroom, and in the interviews when you're job hunting.

  Get Cygwin if you want to know more about UNIX without making the commitment to switching to Linux on your laptop or desktop. Or, if you are sick of the limitations of Windows tools and want something a little more serious.

  Then enjoy it, and remain in the UNIX cocoon of safety instead of venturing into the wilderness of the world of Windows!

  How to Become a Linux Guru

  Software is one of the most poorly taught topics on the planet.

  Get a civil engineer, and he or she will know all about roads, concrete and steel. Get a mechanical engineer, and he will know all about locks, hinges, pistons and pulleys. Get a software engineer and she might not even know what language you're programming in. There is little agreement as to what a software engineer should know — even about what a software engineer is.

  It's an observed fact that some people are good with computers, and other people aren't. We call the good ones gurus and the rest users.

  How do you tell the difference? A guru is the one who can solve unexpected problems, come up with comprehensive solutions, and get the darned computer working the way you want it to work. A user cannot. It's the kind of thing that's easy to see, but hard to explain.

  But the guru you seek in times of computer trouble started out just as you did, with no formal training in the Linux operating system or software. How, then, does the guru manage to tame the information wilderness of the computer? Why is it that he or she knows what you don't?

  Dedicate Yourself to Learning

  To become a guru, you must first understand that there is far too much to know, far too many interesting nooks and crannies for even a lifetime of effort to explore. The amount of creativity, effort, time and money that went into the creation of the modern Linux box is staggering. Because computer software is so easily reproduced, it can remain in use for decades, forming the foundation for the next layer of software, which will, in turn, form the foundation for even more layers.

  In every Linux box there are hundreds of thousands of lines of software, created over the last four decades by some of the most talented and productive programmers ever to have lived. This intellectual heritage, this wealth of functionality, is given to everyone free with Linux.

  You can't learn it all. You can't even learn 10 percent of it all. But that doesn't mean you can't try.

  Gurus are always trying to learn more about the operating system. If you want to be a guru, you must cultivate this attitude of study, this dedication to continued learning. There is no end to this road except death. (A bit depressing when you look at it that way, isn't it?)

  Finding the Answers the Old-Fashioned Way

  Given this huge amount of software, how do you find the answer to a specific problem?

  Back before the Internet was invented, the UNIX world had developed several documentation strategies to cope with this problem. The first was the simple but robust man page. ("Man" is short for manual.) This is documentation, distributed with the operating system. It's displayed with the man command. To learn about the find command, type: man find. (To learn more about the man system, type: man man.)

  Although the man system is extremely useful and still widely used to this day, there are limitations. Since each man page is a separate document, there are no hyperlinks between documents. There is also no good search capability.

  To address these issues for the free software projects they were undertaking, the Free Software Foundation (FSF) developed the texinfo system, which is based on the popular text editor emacs, but can be run in stand-alone mode with the info command. The info system allows documents to be arranged in a book-like organization, with chapters, sections, sub-sections, etc. It supports links between documents and various forms of searching. Emacs users can get the texinfo documentation right within emacs.

  For FSF tools, which make up much of Linux, the texinfo documentation is more detailed and more useful than the man pages. It can also be used to generate nice-looking printed manuals, PDF documents and Web sites, all from the same source. This is very handy when documentation is maintained by busy programmers, as with most free software. You can get at it with emacs or the info program, but it's also available on the Web (see below).

  Finding the Answers the Newfangled Way

  The Internet is the central repository for human knowledge on free software. I don't know any guru who does not use Google or Yahoo to find new information to supplement what can be learned from the formal man or texinfo documentation.

  Frequently, the very annoying obscurity of error messages can be used to help find the information you need. When faced with some error message that means absolutely nothing to you, just cut and paste it into your favorite search engine to get a wealth of user comments and experience.

  Since free software documentation is also on the Internet, you can sometimes just skip the man and texinfo documentation and jump right to the Web. The same documents you would get from the man or texinfo systems will be right there in your search results, along with other hits from related Web sites.

  Read Slightly More Than the Minimum

  There is only so much that you can learn in a day, and the best way to continue to learn is to try and learn a little every day, or even every time you interact with the computer.

  One difference that I have noticed between gurus and non-gurus is that the gurus will spend an extra five or 10 minutes reading the documentation, even after they have found the answer they were looking for. The non-gurus will crack the documentation open when they can't figure out how to do something, and as soon as they have found the answer they will close the books and hit the keyboard.

  That extra 10 minutes of reading each time allows the guru to take away a few extra pieces of information about the operating system. Instead of reading the absolute minimum of documentation, the guru reads a little bit more.

  You Too Can Be a Guru

  To be a guru, you must adopt the attitude that learning about computers is something you will continue to do your whole career. You must recognize how little you know of the topic, and how much you have to learn. You must try to learn something every day, and from every computer problem you run into. Don't just work at it until the problem is solved; work at it until you have a good understanding of how the problem was solved.

  Over time your knowledge will grow, and your ability to find the right information will grow with it. People will start coming to you for answers, and you will hear yourself referred to as an expert or perhaps even a guru. Yet you still won't feel that you really are an expert, or that you really know all the answers.

  But gurus don't know all the answers; they just know how to ask good questions.

  The Distribution

  One aspect of Linux that confuses new users is the huge number of choices available. The commercial operating system from Redmond, by contrast, comes in just two flavors, Pro and Home (which presumably means amateur!).

  Linux is all about giving you choices. It comes in hundreds of flavors.

  But there are only so many choices that your brain can handle before you throw up your arms in disgust and give up!

  Back in the Old Days

  The computer hardware sitting on your desk is just a big, general-purpose and extremely complicated abacus (but the beads are electrons). It's so complicated that if you had to start with just the hardware, and tried to write a letter to someone, it would take you years. And add a few more years to make it print.

  This was what life was like for the early users of computers, back in the 1940s and 1950s. But gradually, the body of existing software began to accumulate. It wasn't necessary for every new user to spend a month to write a letter. Once the first word processor was written, everyone could use it.

  It was this ability to build atop the work of others that inspired Richard Stallman to start the Free Software movement and the GNU tools. And it was those tools that allowed Linus Torvalds to develop the Linux kernel in 1991. He built upon a body of software that already existed—which he downloaded, one-by-one, and installed on his computer.

  So it was throughout the world. Excited users would get the Linux kernel and then use it for the core of their system. But, just like the users of the 1940s and 1950s, the kernel alone would get them only a working piece of computer hardware. They still needed to assemble various useful programs (like a word processor) and get them working with their Linux kernel. Most of these useful programs came from the GNU Project. The graphics system came from the MIT X Window project.

  But getting them all working together was up to the user. Sometimes this could be a real challenge. Back in those days, Linux users in many cities would hold meetings where Linux newbies were encouraged to bring their systems in to get Linux installed. Expert volunteers would be on hand, with pre-recorded CDs containing the software, and years of experience to draw on to get recalcitrant systems working.

  This was fun, but it challenged all but the geekiest. It's hard to take a bunch of different software packages and get them all to work together. The more software involved, the more work it could be.

  Distros to the Rescue

  These difficulties slowed the use of Linux considerably. Linux was something for computer science grad students and electrical engineering undergrads, not the average computer user.

  The development of an enthusiastic Linux community made the next step inevitable. Users started packing up their successfully integrated software systems, and mailing the CDs to whoever wanted them, or distributing the disc image over the Internet. Special graphical tools were developed to help new users install Linux, and sophisticated package-management systems took the sting out of getting and installing new software.

  Since it's all free software, you don't have to own it to package it up and give it away, or even sell it. It's free, so go right ahead!

  And that is what many users did. A set of integrated GNU/Linux software is called a distribution, or distro. And there are thousands of Linux users out there who, for commercial or altruistic motives, will take the trouble of putting together their own unique Linux distro.

  Some distros are aimed at new users, and give only the most basic and simple tools. Others are aimed at the hard-core geek, with the latest bleeding-edge version of every cool piece of software you've ever heard of, and hundreds more that you haven't. Some distros are intended to quickly convert a PC, and some to run on embedded platforms or supercomputers. There's a distro for every taste and set of needs and, if there isn't, someone is probably putting it out there right now.

  Companies were formed that sold the distro and its support. They don't sell the software—that's free. They will sell you the smart person who can come down to your office and get it working on your hardware. Some of these companies developed even better tools to help the new user.

  These days, there are hundreds of distros out there, each with its own supporters and (sometimes) detractors. Many distros are related to each other, each based on the same predecessor distribution. Many use the same package management or installation tools, but offer a different set of software tools to the user.

  Which Distro Is Right for You?

  Any distro makes it possible for any ordinary computer user to install the Linux kernel, the windowing system, plus a ton of extra tools and useful toys. No longer do you need to lug your PC to a Linux user meeting to get it working. The install tools are so well developed now that getting Linux working on a new system is fairly simple.

  With all these choices, which distro should you pick?

  A good (if arbitrarily chosen) list of the top 10 Linux distributions can be found at distrowatch.com/dwres.php?resource=major.

  Unless you have special needs, the choice of distro is not too critical. Any of the top 10 will work about as well as any other. Remember, any distro is just a starting point. It gives you a base of software tools, but you may add any other free software to your system, often with a handy package-management tool that takes all the sting out of finding and installing software.

  So just pick whichever distro strikes your fancy. If you're buying a new machine, take whichever distro they offer. If you have a geek friend who is willing to help, use whatever distro he or she likes (and has CDs for).

  (If you still can't decide, pick Ubuntu (www.ubuntu.com) if you're a newbie, Fedora (fedoraproject.org) if this is a work machine, and Debian (www.debian.org) if you are a computer geek.)

  Where to Get It and How to Use It

  Where and how you get your distro depends on how you want to use it. The most straightforward way to get a distro is to order the CDs through the mail. For a nominal charge (usually less than $5 plus shipping), you'll get your distro in a few days, and can install from CD.

  For the less patient, you can download an install disc image (about 700MB), write it to CD, and have your installation disc without having to wait.

  For the even less patient, Debian allows you to download a small install utility. Run it on your target machine (which must have an Internet connection), and it will guide you through the installation. I have never tried this, but it sounds so nice that I think I will the next chance I get!

  All of the distros have a Web site with instructions and downloads. They, or other third-party companies, offer the CDs for sale through the mail, and there is always the old-fashioned way—ask any Linux geek for a loaner.

  Choices, Choices, Choices

  Choosing a distro is just the first of many choices, but it doesn't limit you as much as you might think. The distributions are just different collections of already free software. If you see some interesting software and you like it, you can get it, whatever distribution you started with. As time goes by, you will install various packages on your machine, adding to those installed from your distro. Your collection of tools will become your own unique set of software.

  Perhaps then, inspired by a need to give something back, and sure that your unique collection of free software offers value, you will produce and give away your own distribution!

  System Administration

  Once a user sets up a Linux system, he or she faces the tasks of system administration. Mundane chores mostly, but due to the highly developed state of Linux tools there is a lot to learn. Many convenient features are available, but an orientation is required.

  Backing Up the Linux Way: Sticking to tar

  It's one of the oldest pieces of computer lore, handed down from the stone age of electronic computing: Always back up your files!

  Sometimes the old ways are the best ways, and on my very first day of computing (all the way back in the last millennium), I heard a horror story that demonstrated the importance of backups.

  Back at the Dawn of Time

  I was starting work for a pair of atmospheric scientists who had something very rare: their own computer. It was a small computer (only about the size of my wife's minivan), and required a room with its own power system, air-conditioning plant, and one of those computer-room floors made of sturdy tiles—any one of which could be lifted to access the maze of cables beneath the floor.

  It was about as powerful as my current laptop, only without the graphical user interface. For meteorologists, with their giant models of the atmosphere, it was what you might call a personal computer.

  On the front were mounted two magnetic tape drives, and one wall of the computer room held row after row of neatly labeled magnetic tapes. These tape drives aren't even built anymore, but they used to decorate the front of every serious computer.

  Those computer tape drives looked pretty high tech in the '60s, but they were a lot of work to use. The tape had to be threaded through the heads and onto the take-up reel. As the tape was written, it would be transferred back to the starting reel. Since they didn't hold all that much data (by modern standards), it was common to have data sets spread across multiple tapes.

  Oh, the Horror!

  My first day in the lab, I saw one of the scientists loading and unloading tapes onto the machine. He was backing up the disks on tape, and he did it every day, though it took almost an hour, and he was always pressed for time.

  When I asked why he didn't have one of the lab's students perform this chore, he told me his tale of woe: A grad student, assigned to the task, had come in early every day to perform the backups. This grad student, seduced by his warm bed one winter morning, had skipped a day, with no untoward consequences, and without even attracting the notice of his late-sleeping co-workers, who assumed the backup had taken place as usual. The exception became the rule, and soon this lazy grad student had not performed a backup for three whole months—a fact that was discovered when the disk crashed, wiping out three months' work for the whole lab.

  Now, this scientist was a bit of a forbidding character—not someone I would like to cross. And the steely glint in his eye made me fear for the fate of the poor grad student. He had forgotten the first rule of computing: Always back up your files.

  The Modern Era

  These days, disks are more reliable. I've never experienced mechanical failure with a modern disk drive, though it does happen, allegedly. The greatest risk to data today seems to be accidental deletion.

  Although disks are quite reliable, the same can't always be said for the software engineer. Mistakes will happen, and when data is mistakenly deleted, a good backup discipline can save the day—and maybe your job.

  There are many different programs to help back up your data, but it's hard to beat the ubiquity and simplicity of the old classic tape-archiving program tar. tar has been with us so long that the tape has vanished from the equation, and most home backup these days seems to be from one disk to another (or even onto one of those newfangled data sticks) rather than to magnetic tape. Yet, tar still gets the job done.

  tar Me Up, Scotty

  The simple idea behind tar is that you provide it with a directory full of files and subdirectories, and tar will package it all into one file, the "tarfile" (which usually has an extension .tar). The tarfile itself is in a very simple format—one that makes no attempt at compressing the data. For this reason, it is frequently run through a compression program, such as gzip, which produces a much smaller compressed tarfile, the "tarball."

  The tarball can be saved to another disk, perhaps on another machine. This ensures that if the original data is deleted or destroyed, the files can be recovered by uncompressing the tarball and then running it through the tar program again, which will unpack the file into its original directories, with all the contents intact.

  There are many different versions of the tar program out there. Linux users have the GNU tar, one of the best. Not only does it meet all the tar standards, but it also includes command-line options to allow the compression to be done on the fly, without having to invoke the gzip utility separately.
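  As a sketch of the round trip (the directory and file names here are invented for illustration), GNU tar's -z option does the gzip compression and decompression on the fly:

```shell
# Create a small directory tree to protect (names invented for this sketch)
mkdir -p project/docs
echo "hello" > project/docs/readme.txt

# c = create, z = gzip on the fly, f = name of the tarball to write
tar -czf project.tar.gz project

# Disaster strikes: the original directory is deleted
rm -rf project

# x = extract; the original tree comes back, contents intact
tar -xzf project.tar.gz
cat project/docs/readme.txt      # prints "hello"
```

  Without the -z option, the same effect takes two steps: tar -cf project.tar project, followed by gzip project.tar.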

  Wasting Time

  Although the scientist from my story always continued to run his own backups every day, I did manage to save him some time by showing him how to do incremental backups.

  Incremental backups address the problem of backing up an unchanged file. Since most files don't change every day, there's no real need to back them up every day. But how can the poor user easily determine which files need to be backed up without making the tragic error of missing an important file?

  With GNU tar, the problem is handled for you with the incremental backup feature. This means that you do a full backup at some reasonable interval, such as weekly, and then an incremental backup each day. The incremental backup will contain only files that have changed since the last full backup. To restore the files, you would first restore the full backup and then the incremental backup. Although this means you need to do more to restore your data, it also means that the daily backup is a lot smaller—and a lot quicker.
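  Here is a minimal sketch of that cycle using GNU tar's --listed-incremental option (the file names are invented; the snapshot file is where tar records what has already been backed up):

```shell
# Some data to protect (names invented for this sketch)
mkdir -p data
echo "one" > data/monday.txt

# Full backup: tar records the state of every file in the snapshot file
tar -czf full.tar.gz --listed-incremental=snapshot data

# During the week, a new file appears
echo "two" > data/tuesday.txt

# Incremental backup: only files changed since the snapshot are saved
tar -czf incr.tar.gz --listed-incremental=snapshot data

# To restore, unpack the full backup first, then the incremental on top
# (--listed-incremental=/dev/null tells tar not to update any snapshot)
rm -rf data
tar -xzf full.tar.gz --listed-incremental=/dev/null
tar -xzf incr.tar.gz --listed-incremental=/dev/null
```

  The incremental tarball stays small because unchanged files are only listed, not stored again.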

  Using incremental backups, the scientist was able to use just one tape for his daily backups (except on Mondays, when he did his full backups). This meant less time in the computer room, and more time in front of his computer terminal, programming in Fortran.

  More Ancient Wisdom

  These days, with so much storage space available, it's no longer necessary to have someone come in early each morning and swap tapes around. However, it's still possible to mess up your backup process in some way that might leave you trying to answer some awkward questions from a fire-breathing boss who has just seen valuable data disappear.

  Listen then, to the wisdom of times past. Always back up your files!

  Who’s Got Your Back?

  We are putting more and more important data on our electronics, but are we taking more and more care that our data are safe?

  Backing up your computer storage, like flossing your teeth, sending thank-you cards, or changing the filter in the furnace, is one of those important activities honored more often in the breach than the observance. It’s just not very exciting - it’s a chore. And this chore has been with us from the earliest days of computing.

  Back in the Olden Times

  One of my university computer jobs involved backing up the data on what was then called a mini-computer. These fell between the toy-like micro-computers of the day and the equally toy-like (but much more expensive) mainframe computers.

  The mini-computer looked like a mainframe - a bunch of refrigerator-sized chassis in the refrigerator-temperature computer room. (It was always lovely to go in there on a hot summer day!) To back it up I had to load reels of magnetic tape onto the front of the machine, enter a command at the console, and go back to doing my math homework while the computer wrote data to tape after tape.

  The consequences of a mistake could be severe - I was hired for the job after the previous incumbent was ignominiously fired. The disk failed, and it was found that he had not been making proper back-ups for weeks. As a result, weeks of scientific work had been lost.

  Sometimes the Old Ways are the Good Ways

  In the years since then, computational hardware has advanced at a dizzying pace. The giant set of disks (which I spent so much time backing up) held a massive 300 MB. Today I have a half-dozen data sticks much larger than that, just kicking around my desktop and briefcases. They give them away at conferences and trade shows.

  But the need for back-ups has not changed, because people have not changed. Our hardware is more reliable, but we are not. One of the largest causes of lost data is human error. Someone deletes something that they shouldn’t. And even our vastly increased hardware reliability is not going to compensate for the fact that we now carry our computers everywhere.

  My old mini-computer had disks that would crash if you looked at them the wrong way. My most recent computer, Yum-yum the Eee net-book, doesn’t even have a disk drive - it’s all flash memory. I could drop it on concrete and it would still work. But unlike my old mini-computer, which sat in a secure building, Yum-yum comes with me everywhere. It’s less likely to break, but far more likely to get lost or stolen.

  Backing up my data is more important than ever - but I have even less time available than that harried student I was in my youth. How can I have the back-ups without the tedium? As with so many questions these days, the answer is the Internet.

  Sometimes the New Ways are Better Ways

  These days, we don’t back up to tape any more. We demand near-instant retrieval of our backed up data, which is hard for tape systems to manage. We also benefit from super-cheap, super-dense disk technology. It’s easier and cheaper to back up data to a disk server than it is to back it up on tape.

  Most of us, though, don’t have any disk servers running in our garage, and this is where the Internet comes in. Why should I run a disk server when there are so many people out there who can do so more reliably than I can? And most of us need to back up only a trifling amount of data anyway. Disk servers are most economical in the terabyte range, and I would be lucky to have a gigabyte of data to back up, a mere one-thousandth of the capacity of even the lamest disk server.

  Hence the rise of companies like Mozy, the on-line backup people. For about five dollars a month, you can have your PC backed up over the Internet, onto Mozy’s servers. If you ever need any of your old files, they will be right there for you. Businesses have to pay more, but, if your business is data-centric, the cost is trivial compared to the benefits. Your data are backed up remotely, and safely, with very little investment of your time.

  What You Must Still Do

  Although you don’t have to sit in front of tape drives waiting, there is still one task that you must pay attention to: specifying what to back up.

  Most of the stuff on your hard drive is not worth backing up, because most of it is software, installed from disk or over the network with complex installation programs. These days, software is rarely just a single executable file. Backing up these files makes no sense, since you usually cannot restore them in a useful way. When these programs get messed up, you reinstall from the disk or over the network, the same way you originally installed the program.

  The only files that need to be backed up are the personal files that you have added to the computer: your documents, pictures, music and videos. In the Windows world, these are all usually kept under a folder called “My Documents”. In the Linux world, these will be under your home directory.

  When setting up a backup, make sure that all the data you want to back up can be written to some directory under My Documents, and, if not, make sure that you add the appropriate directory to the list of what is backed up.
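As a sketch, backing up just the personal directories might look like this (the directory layout here is invented for illustration):

```shell
# Simulate a home directory with personal files (example layout)
mkdir -p /tmp/home/Documents /tmp/home/Pictures
echo "report" > /tmp/home/Documents/report.txt
echo "photo"  > /tmp/home/Pictures/cat.jpg

# Back up only the personal directories, not installed software
tar -czf /tmp/personal.tar.gz -C /tmp/home Documents Pictures

# Confirm what made it into the archive
tar -tzf /tmp/personal.tar.gz
```

On a real system you would point tar at your actual home directory, and add any stray directories (mail folders, say) that live outside it.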

  Cheap, or Just Frugal?

  For those of us who don’t have a data-centric business, but only the usual data-centric lives, is there a solution that is even cheaper? Of course there is, and it is called Google.

  My data-centric life has involved many a lost computer file, and much wailing and gnashing of teeth as a result. When Google Docs came out I saw the immediate benefit of having someone else manage all my data files. For me, Google has it all.

  These days, Google will even let you store any old data file on their servers, up to a gigabyte for free. Well, that’s very nice of them. By using Google Docs for spreadsheets, word processing, and presentations, using Google Picasa and YouTube for photos and videos, and the extra Google gigabyte for everything else, the enterprising computer user can get very nice on-line backups for free!

  Privacy? Who Cares!

  This solution sacrifices all notions of privacy to Google. Can they make any use of all these documents? I really don’t know - but I note that they are very good at extracting information from large numbers of documents.

  In my case, I can’t see that it matters. Mostly what Google will find, looking at my large collection of on-line documents, will be a lot of articles for ComputerEdge. Since they are intended for publication anyway, I really don’t mind if Google looks at them. Or, for that matter, my work documents, none of which involve anything secret.

  If I did care more about my privacy, I could encrypt my data before sending it to Google, and be reasonably confident that no one outside the National Security Agency could read it. And if anyone at Google is reading this, how about some more storage? One gigabyte isn’t much!
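One way to do that encryption, sketched here with openssl's symmetric mode (the file names and passphrase are placeholders, and -pbkdf2 assumes OpenSSL 1.1.1 or later):

```shell
# Something worth keeping private (example file)
echo "private notes" > /tmp/secret.txt

# Encrypt with a passphrase before sending the file anywhere;
# -pbkdf2 strengthens the passphrase-to-key derivation
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in /tmp/secret.txt -out /tmp/secret.txt.enc \
    -pass pass:example-passphrase

# Decrypt later with the same passphrase and the -d flag
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in /tmp/secret.txt.enc -pass pass:example-passphrase
```

The .enc file is what you would upload; without the passphrase, it is just noise to anyone who finds it.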

  Staying on Top of Things

  There is always the waiting. It is always there, always part of the computer experience. From the time that you wait for your computer to boot up until the time you wait for it to shut down, using a computer involves a lot of waiting.

  If you've got a decent system (which means it's probably running Linux), and aren't doing anything too demanding, these waits might be almost impossible to notice. They may last no longer than the blink of an eye.

  But if you are on a system with a little less power—perhaps a beloved ultra-portable—then you may wait a bit longer, a bit more often. And if you are a little more demanding, if you run a few compiler jobs, are editing 50 buffers with Emacs and have two dozen tabs open in Firefox, well, you too might start to notice the waiting.

  What the Heck Is Going On?

  Sometimes it's pretty obvious what you are waiting for, and sometimes it's not at all clear. Sometimes you can do something about it, and sometimes you really can't. The first step is to try and see what is going on—what exactly are you waiting for?

  The best way to do that is with a program called top—just open a terminal and let it run.

  The top program will take over the whole window and update itself every three seconds. When you are done with top, hit the q key to end it.
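If you want a one-shot snapshot instead of the full-screen display, top also has a batch mode. This sketch assumes the procps version of top found on most Linux systems:

```shell
# -b batch mode (plain text to standard output),
# -n 1 run one iteration and exit; head trims to the interesting part
top -b -n 1 | head -15
```

Batch mode is also handy for logging: you can redirect its output to a file from a cron job.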

  The Summary Section

  The top five lines of output show a summary of what is going on with the system. The number of users logged on is usually one for most Linux systems, but remember that Linux also runs on vast multi-user machines, and top must be able to cope with that. The load average is an arcane way of measuring how many things have been waiting on the CPU over the last one, five and 15 minutes. Interpreting these numbers is something I leave for the experts.

  The next line tells you all about the tasks—that is, the individual programs running on your computer. A Linux computer running X Window will generally have a lot of tasks, but only a few will actually be running at any moment. Don't worry about the zombie tasks; your computer is not going to try and eat your brain.

  Next comes the summary of your CPU and how busy it is running user code (that is, programs you launch), system code (operating system stuff), nice code (which someone very pleasantly chose to mark as less urgent with the nice command), the time spent idle (usually the largest number), and the time spent waiting for I/O and serving hardware and software interrupts (which should all be near zero under normal conditions).

  The memory is summarized next, with the total amount available, the amount in use, the amount free, and the number of buffers currently existing. Don't worry about the number of buffers, but the amount of memory in use and the amount left free are important numbers. When you have almost all your memory in use and very little free memory, it means you should buy more RAM.

  Finally, the swap disk is summarized. The swap disk is there to handle overflow from RAM. If you start too many programs, and are trying to use more RAM than you really have, the computer will take something you haven't used for a while and write it to the swap disk, freeing up the RAM. Whenever you try and access that memory, the computer will sneak off and grab it off the swap disk.

  In my case, there is no swap disk. This is a peculiarity of my system, the ASUS Eee.
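The same memory and swap summary is available outside of top, from the free command (the -h flag, where supported, prints human-readable sizes):

```shell
# Show total, used and free RAM, with the swap line at the bottom;
# a swap total of 0 means no swap is configured, as on my Eee
free -h
```

If the Swap row reads all zeros, your system, like mine, is running without a swap disk.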

  The Task List

  Under the summary is an ever-changing list of tasks, organized so that the most active are on top. Each task (or process, as it is sometimes called) is assigned a number when it is started. The number has no particular meaning; it's just a way to uniquely identify any task. The first two columns of the task list show the process ID of each task and the user who started it. Then comes the priority and niceness of the task, then the total amount of memory the task takes up, how much of the task is still resident in memory (as opposed to being sent to the swap disk), and how much of the task's memory is shared with one or more other tasks.

  Next come the two most important numbers: the percentage of the CPU and of the memory that this task is using. These will tell you which task is causing you to wait. The final two columns show the cumulative CPU time used by the task, and the name of the task (the program that is running).

  Interactive Commands

  Top is one of those crazy Linux command-line programs that is just souped up to the max. There is practically nothing this command can't do, and its bewildering array of command-line choices and options will delight the Unix guru with hours to spend reading the man page and figuring out all the tricks that are possible with top.

  Considering the program is targeted at system administrators, I guess it's not too surprising to find so many features hanging off the software. System admins can be downright obsessive about knowing what is going on in their system. These features can be used by hitting the right keys while top is running (so you will need a man page open in another window to try them out).

  But every now and then, these little extras become just what you need. Recently I was wondering if I was taking full advantage of a dual-core system. How to really know? Turns out that top has an interactive command, 1, which shows the usage of each CPU in the system. By seeing that they were both working hard, I was able to confirm that I was really getting the most out of the system.

  Why You Wait

  With a basic understanding of the top program, you are now able to answer the question posed at the beginning of this column: What are you waiting for when you wait for your computer? Just open top and look at the top one or two tasks, and you will have your answer.

  The top program also allows you to see exactly how each program is using the computer—what it is costing you in memory and CPU to run each task on that machine.

  Disk Space--the Final Frontier

  Back in the bad old days, our disks were always full. How well I remember the 300-megabyte platter drive I had to use back in the '80s. It was about the size of a washing machine. This was, of course, part of a large computer, with its own dedicated computer room. After the personal computer came along, I was amazed when I saw my first external hard drive; it was about the size of two large hardcover books stacked on top of each other. And it held a whole 10 megabytes.

  These days, you can get a 500-gigabyte drive for less than $100. We don't even bother cleaning our disks off—it's hard to imagine ever filling up 500GB. (Yet I know that in 10 years, I'll think of 500GB as a small amount of disk space!)

  When disk space was tight, there were several tools we used to figure out which files to delete in order to free up some disk space. It's been many years since I've used them, because of the cheapness of giant disks, but with my little ASUS Eee computer, there is no disk, only 4GB of flash memory.

  And while 4GB seems like a lot (400 times larger than my first PC hard drive), I have already filled it after only a few weeks of owning the computer. Oh, deary me.

  The Old Standby: df

  In any kind of disk-space problem, the first command of the Linux guru will be the df command. It shows the amount of space on the file systems mounted on your computer.

  When I run df on my computer, it looks a bit funny, because the Eee does some funny things with its disks. The df command lists each mounted file system, its total size (in KB, by default), the amount used and available, the percentage in use, and where the file system is mounted.

  ~ $ df
  Filesystem     1K-blocks    Used Available Use% Mounted on
  rootfs           1454700  888792    492012  65% /
  /dev/sda1        1454700  888792    492012  65% /
  unionfs          1454700  888792    492012  65% /
  tmpfs             254164      12    254152   1% /dev/shm
  tmpfs             131072      72    131000   1% /tmp

  If you can't figure out which file system applies to you, go to your home directory and run the df command with a period as an argument. This will show the information for the current disk, the one that holds your home directory. And using the -h option gives the output in a more readable form.

  ~ $ df -h .
  Filesystem      Size  Used Avail Use% Mounted on
  unionfs         1.4G  869M  481M  65% /
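Once df tells you a file system is filling up, the du command can point at the directories responsible. A sketch using throwaway directories (the paths are invented for illustration):

```shell
# Make two directories of very different sizes
mkdir -p /tmp/space/big /tmp/space/small
head -c 1048576 /dev/zero > /tmp/space/big/file    # about 1MB
head -c 1024    /dev/zero > /tmp/space/small/file  # about 1KB

# -s summarize per argument, -h human-readable sizes;
# GNU sort's -h understands those sizes, -r puts biggest first
du -sh /tmp/space/* | sort -rh
```

Pointed at your home directory instead of /tmp/space, the same pipeline shows you exactly where your gigabytes went.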