Wednesday 26 March 2008

It just works

My son's PC is very rarely booted into Windows now. He seems to prefer using Ubuntu Linux to Windows, which I wasn't expecting. He commented that it was quite like his friend's Apple machine, which I thought was a very insightful observation on his part, since Apple's OS X is based on BSD UNIX and so could be expected to be more similar to Linux than Windows.

I was expecting a lot of 'Dad, how do I make it do...?' type questions, but there's been none of that at all; on the whole it just works. I asked him how he was managing with connecting his iPod and he said that was fine: he just plugged it in and something like iTunes popped up. He showed me; he plugged it in and sure enough Rhythmbox popped up with his iPod contents there ready for use. When I asked what he used Windows for when he did boot into it, he said mainly to get to files that were in the Windows system (although on install it copied everything over from his Windows ID, it seems that items on the desktop do not get pulled across). So I showed him how to mount and unmount the Windows drive in the file manager application, and now he only rarely needs to boot into Windows.
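For the curious, the same mount can also be done from the command console. This is just a sketch; the device name /dev/hda1 is an example (a typical first partition on an old IDE drive), so check what 'sudo fdisk -l' reports on your own machine first:

```shell
# List the partitions to find the Windows one (device names below are examples)
sudo fdisk -l

# Make a mount point and mount the NTFS partition read-only, to be safe
sudo mkdir -p /media/windows
sudo mount -t ntfs -o ro /dev/hda1 /media/windows

# When you've finished with the files, unmount it again
sudo umount /media/windows
```

The read-only option just avoids any chance of upsetting the Windows installation while copying files off it.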

My 8 year old daughter is finding her way around the old Celeron box OK too. I added the music player to the launch bar for her so that she could get to the music easily. Later that day I found that she'd added her favourite game to the launch bar too. She's quite capable of using the machine and is enjoying the Tux Type typing tutor as well as the games and the Internet. So it seems that Linux is easy enough for an 8 year old to get to grips with.

Once it's up and running on your hardware, Ubuntu is a very workable alternative to Windows. In the case of my son's PC this has been totally painless. However, if the hardware is not readily recognised by the default installation, as with the wireless and graphics cards on the old Celeron machine, it can be challenging for a non-expert. That said, it's currently possible to buy a new 'Linux ready' machine for only £130 from eBuyer (at time of writing) that would be perfectly suitable for office use.

Tuesday 18 March 2008

Stuff to read

OK, here are some links that might be of interest.

Monday 17 March 2008

Editing text files in Linux

At some point you may end up having to edit a text file from a command line prompt, perhaps to change a configuration file on advice from a forum post.

There's a joke in the professional UNIX community that "users haven't got CLUE"; they even walk around in t-shirts with puns to that effect. CLUE stands for 'command line user experience'. Unlike MS-DOS and the related Windows 'command' window, the command shell in a UNIX or Linux distribution has a very powerful tool set, enabling complex tasks to be undertaken very swiftly with a short series of commands. Using these tools effectively requires knowledge and experience; remembering the subtleties and nuances of these commands is a memory feat in itself. The traditional command line editors are extremely capable and powerful in the hands of an experienced user or programmer, but for you or me they are next to useless for the occasional edit to a file. So an old hand at Linux or UNIX is likely to suggest that you edit a file with 'vi', 'ed', 'Emacs' or 'vim', as does some of the assistance available to you on-line. Instead, in Ubuntu, use 'nano', a nice simple text editor that will work pretty much as you would expect, even if, like me, you don't have CLUE.
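If you want to see how nano behaves before touching a real configuration file, a safe first try is a scratch file (the file names here are just examples):

```shell
# Create a throwaway file to practice on
echo "hello, nano" > /tmp/practice.txt

# Open it in the editor (interactive, so shown commented out here):
#   nano /tmp/practice.txt
# Inside nano: type as normal, Ctrl+O writes the file out, Ctrl+X exits.
# The common shortcuts are always listed along the bottom of the screen.
# For a system file you would prefix the command with sudo, e.g.:
#   sudo nano /etc/X11/xorg.conf
```

Once you've scribbled in a practice file and saved it, editing a real configuration file holds no surprises.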

Sunday 16 March 2008

Souping up an old Celeron

As my candidate for Linux I've used an old PC circa 2001. It had only 128MB of memory, and an 800MHz Celeron processor on an Intel i810 motherboard with internal graphics. It ran XP Home like an absolute dog, even with no anti-virus installed. On top of this the mouse port was working intermittently. I initially assumed that it would not be a suitable candidate for one of the major distributions, so I started playing with some smaller distributions suitable for older machines, before realising that it would run Ubuntu.

Sure enough Ubuntu ran fine on the machine, and recognised the USB mouse. For office use with OpenOffice it would be fine, but for an 8 year old girl it really didn't meet the mark. It wouldn't run any of the fun Linux games with 3D graphics (2 frames a second is unplayable) and couldn't do all the 3D eye candy tricks that her brother's 1 year old dual core machine could. So, without spending a fortune, how far could I push this machine? After sorting out some wireless networking it was time to hit eBay for some second hand kit.
  • Pink light up in the dark, optical USB mouse (new) £4.94
  • 256MB PC133 Memory £8.97
  • Asylum FX5200 256 MB PCI nVidia graphics card £28.01
The mouse was just ideal for an 8 year old girl. The memory took the machine from marginal to adequate (Linux just isn't as memory hungry as XP or Vista). The graphics card is perhaps more than strictly necessary.

The internal graphics capability of the Intel 810 chipset on the motherboard was awful and probably the main flaw in the machine. My old 300MHz P3 machine from 1999, which had a modest 3D card, was faster, so I knew that a proper graphics card could deliver some real performance improvement. There was no AGP graphics slot in the machine, and it's way too old to take a later PCI-Express graphics card, so a PCI card would have to do. I set to reading the forums to judge which older 3D graphics cards were likely to work with Linux. This is a generalisation, but it seems that an ATI card works if you are lucky, but can cause a lot of grief and may not work at all. Most of the forum posts regarding Linux and nVidia cards seem to end in the cards working, and there's plenty of advice as to how to achieve that. So it was to be an nVidia card.

The upgrades went fine up to the point of installing the graphics card, but given the amount of advice around that topic, trouble there was not a big surprise. After installing the card the machine booted Ubuntu showing the nice logo and progress bar during startup at the expected resolution; however, the desktop failed to appear. This meant that the BIOS on the motherboard had detected the card and the card was working fine, but the X11/X Windows server was failing to pick it up.

If this happens you can boot Linux in recovery mode by pressing [esc] at the boot menu, then use the command console to modify the settings and sort things out. Recovery mode starts up Linux in the command console. The first thing was to re-run the configuration for the X server, to select a sensible driver for the card and set an 800x600 screen resolution.

dpkg-reconfigure xserver-xorg

This re-ran the setup routine and enabled me to select the 'nv' driver (the Linux community nVidia driver) and set the lower resolution. After restarting, the machine was still not working. Advice suggested that the X server software was not being configured to talk to the new card and was most likely trying to talk to the now disabled motherboard graphics device. The answer was to determine the new card's identity and its PCI bus reference, then update the configuration files to reflect that. The configuration for X11 is in the file '/etc/X11/xorg.conf', which indeed was still pointing to the internal graphics card. To find the information about the card, I ran the 'lspci' (list the devices on the PCI bus) command to get the bus address and device name, then replaced the references to the on-board graphics in the xorg.conf file with this information.
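As a sketch of what that involved (the card name and bus address below are examples from my machine and will differ on yours):

```shell
# Find the graphics card's PCI bus address and device name
lspci | grep -i vga
# Example output:
#   01:00.0 VGA compatible controller: nVidia Corporation NV34 [GeForce FX 5200]

# xorg.conf writes the bus reference as PCI:bus:device:function,
# so 01:00.0 becomes BusID "PCI:1:0:0" in the Device section:
#
#   Section "Device"
#       Identifier  "FX5200"
#       Driver      "nv"
#       BusID       "PCI:1:0:0"
#   EndSection
```

With the Device section pointing at the right bus address, the X server stops hunting for the disabled on-board graphics.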

After these changes, the GUI was restored. Using the admin tools in the GUI I managed to select a suitable nVidia legacy driver to get back to 1280x1024 resolution.

The machine was now upgraded, but to get the full benefit from the card it would require the manufacturer's driver, which is one of the 'restricted' drivers. You can make this happen automatically by selecting 'advanced desktop effects' in the 'Appearance Preferences' dialogue. It then asks if you wish to use the restricted driver; on saying yes you have to reboot, then the nVidia driver is installed and you can get the full potential out of the graphics card.

After this I installed the Compiz Fusion 3D desktop effects, more of which in a later post. It's amazing: the crappy 7 year old PC will now play 3D games (penguins sliding down hills etc.) and has desktop eye candy that makes Vista look tame. Well worth £28 of anyone's money.

In conclusion, yes, with Linux you can add a few more years' useful life to an old PC. However, adding high end graphics to an old Intel 810 based PC running Linux is not for the faint-hearted or the novice to computing. But once again the community support offered on the Ubuntu forums had all the answers required to make this happen. Perhaps later releases of Ubuntu will sort this out too; they do seem to be making this distribution more robust and usable with every release.

MSN Addicts are supported (another win)!

One of my son's key concerns when we installed Ubuntu onto his PC was "can I still use MSN?". After all, without MSN he'd have no social life! I was pretty sure it was possible so I said yes, then had to find out how. It turns out it's really simple: the Pidgin application is installed by default. All that was required was to open Pidgin and enter the account details; it then works with MSN, and even tells you about the status of your Hotmail account the same as MSN Messenger.

There was only one minor hiccup. The next day it was "Daaad... MSN's not working, I opened Pidgin but nothing happened". On investigation I quickly saw what was happening: Pidgin opens minimised as an icon on the top panel, so unless you are looking for the icon, it appears as if nothing has happened. Panic over, all's fine.

Getting Software and packages; it's really not like Windows

When you get MS Windows you don't get MS Office pre-installed; that's an expensive extra. Having installed Ubuntu (my chosen Linux distribution), there's already a lot of software pre-installed, including OpenOffice, a whole bunch of games, and Internet tools such as Firefox and Pidgin.

When you choose software for your Windows PC, you have to find it, buy it in a shrink-wrapped cardboard box or download it from the vendor's website. The software comes with an installer. The installer unwraps all the compressed software, installs all the other little pieces of software and supporting data required to make it run, then spends some time adding information to the Windows registry so that Windows will know how to work with it. Each Windows installer has to contain (or have available to it) all the supporting software required (the DLLs) in case they have not already been installed by other software's installers.

To get software in your Linux distribution you have a tool to add and remove software, which knows about all the software that is available to work readily with your distribution. Software in a Linux distribution is managed differently, as packages. The people who write and maintain the distribution also maintain an up to date archive of software that will run on it, prepackaged ready for download and install. So in Ubuntu, for instance, when you want additional software you just go to the 'Add/Remove Software' tool and search for it. If you want animation software, just type animation into the search and a number of packages are listed to choose from (amongst which is the software that movie studios use). Each listed item has a more detailed description available when you click on it. Just tick the box against the item you want and click the 'Apply Changes' button to install.

This built-in installation system is a package manager. It manages the installation and removal of software for you. When you are adding a new piece of software, it fetches a list of the required modules, checks what you already have, then downloads and installs just the bits that are required. If you are removing software, it removes the software, then checks for any modules that are no longer required and cleans them off the system too.
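The same package manager can also be driven from the command console using the apt tools. A rough sketch, with 'blender' standing in as an illustrative package name:

```shell
sudo apt-get update            # refresh the list of available packages
apt-cache search animation     # search the archive, much like the GUI search box
apt-cache show blender         # show the detailed description of one package
sudo apt-get install blender   # download and install it, plus anything it depends on
sudo apt-get remove blender    # remove it again
```

The GUI 'Add/Remove Software' tool and these commands are front ends to the same underlying system, so you can mix and match freely.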

In my experience to date this all seems to work pretty well. There is a minor complication that may confuse a new user though. Having installed your new Ubuntu system, if you look through the available software, not everything you thought might be available is there. This is because the Ubuntu team have made a distinction between the types of software available, and by default only offer software that is truly open source, tested to run on the distribution, and has packages supported by the core Ubuntu team. There are three additional pools of software available, which you have to enable via the Administration, Software Sources tool. The software pools available are:
  • Canonical supported Open Source software (main)
    This is what you get out of the box; there's enough supported software to do most common PC tasks in an office, and a bunch more stuff too. (Canonical is the organisation behind the Ubuntu distribution; commercial organisations can buy support for Ubuntu from them.)
  • Community maintained Open Source software (universe)
    This is all the rest of the Open Source software that's packaged for this distribution; there's a heap of this stuff. Mostly it will work, but there are no guarantees. If you get stuck you are quite likely to find some online help though.
  • Proprietary drivers for devices (restricted)
    OK, this is stuff that you might need in order to get the best out of the hardware on your PC, but it isn't open source. For instance, to get the best performance out of an nVidia graphics card you really have to use their driver, since only they know how to do it; the open source driver works, but not as well. Realistically nVidia are not going to give the open source community all their commercial secrets about graphics cards, because then their competitors would have them too.
  • Software restricted by copyright or legal issues (multiverse)
    You will probably need some of this stuff. In the real world there's stuff that we take for granted, such as listening to MP3s or watching DVDs on the PC, that uses software that is patented or not totally clear cut in legal terms. There's a help page on restricted formats.
So, there are hundreds if not thousands of pieces of software ready to run on your Ubuntu PC, and all for free! Beyond that there is further 'packaged' software that's not packaged for the Ubuntu GUI tools but can still be downloaded and installed with the command console tools by the more expert user. Then beyond the packaged software it's possible to download the source code, compile and install pretty much any piece of Linux Open Source software, although most users would never need or want to.
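For completeness, the traditional compile-from-source ritual looks roughly like this (the program name and version are made up for illustration):

```shell
tar xzf someprogram-1.0.tar.gz   # unpack the downloaded source archive
cd someprogram-1.0
./configure                      # check the system and generate the build files
make                             # compile the software
sudo make install                # copy the finished programs into place
```

Almost every Open Source project's README describes some variation on these steps, which is why old hands rattle them off from memory.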

Friday 14 March 2008

A simple overview of Linux architecture

I guess most users don't care about the architecture of their operating system, and why should they?

I'm just one of those people who has to know; it helps me sleep, it's just the way I am. Now that I know more on this topic, a lot of the advice given online, and my interaction with Linux, does start to make more sense. So I thought I would try to give a simple overview here.

As I mentioned before, to use Linux we use a distribution, which consists of the Linux kernel and a whole bunch of other open source software. Linux is a layer of software underneath layers of other software that are required to help it provide the nice graphical user interface that you point and click in. The picture below shows this in a nice concise model.
This picture is from Coops Weblog under a Creative Commons license; the original is here.

The Linux kernel runs all the time; it interfaces with and manages all access to the machine's hardware. The other key task of the kernel is to manage which programs are allowed to run and how they are run. When you see the computer doing more than one thing at a time, it's the kernel which is sharing the processor's time between those programs. Users never interact with the kernel directly.

The CLI (command line interpreter) or shell provides a means for communicating with the machine and controlling what it's doing. The command shell running on top of the kernel provides the minimum working system that users can interact with. When you start up the machine there are scripts that run in the command shell, which start off the programs that create the desktop environment that you use. As a desktop user you will rarely if ever have to use the command shell, except perhaps to make changes to the configuration that the GUI (graphical user interface) has not been designed to make.

Often when Linux is used as a server, to run websites or databases for instance, the console will suffice, and the higher layers of the system need not be loaded. Also, if a desktop system has become broken, it's possible for a more expert user to start the system in console mode and use the command line tools available there to fix the system.

The X Window System (often referred to as X11) is the software that provides the basic graphical interface capabilities required to draw windows on the screen and interact with a mouse. There are various alternative X servers available to suit different hardware, with different capabilities and resource requirements, such as supported screen resolutions and memory use.

The Window Manager is what makes the windows look and behave the way we see them. It manages the buttons, frames and basic window behaviours (maximising, minimising, window labels, creating dialogue boxes etc.). It is possible to use a system at this level without additional layers of software; however, that's not typical on a modern installation.

The Desktop provides a rich user experience on top of the facilities provided by the window manager. This encompasses things such as the menus, tool bars, help systems, and an increasing number of other additional support services. There are two main desktop options available: KDE and Gnome. The major distributions offer one or both of these. Ubuntu uses the Gnome desktop, but is also available in the Kubuntu KDE desktop flavour.

So why all the layers (onion boy)? Each of these layers of software communicates with the others in well defined, standard ways. So as long as the software used in each of the layers sticks to the rules, different programs can be used in any layer. This is one of the key strengths of Linux: through careful choice of software in each layer, it's possible to create a Linux system that will run on a pocket sized mobile phone, or one that Dreamworks studios can use to make the next Shrek movie (oh please no! not a Shrek IV), or even more than 75% of the world's top 500 supercomputers (just thought I ought to include this stat, even though they're probably not running the GUI for the most part).

How does that compare to Windows? Windows has a kernel which is similar in function to the Linux kernel, but the layers above that are not so well defined. Although Windows comes in a number of flavours, with a wide variety of ways to pay for it, if you are not happy with the way that it works, well, it's kind of tough. You can't choose a different window manager that better suits your purposes; you get what's been chosen for you. You can't even write an improvement you need, since it's all proprietary and there's no way to know how all the programs inside Windows talk to each other. That said, to be fair, XP is pretty good on a current PC, even if Vista is a resource hungry monster, and there's a Windows version that competes in the phone arena.

The web has masses of information about Linux; here are some links I've found if you want to look a little further into this stuff.

Linux kernel - Wikipedia article
Shell / CLI
X11 - Wikipedia article
Window Managers

Tux Paint, the amazing children's drawing program

What my youngest seems to spend most time on, on her eldest sister's laptop, besides the CBBC website and other online games sites for children, is Windows Paint.

We all know it's pretty rubbish, but it's the only piece of art software that's lasted. The others are either too complicated to get to grips with or too limited to do any more than Paint can, just with a few stencils.

So I was looking for a Linux alternative. What I found was 'Tux Paint' (another penguin themed piece of Linux software). This is a super little program, not at all intimidating for small children with its child optimised user interface. It has more functional art tools than many of the more grown up 'sophisticated' programs. The 'Magic' button reveals a wide range of clever tools. Small kids will enjoy the noises it makes as you paint (actually, I don't know about small kids, I thought that was fun myself), and older kids will appreciate the wide variety of tools available to play with.

Tux Paint has also been designed with education in mind; it has an accompanying program that allows it to be configured for really young children, so that more functionality can be revealed as a child is ready for it.

Even if you are not a kid it's worth a look. To download it in Ubuntu just go to the 'Add/Remove Applications' application and search for tux paint.

You can read all about it at http://www.tuxpaint.org/

And if you haven't made the leap to Linux yet, there's no need to feel left out, as it's available in Microsoft and Apple flavours too.

Monday 10 March 2008

Beware: not all up to date wireless cards work

Enthused by the relative ease of configuring wireless on the 1 year old PC, I bought a new Belkin WiFi card for the 7 year old Celeron PC. Big mistake! It seems the latest Belkin wireless card (F5D7000uk) has a chipset that is not currently supported, even with ndiswrapper. After burning a few fruitless hours on this card, I abandoned all hope for it. After all, when all the search results on the Internet come up with fraught posts about the card that remain unanswered, it's time to take the hint.

So I ended up switching the card with the older Belkin card from my daughter's XP machine. This card was based upon an earlier chipset, so I thought I ought to have a better chance; the Linux community will have had a crack at it. This card reported as 'Belkin 54g Wireless Desktop Network Card (F5D7000) Rev 03', which has a BCM4306 chipset. This led me into a false sense of hope, as the standard wifi driver in Ubuntu is called bcm43xx. It didn't work though. So I followed the same instructions as previously to use ndiswrapper with the Windows driver from the supplied disk, still with no success. Then I tried the drivers from the chip manufacturer; again no joy. Eventually I found this post in the 'tutorials and tips' section of the Ubuntu forum, which leads you through building the correct version of ndiswrapper from the source code, adding the correct Windows driver and changing the configuration to use the new driver. This worked first time. There are some dedicated people out there who put real effort into making Linux work.

In conclusion: if you don't already have a wireless card, order one from a Linux specialist such as the guys at linuxemporium.co.uk and save yourself the trouble.

Sunday 9 March 2008

Setting up wireless networking (even when it doesn't work first time)

The network cards arrived, but I'd stupidly ordered PCMCIA cards rather than PCI cards. That's what happens when you order things on-line late at night when you really ought to be going to bed.

So, to avoid further disappointment, I took a trip to the local computer store, only to find that the only card they had was another Belkin card. So I left it and went to the local electrical goods emporium, who "do sell wireless cards, but they're out of stock". Uh, what's happening? Has there been a run on wireless in our town?

After an hour or so I calmed down and applied logic to the problem. There must be hundreds of thousands of these cards out there, and there's definitely a big community of Linux hackers out there, someone must have a driver that'll work.

This is what my research turned up: the built in wireless network device programs (drivers) in Ubuntu support a wide range of cards, so cards based on many of the currently common chipsets will work straight off. If your card is not recognised by default, it's likely that you will need to get an alternative driver. The first thing I had to do was to determine the chipset on the card.

To find the chipset I pulled the card from the machine and noted the model number on the board, so as to look it up on the Internet. Don't do this; as I found out later, you don't need to. Linux has a handy little tool that will tell you which card you have. Opening a console (terminal) window and typing the lspci command (list PCI) will show a list of all the hardware connected to the PCI bus, which will include both motherboard devices and any cards installed. From this line in the list, I could confirm the card's chipset:

03:06.0 Ethernet controller: Marvell Technology Group Ltd. 88w8335 [Libertas] 802.11b/g Wireless (rev 03)

Having found the card I was able to check it against the WirelessCardsSupported list here. I was right, it's not supported by the default install. So the hunt was on for a driver. There are no native Linux drivers for that chipset, that I can find, but as it turns out that's still not the end of the line. The guys who write software for Linux are clever; they've written a piece of software that wraps around a Windows wireless driver and makes it work with the Linux kernel. It's called Ndiswrapper, and it can be used if the Linux kernel has been compiled with the correct support, which in the case of Ubuntu it has. Following the Community Ubuntu Documentation Ndiswrapper instructions had the card working within 10 minutes. That page refers to a list of tested Windows drivers for the various chipsets, which you can download. You then follow instructions for disabling the current wireless driver, and installing Ndiswrapper with your Windows driver.
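Boiled down, the Ndiswrapper steps look something like this. The package name is the one I recall for this Ubuntu release, and the .inf file name is only an example, so do follow the documentation page for your own chipset's driver:

```shell
sudo apt-get install ndiswrapper-utils        # the wrapper tools (package name may vary by release)
sudo ndiswrapper -i bcmwl5.inf                # install the Windows driver file (example name)
ndiswrapper -l                                # list drivers; should report the hardware as present
sudo modprobe ndiswrapper                     # load the kernel module now
echo ndiswrapper | sudo tee -a /etc/modules   # and have it loaded on every boot
```

The last line is why the card keeps working after a reboot; /etc/modules lists modules to load at startup.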

It's this kind of detailed support that's making Ubuntu a workable alternative to Windows.

Saturday 8 March 2008

There's so much to post (so this is a meta post)

Wow, I sat down to post a blog entry, but don't know where to start. There is so much I want to write up and probably not enough time to do it. When you are at the start of the learning curve it's amazing how much new stuff you hit in a few evenings of playing.

My initial aim in starting this blog was just to record my experience, but after trying to talk about Linux to my son I found that there is so much to explain. Don't get me wrong here, it's not that you need to be a rocket scientist to use it; it's just that if you understand a little of what it is, and have a basic understanding of how it works, you'll be able to get more out of it when using it.

Also there are so many great Linux 'How to ...' resources out on the web, and much of it is of high quality, that I'm unlikely to be able to add a great deal to that. There's enough information to give you information overload. Often though there is not enough of the soft explanation as to why you are doing a particular step or what it does (everyone out there seems to assume you know what you're doing).

So what I'll be doing is explaining some basics which may help, putting things in context where I can and pointing out where I've found good resources. I'll try not to assume any expertise on the reader's part where possible.

I'm now pretty sure that as far as desktop Linux for the ordinary end user goes, Ubuntu leads the way. So for the most part this blog will be covering Ubuntu.

One thing quickly becoming apparent with the blog is that it is not linear. It won't be something that can be read from start to end (although I've added a go to the start button at the bottom of the page). So I'm going to have to title entries carefully so that it can be navigated more easily. Entries can also be retrieved by the 'Labels', which are listed on the right panel (this is labelled as a meta-post since it's a post about the blog). Therefore all the topics I wanted to post in this entry will be posted separately as I get the time to write them up. Those are:
  • Setting up wireless networking (even when it doesn't work first time)
  • A simple overview of Linux architecture
  • Getting Software and packages; its really not like Windows
  • What's the command console
  • What about anti-virus?
  • Editing files
  • Using on-line resources for advice, and precautions
  • Tux Paint, the amazing children's drawing program
  • 3D desktop effects - Compiz-fusion (Beryl)
  • MSN Addicts are supported (another win)!
  • Playing DVDs on Ubuntu
  • Why have we got Linux (and for free too)?

Wednesday 5 March 2008

Playing with Small Linux Distributions

Last week I started looking into resurrecting the old PC that I was given in return for fixing a Windows laptop, the goal being to provide a workable PC for my 8 year old daughter to use. It's a 2001 Celeron with 128MB of memory, a 20GB hard drive and a non-functional mouse port.

Older, less capable machines do not have enough resources (processor speed and memory) to run one of the big Linux distributions, such as Ubuntu or Mandriva. The old Celeron just doesn't have enough memory for this at 128MB (it wasn't running XP Home in a usable manner either). However, there are a number of small distributions designed to run on old hardware, so I thought I'd give them a try.

I tried both DSL (Damn Small Linux) and Puppy Linux. Both of these distributions run from a live CD and come with enough tools to install to the hard drive or a USB stick (which I've yet to try).

First DSL
I downloaded it as an ISO disk image and burned it to a CD-ROM. DSL is small enough that it will fit on a little credit card sized CD-ROM, in which form it can be bought for a nominal amount from here. It's claimed DSL will run on as little as a 486DX with 16MB of RAM, and will run entirely in RAM if you have 128MB or more.

DSL booted from the disc, and I was immediately impressed that it found the USB mouse with no trouble. Every time it booted from the CD it claimed that it had been passed an unknown video mode before proceeding. It seemed unable to decide for itself what console resolution to display, which was pretty irrelevant since I was planning to use it in GUI (graphical user interface) mode, so just ignoring this and pressing enter worked for me. The screen resolution once running in the GUI was also an issue. Both the i810 chipset on the motherboard and the monitor were capable of 1280 x 1024, but the best mode I could actually coax out of the system was 800 x 600; even 1024 x 800 would have sufficed.

Having loaded the system there was a choice of two flavours of desktop (or window manager): JWM (Joe's Window Manager) or Fluxbox. These both had quite a different look and feel to Windows. For a start there was no Start button; right click anywhere on the screen and you get the equivalent of the start menu. One impressive feature of both window managers was the ability to handle multiple desktops, which enables you to have different applications open on different desktops, then switch between them (not something you can do out of the box on a Microsoft system, but common to most Linux GUIs). Of the two I preferred Fluxbox, though JWM was closer to Windows.

Although only about 50MB in size, DSL has an impressive list of applications, enabling you to do quite a lot with just the base distribution. I was pleased and surprised to see the Firefox browser there, apparently configured to use less space than usual. Using Firefox on this 7 year old machine was as snappy as on my 2 year old laptop, very impressive. Additional packaged applications can be downloaded using MyDSL. I found MyDSL a bit clunky to use, but it worked. It is possible to create a new disk using DSL that includes the downloaded applications (which I didn't try).

I decided to install it to the hard drive and see if I could fix the video problem. I used the cfdisk tool to repartition the drive and create partitions with Linux file systems to install to. Two partitions were created: a small 128MB partition for swap (the same size as the system RAM) and a larger one using the rest of the drive, flagged bootable, to install on. The install ran in a console window, asked a few simple questions and took only a few minutes to run.
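For the curious, the formatting side of those steps can be rehearsed safely on ordinary files rather than a real drive. This is just a sketch of the idea, not the DSL installer's exact commands: the file names and 8MB sizes are stand-ins, whereas on the real machine the swap partition was 128MB and the root partition took the rest of the disk.

```shell
# A safe rehearsal of the two-partition layout using image files instead of
# real partitions. Sizes here are small stand-ins.
dd if=/dev/zero of=swap.img bs=1M count=8 2>/dev/null   # stand-in for the swap partition
dd if=/dev/zero of=root.img bs=1M count=8 2>/dev/null   # stand-in for the root partition
mkswap swap.img        # format as Linux swap space
mke2fs -F -q root.img  # format as an ext2 Linux file system (-F: target is a plain file)
```

On a real install the same mkswap/mke2fs work is pointed at the partitions cfdisk created, which is why getting the partition table right first matters.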

After installation to the hard drive it booted pretty much as before, but loaded a little faster. I was still not able to fix the video resolution, though since I'm a Linux novice that's no surprise. I'm sure it could be done by downloading, installing and configuring alternative software to enable the higher resolution, but I'm really not going to invest the time in doing that since my current level of knowledge is not adequate.

Puppy Linux
So then I downloaded and tried Puppy Linux; after all, it looks cute and the machine will be for an 8-year-old girl. It's claimed that Puppy will run with a 586 CPU and 32MB of RAM. Puppy started up fine from the CD-ROM, though initially it had the same screen resolution issue as DSL. A little reading in the Puppy FAQ revealed that Puppy has an alternative, more sophisticated X server (the bit of software that does the graphics and mouse stuff) called Xorg, as well as the lightweight Xvesa I had been using in DSL. Switching to Xorg fixed the screen resolution problem.

Puppy also uses the JWM desktop, which they have configured to look and feel a bit more like Windows 95 (but with a picture of a small dog on the desktop).

There's a different set of software with Puppy than with DSL. Puppy notably includes OpenOffice, the Linux equivalent of Microsoft Office, which will read and write MS Office format files. Unfortunately for my purposes it did not include Firefox by default. Puppy, as with DSL, offers additional software packaged and tested to run in Puppy, supplemented by user-supplied packages. Additionally, after installing the Debian installer it's possible to access an even wider range of software.

I installed Puppy to the hard drive too. It offered me the opportunity to install alongside DSL, but I made it replace it. The install tool worked smoothly and Puppy booted OK from the hard disk.

Conclusions
Both of these distributions are remarkable considering their small size, and they certainly run applications at a good speed. Overall I think Puppy would be the better choice of the two, as it's more sophisticated in most respects. DSL is more of a geek's OS: there is elegance in its simplicity (particularly when using Fluxbox), with the live system status imprinted on the desktop and the transparent command console windows. Puppy is a user's OS; it has some of the rough edges knocked off and has a good office suite. When run from a CD-ROM both distributions can store files and configuration on the hard drive or a USB stick, making them practical to use.

As far as my target customer for the machine goes, I think Puppy just feels a little old-fashioned, though it's probably just that Windows 95 theme. Also, it runs everything as the root user, so it would be easier to break. So if I'm to pass this machine off as an alternative to other kids' XP machines, it's going to have to scrub up a little better.

With some effort I could probably make Puppy fit the bill, but I have a Plan B to try first. For under £10 (eBay prices) I can add 256MB of RAM, which will enable me to install Ubuntu (which is a slick, shiny thing of beauty compared to these two). If this works there are two advantages: it looks nice, with a more Windows-like desktop, and I'll only have one distribution to support. It'll also be interesting to compare the 7-year-old machine with the 1-year-old machine on the same OS.

I'm not sure if I'll use DSL or Puppy again; I might if more limited hardware comes my way.

Arrgh, where's the wireless +<( Ah here it is (^__^)

When we bought a machine for my son I had XP Media Center installed rather than Vista Home. Having read a little about how Vista is designed, and some of the implications of this, it was a bit of a no-brainer, since it cost no more to have a less crippled operating system with more capability. However, there may come a time when Windows XP is no longer supported and this machine is still viable hardware. Hence making it dual boot seemed like a good idea: both my son and I can get used to a Linux OS before making any commitment to jump.

As I said, the install went smoothly. Whilst installing Ubuntu I was politely informed that there was no network connection, so software updates would have to be carried out later. This was no real surprise, since although it was a new machine it had an old Belkin wireless card, which was pretty marginal even in Windows.

After browsing the Internet for a bit, it seems the advice is to go for a standard card from a big-name vendor such as Linksys or D-Link. I had a hunt for a card online, and strangely enough the prices of wireless network cards on eBay are only slightly lower than buying new from a reputable online dealer.

Well, if I was going to buy new, I decided to make sure the card I was buying was supported by the major Linux distributions. After a short search I came upon thelinuxemporium, who sell both software and hardware, including Linux-supported PCI, PCMCIA and USB wireless adaptors. I went for the Comtrend RT2500 54Mbps Wireless PCMCIA card, which they say "works 'out-of-the-box' in Ubuntu 7.04 Feisty and 7.10 Gutsy" and costs only £10. Had I bought the more expensive Edimax card, which they offer at £19.45 and which is tested against more versions of Linux, I would have been entitled to free support too. At that price I bought one for the little old Celeron box too.

I'm still waiting with anticipation for the wireless cards, but it's only been two days.

-----------------------
p.s. It seems that I didn't press the final button on the transaction; all fixed now, and I'm told the wireless cards should arrive on Friday.

Tuesday 4 March 2008

Why are there all these different versions of Linux?

For anyone familiar only with MS Windows who's venturing into the world of Linux, there are a confusing number of versions available. Here's some of the reasoning behind this.

An operating system is built of layers of software, one on top of another. At the heart of the system is the kernel. The kernel is a program that runs all the time, managing the system: it decides which programs are allowed to run and when, and it manages access to all the hardware (devices) in the computer on their behalf. You as the user never really interact directly with the kernel; other programs manage your interaction with the computer using the services the kernel provides. Even on systems without a graphical interface, the command console is provided by a separate program called a 'shell'. Further layers of software provide the nice graphical windows interface, with all the pointy-clicky mouse stuff, for the user and their programs. All this additional software comes in a variety of flavours and versions, with varying degrees of sophistication and resource requirements.

Linux is just the kernel of the operating system; when you install Linux on a PC you install a 'distribution', which includes all the other required pieces of software and configuration files.

Most distributions include a number of the more popular applications in the installation, such as a text editor, file manager, web browser, paint program, word processor and so on.

The software that's initially installed with the distribution will depend on the intended use of the operating system and the likely preferences of the intended user: does the system require a graphical interface, for instance, or just a text-based command shell for a simple application?

Fortunately for us there are some smart and dedicated people out there who build and manage these Linux distributions, keeping them up to date, and doing all the hard work of configuring the software to work together.

Monday 3 March 2008

Installing Ubuntu into VMware and adding VMware tools

I'm using VMware Server because it's free and I know how to use it. This was my first Linux install into VMware. I decided to use Ubuntu because it has a nice easy installer, and it's in the list of target operating systems in the VMware new machine wizard.

It was pretty straightforward, with everything pretty much 'click next' all the way through. After about 15 minutes I had a working Ubuntu machine.

On restarting, the machine asked to be allowed to update itself. I let it do this; it downloaded a whole bunch of stuff from the Ubuntu site, then confirmed it was up to date.

It was quickly obvious that there were a few quirks to iron out. The screen resolution was set to 1.5 times the size of my screen and the mouse was pretty jerky. This is expected behaviour for a new VM without VMware Tools installed, so I set about installing the VMware Tools. For Windows VMs this is a single click on the VMware menu and the Windows installer does the rest. For Linux it's not so simple: all VMware does is mount the software as a DVD-ROM on the desktop and open it in a browser window, showing an 'RPM package' and a 'tar archive'.

Either of these could probably be used to install from, but as I don't really know what I'm doing yet I decided to look for instructions inside the tar archive. I used the right-click 'Archive Manager' option to look inside the archive for the install instructions, then followed them.

The actual installation involved unpacking the archive to the /tmp directory, then following the install instructions to run the install script. This was again pretty straightforward. One service failed to install, but the installer said this would be fine as long as I didn't want to use VMware shared folders, which I don't, so that's OK for now. During installation of the tools I was able to select a sensible screen resolution from a long list. After a restart of the VM the mouse was working fine too.
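The unpack-and-run pattern above is worth knowing, as a lot of Linux software ships this way. The sketch below rehearses it with a stand-in archive so it can be tried anywhere; on the real VM the archive is the VMwareTools tarball from the mounted Tools DVD, the directory it unpacks to is vmware-tools-distrib, and the install script inside (a Perl script) needs to be run as root.

```shell
# Build a stand-in archive first, just so the pattern below has something to act on.
mkdir tools-distrib
printf '#!/bin/sh\necho "installer ran"\n' > tools-distrib/install.pl
chmod +x tools-distrib/install.pl
tar czf tools.tar.gz tools-distrib && rm -r tools-distrib

# The actual pattern: unpack, then run the install script inside.
tar xzf tools.tar.gz          # unpack the archive (to /tmp on the VM)
./tools-distrib/install.pl    # run the install script it contains (with sudo on the VM)
```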

A quick test and it was all systems go apart from the sound, which I'll cover in a separate post.

Fixing the sound in Ubuntu on VMware

Installing Ubuntu on VMware went OK; the only thing was, there was no sound. When I clicked the sound icon to turn on the sound, the OS reported that there was no sound hardware.

Investigating the issue, I found that the virtual machine configuration VMware Server supplied for Ubuntu had no reference to sound devices. This was quickly fixed by adding the following two lines to the .vmx file holding the machine definition:
sound.present = "TRUE"
sound.virtualDev = "es1371"
These lines were added just after the definition of the Ethernet device. On rebooting the virtual machine, the sound was enabled.
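If you'd rather not open the file in an editor, the same fix can be scripted. The sketch below rehearses it on a sample file (point VMX at the real machine definition in the VM's directory instead); the grep guard stops the lines being appended twice if you run it again.

```shell
# Rehearsal on a sample .vmx file; the real one lives in the VM's directory.
VMX=ubuntu.vmx
printf 'ethernet0.present = "TRUE"\n' > "$VMX"   # stand-in for the existing definition
grep -q '^sound.present' "$VMX" || cat >> "$VMX" <<'EOF'
sound.present = "TRUE"
sound.virtualDev = "es1371"
EOF
```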

My guess is that VMware defaults are designed for serious server use rather than playing with desktops, hence the lack of a default sound card in the configuration.

Where to run Linux?

OK, I've decided to have a go with Linux, but where should I install it and what options are there, bearing in mind that I don't really want to burn all my bridges with Windows (yet)?

There are a number of options:
  • Run Linux from a Live CD (leaves your Windows machine untouched)
  • Install on a second separate machine (make mine a Linux)
  • Make my machine a dual boot machine
  • Run inside Windows
I'm having a punt at all of these, so I'll note a bit more on each option.

Live CD
This is straightforward enough as long as the machine's BIOS is set to boot from CD-ROM before the hard drive. If it doesn't work straight off, most live CDs offer some command line options to try. Based on a couple of runs at this, I've found the main drawback with a live CD is that if your hardware is not a good match for the default configuration on the CD, you can't get full use of the system. For instance, I could not get the full screen resolution when running a live CD. Also, many of the live CDs do not include the codecs for MP3 (since the MP3 codecs are not truly open source), so you can't listen to music. Some time in the future I might have a go at running from a memory stick to see if that overcomes this limitation.

Install on a separate machine
This may be the simplest, safest option initially, until you get comfortable with Linux. Not only do you not risk your Windows machine, you also have the option of using your Windows machine to browse the Internet for support if you break Linux.

Dual Boot machine
If you have enough space on your hard drive, or can add a second one, this may be the best option. The Ubuntu installer holds your hand very well when making a machine dual boot. More about this in a later post.

Running Inside Windows
There are a number of ways to approach this. The two key options are running inside a virtual machine, or running a distribution designed to run inside an existing Windows installation. From what I've been able to make out, the distributions designed to run within Windows are modified versions of Linux where the kernel runs within a Windows service and the storage is all held in one big file in the Windows file system. I've decided to use VMware to run a Linux virtual machine to play with this idea. I'm going to avoid the 'modified for Windows' distributions as they are not standard (if there is such a thing with Linux) and I want to learn about the OS in as real a situation as I can. I've chosen VMware as I'm familiar with it from running Windows Server in it; VMware also appears to be the best option when the host OS is Windows, and, last but not least, it's free.


My first practical use of Linux

While looking into whether Linux could read NTFS file systems (don't ask why, I sometimes just get really curious about stuff like that) I came across Trinity Rescue Kit, a Linux live CD with a whole bunch of tools for sorting out dead PCs and getting the data off them. A free tool set like this seemed too good an offer to pass up, so I downloaded the ISO disk image for version 3.2 (the latest stable release) and burnt a disc, on the basis that I might find a use for it some day, as people are often asking me to sort out their dead PCs.

As it happens, I had occasion to use it within a couple of days. I had to set up a demo system at a remote site, which entailed setting up VMware, installing a couple of virtual machines and ensuring all the clients were configured. Because the training room where all this was to run was unexpectedly unavailable in the morning, all I had to work with was a Windows server standing alone from the network. This should have been no problem: all I needed to do to get under way was copy the software and VM images onto the machine from a USB drive. However, Windows refused to talk to my USB drive, and though I tried a number of things to fix it, I really needed to get started. At this point I remembered the Trinity Rescue Kit; its NTFS support would enable me to do this. Sure enough, it booted first time and enabled me to mount all the drives with read/write access, including the USB hard drive, with a single 'mountallfs -g' command. After that I was able to move all the stuff across relatively quickly with the Linux cp command.
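For the record, the working part of that session boiled down to two commands. The mount points below are examples of the kind TRK assigns (not a transcript of the actual session), and the cp pattern is rehearsed here on ordinary directories so it can be tried without a rescue CD.

```shell
# On the rescue CD the session was essentially:
#   mountallfs -g            # mount all detected file systems read/write
#   cp -rv /mnt0/demo /mnt1  # copy the demo material (mount points are examples)
# The same cp pattern, tried safely on ordinary directories:
mkdir -p usbdrive/demo serverdisk
echo "vm image" > usbdrive/demo/vm1.img
cp -rv usbdrive/demo serverdisk/   # -r copies directories recursively, -v lists each file
```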

So that was my first practical use of Linux. It saved me a couple of hours of hanging around waiting for a network connection, so that was pretty good.

What is a Linux live CD, and how do I run one?

Many Linux distributions have a live CD available. This is a whole operating system packed onto a CD (or DVD), which can be used directly from the CD without installing anything on your PC. Live CDs are used for a number of reasons:
  • Trying out Linux before committing to use it
  • Running a Linux install in a user friendly way
  • Using Linux tools to fix a machine or rescue data
  • Using a machine without changing any data on it
If you are lucky, all you need to do to run a live CD is put it in the drive and restart the machine; however, that may not be the case for all machines.

To use a live CD, the PC must be able to boot the operating system from CD-ROM rather than the hard drive. When a PC is switched on, a small piece of software on the motherboard (the BIOS) decides where to load the operating system from (typically the hard drive). On most PCs the operating system can be loaded from any storage device: floppy disk, hard drive, CD-ROM, DVD-ROM or even a memory stick. If the machine looks for the operating system on the internal hard drive before looking at the CD-ROM, the Linux live CD never gets a chance to load. To fix this you will need to change the boot sequence in the BIOS settings. The BIOS settings can usually be accessed by pressing F2 while the machine is starting up (it may be another key, in which case look at the manual for the machine).

With some Linux distributions it's even possible to boot the operating system from a memory stick. This means you can have your own operating system and documents on a USB stick and take your computer around with you. If I try this I'll blog about it here.

However, it's worth noting that not everything always works. This was my experience: laptops have not been a great success. My Acer laptop can't boot from CD since I've lost the BIOS admin password (set in an attempt to teenager-proof the machine) and so I can't change the settings (my fault, and very stupid on my part). The Fujitsu Amilo 2010 laptop will not run the Linux windowing environment because its graphics chip is not compatible with the standard Ubuntu live CD. The old 128MB machine has not enough memory to allow the Ubuntu windowing environment to start. However, my son's PC worked fine first time, and the old 128MB PC booted the Damn Small Linux live CD without problems.

Choosing your Linux for a home PC (Ubuntu)

Linux comes in many types (distributions), so you need to choose which one to use. For Linux this is both a strength and a weakness; a strength because there is a Linux operating system configured to suit most purposes but a weakness because it is one more hoop to jump through for a new user.

This is my initial recommendation: for a home user the choice can be simplified. If you have a good enough machine to run Ubuntu, then use Ubuntu. Provided you have a fairly standard machine, Ubuntu is easier to install than Windows, and installs faster. If you have an older machine (less than 256MB of RAM and a slow processor) that just can't cope with running Ubuntu, then you need to look for a small, lightweight distribution, which I'll cover in a later post.

As with pretty much any Linux distribution Ubuntu is free!

It comes on CD, which you can either download for free, buy online, or request for free (if you can wait 6-10 weeks). Downloading is for most people the best option: you download an '.iso' file which, if you have CD burner software on your PC, will get burned to a real CD when you double-click on it.
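One step worth adding before burning is checking that the download isn't corrupted; Ubuntu publishes a file of MD5 checksums alongside each release. The sketch below fakes both files so the mechanics can be tried anywhere (the ISO file name is an example); on a real download, the MD5SUMS file comes from the Ubuntu download site, not from your own machine.

```shell
# Stand-ins for the real files: on a real system you download both the .iso
# and the MD5SUMS file, rather than generating the checksum yourself.
echo "pretend iso contents" > ubuntu-7.10-desktop-i386.iso
md5sum ubuntu-7.10-desktop-i386.iso > MD5SUMS
md5sum -c MD5SUMS   # reports '<file name>: OK' when the checksum matches
```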

Ubuntu is supplied as a live CD. Once you have your CD you can try out Ubuntu Linux before committing to install it on your PC. This is good, because if it runs OK from the CD it's pretty certain to be OK when you install it.

Venture Into Linux - Why am I doing this?

There's a whole bunch of reasons that have pushed me to look into Linux in a little more depth. Whilst doing this I will blog my experience, for the following reasons:
  • Writing it down will consolidate my knowledge
  • It may enable others to use my experience to ease their introduction to Linux
  • It gives something back to the Linux community, albeit in a small way (it's the least I can do, they're giving me all this free software)
As an IT professional for the last 13 years or so, I've watched Linux come more to the fore. It seems to me that it's reaching a critical mass where I need to understand it in more depth to remain credible. It's no longer just a hobby horse for the command-line UNIX jocks (no offence intended, but I guess they're not likely to be reading this); Linux distributions are now mentioned alongside Windows and Unix in the supported platforms for serious software.

At home, I seem to be doing increasing amounts of IT service work supplying and maintaining computers for the kids, and teenage daughters are demanding clients. Through using excellent open source and free software, I've managed to rein in the permanently infected Windows PC to a manageable solution to date. I've even managed to persuade them to use OpenOffice for their homework, despite the schools having an almost Microsoft-only stance.

So I'm kicking off several projects to get my head around Linux, the idea being a gentle introduction through using desktops on the home PCs, then progressing to playing with server stuff and some more serious applications. With this in mind, I have a number of projects to kick off this learning curve.
  1. Old PC with limited resources (Celeron 800MHz and 128MB of RAM). Now the youngest is of an age where she'll need a computer, and the machine I have to hand is a bit old and runs XP like a dog, so let's try a Linux desktop.
  2. Dual-core AMD machine with 2GB of memory and some bells & whistles. This is my son's machine; he's seen pictures of high-end Linux on YouTube and wants his XP machine to do that too. So we're making it dual boot.
  3. Linux live CD tools. Live CDs will boot up and run Linux without messing up your current install. This should let me play with a bunch of different distributions of Linux.
  4. Linux running in VMware on XP. I use VMware, which allows one operating system to run inside another by creating virtual PCs. This will give me a playground to try out stuff without investing in any more hardware.
So that's the start point. This blog is starting a little late, as I've been looking into this for a couple of weekends now, so there'll be a bunch of initial catch-up posts to follow.