next up previous contents
Next: 5 Security Is Not Up: 10 Reasons Why I Previous: 3 Linux Was Multi-User   Contents


4 The Linux Software Deployment Model Favors Performance

When you install any version of Windows on a computer, one conspicuous software tool is missing: a compiler. The software deployment model for Windows-based systems is solely the deployment of executable programs. In addition to the flexibility considerations mentioned in Section 1: Linux Is Open Source, this means software must be 'compatible' with the lowest common denominator for the target audience of the software. This lowest common denominator could be the Windows version, CPU type, memory configuration, video/graphics capability, etc. For example, a user running an Athlon XP or Pentium 4 processor is generally forced to use software compiled for a Pentium (or perhaps a Pentium II). This in turn means the user cannot upgrade to new and improved libraries and re-compile for better performance (one MS solution to this particular issue is the COM model, as used with DirectX; while reasonable for a graphics library, it is hard to deny this is an extremely complicated approach for simple programs).

To provide a point of contrast, consider the installation of Windows software obtained from the Internet. One would perform the following steps:

  1. Download the binary package; usually many files in a single, compressed distribution file.
  2. If required, uncompress the package into a working directory. This requires the proper uncompression utility, which is not typically included with Windows.
  3. In the directory into which the package was uncompressed, double-click the file setup.exe.
  4. Respond to any input the package installation program requires. This can be easy or complicated, depending on the package.
With this binary-only deployment model, the software vendor chooses the 'minimum system requirements' and compiles for that target. More advanced systems gain little advantage. For an extreme example, imagine you need a program to do a task, but it was written to run on an 80486 processor. While your new Pentium 4 will run the software, a large number of the new, highly advanced features of the Pentium 4 go unused. Couple this with the fact that the user cannot customize the software for a particular use (because the source code is not included and Windows does not ship with a compiler), and one is forced to conclude that binary-only deployment is not very flexible.
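As a concrete illustration of this trade-off, the same source file can be compiled for different CPU targets with gcc. The -march values below are real gcc flags for these processors, but prog.c is a hypothetical file name used only for illustration:

```shell
# Illustrative gcc invocations; prog.c is a hypothetical source file.
MARCH=pentium4
# gcc -O2 -march=i486     -o prog prog.c   # lowest common denominator: runs on any 486 or later
# gcc -O2 -march="$MARCH" -o prog prog.c   # exploits Pentium 4 features (SSE2, etc.)
```

A binary-only vendor must ship something like the first build; a user compiling from source gets the second for free.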

Linux, on the other hand, generally ships and gets installed with a compiler. Many software packages for Linux are therefore available both as binary packages and as source code. Users wanting basic features and 'one-click' installation get them; those wanting more flexibility, or higher performance, can compile the code themselves, optimized for their machines. It should be pointed out that software installation is one aspect of Linux that people often criticize; personally, I find package installation as easy as that on Windows. Command line tools such as rpm or apt-get are straightforward, and for those preferring GUI tools, rpmdrake and KPackage are two examples that work well. Further, with KDE and Konqueror, 'one-click' on a .rpm file will begin the installation process, much like clicking a setup.exe file in Windows. Installation of binary software using rpm on Linux can be done using the following steps (compare to the Windows steps above):

  1. Download the binary package; usually many files in a single, compressed distribution file.
  2. In the directory into which the package was downloaded, double-click the .rpm file.
  3. Respond to any input the package installation program requires. This can be easy or complicated, depending on the package.
Clearly, this is no more complicated than installing software on Windows.
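For those preferring the command line, the same installation can be sketched with rpm directly. The package file name below is made up for illustration; the flags shown (-i install, -U upgrade, -v verbose, -h hash-mark progress, -qi query info) are standard rpm options:

```shell
# Hypothetical package file name, for illustration only.
PKG=someapp-1.2-3.i386.rpm
# Install it, verbosely, with hash-mark progress (as root):
#   rpm -ivh "$PKG"
# Or upgrade in place if an older version is already installed:
#   rpm -Uvh "$PKG"
# Afterwards, query the package database to confirm:
#   rpm -qi someapp
```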

I should also emphasize that compiling from source is not hard in many instances. I often find myself preferring to install from source, since it generally results in a better performing program. Custom compilation also allows the user to compile with certain features enabled or disabled; if there is a feature of a specific application that you know you will never use, you can compile without it, creating a smaller and possibly faster application. With modern compilation tools, compiling from source is typically characterized by the following steps:

  1. Download the source code, which is usually many files in a single, compressed file for distribution.
  2. If required, uncompress the package into a working directory. This requires the proper uncompression utility, which is typically included with Linux.
  3. In the source directory (where the files were written after uncompression), read the README and INSTALL files for specific notes and requirements.
  4. Using the Command Line Interface (called a command prompt in Windows-speak), issue the command: ./configure
  5. Again at the Command Line Interface, issue the command: make
  6. At the Command Line Interface, become the superuser: su
  7. At the Command Line Interface, issue the command: make install
Steps 1-2 are essentially the same as for binary distribution on Windows. While compilation may seem more complicated, it is generally quite easy: you download, uncompress, check README and INSTALL for specifics, then do ./configure, make, and make install (as root). Compilation itself can take a while (a few seconds to hours, depending on the package and the speed of the computer), so there is a trade-off for the higher-performance, more flexible option.
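The steps above can be sketched as a shell session. Since I cannot assume a particular package here, the tarball is created locally as a stand-in for a downloaded file and 'fooapp' is a made-up package name; for a real package, Steps 4-7 are the commented commands near the end:

```shell
set -e
# Stand-in for Step 1: create a tiny source tarball locally instead of
# downloading one ('fooapp' is a made-up package name).
mkdir -p demo/fooapp-1.0
printf '#!/bin/sh\necho configured\n' > demo/fooapp-1.0/configure
chmod +x demo/fooapp-1.0/configure
tar -C demo -czf fooapp-1.0.tar.gz fooapp-1.0

# Step 2: uncompress into a working directory; tar's -z flag handles gzip.
mkdir -p build
tar -C build -xzf fooapp-1.0.tar.gz

# Step 3: read build/fooapp-1.0/README and INSTALL (omitted here).
# Steps 4-7 on a real package would be:
#   cd build/fooapp-1.0
#   ./configure
#   make
#   su -c 'make install'
build/fooapp-1.0/configure   # runs the stand-in configure script
```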

A key step here is Step 4, configuration. At this step, the Makefile containing the commands for the compiler is generated. Here one can fine-tune the compilation process: specify compiler optimization flags, enable optional features (or disable features you know you will never need), and so on. The configure script also probes details of your system and adjusts compiler settings accordingly. The cost of compiling your own, custom version of the software you download is relatively small and is far outweighed by the benefits you gain.
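A hypothetical configure invocation might look like the following; the available options vary from package to package (run ./configure --help in the source directory to list them), and the flags below are purely illustrative:

```shell
# Where to install the finished program; /usr/local is a common default.
PREFIX=/usr/local
# Illustrative invocation: set the install prefix, drop an unneeded
# feature, and pass optimization flags to the compiler.
#   ./configure --prefix="$PREFIX" \
#               --disable-debug \
#               CFLAGS="-O2 -march=pentium4"
```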

In summary, with software for Windows, you get what you get. Software vendors cannot assume that you have a compiler (and most do not want to supply source code anyway), so they must supply an executable program. So that they do not have to maintain many copies of the same program, they compile for the lowest-end system in their target market. Linux, on the other hand, gives the user the option to use pre-compiled binaries or to compile from source. This is much more powerful and fits the needs of a far broader spectrum of users.


John S. Riley, DSB Scientific Consulting