Monday, November 27, 2006

Free Linux Classes

LinuxBasics.org is an online community devoted to helping people learn to install and run Linux, and it has just announced free Linux classes. The course, "An Introduction to Linux Basics," aims to give beginners a basic understanding of Linux and how the system works.

Advanced Linux users will also find an opportunity to dig deeper into areas they have always wanted to know more about, or to fill gaps in their knowledge, according to their team.

The course's study guide will be an "LBook," an edited version of Introduction to Linux: A Hands on Guide by Machtelt Garrels, which is distributed under the GNU GPL, a free and open-source license.

Students will need to join the group's mailing list in order to participate in the course. The class, which will run for six months, opened last month and is available for anybody wishing to join. Students can learn at their own pace.

To join the mailing list click here: http://linuxbasics.org/cgi-bin/mailman/listinfo/qna/

To get started in the courses, click here: http://linuxbasics.org/course/start

LinuxBasics.org is Germany-based and was founded two years ago. In addition to the courses, the site provides tutorials and links to other sites that offer the information needed to install and use Linux. Also available are very "friendly" mailing lists for the questions that arise when people start using Linux, and an IRC (Internet Relay Chat) channel.

In their words: "This course is free (as in free beer). However, a goodwill contribution in the form of active participation, revisions, suggestions or ideas is appreciated." So what are you waiting for? Here's a chance to learn a different operating system that is virtually free of bugs, viruses, malware and spyware.

After all, if it's good enough for NASA (being a rocket scientist is not required), it's good enough for us.

Thursday, November 23, 2006

Young FrankenSteve



A cool video about Linux


Thursday, November 16, 2006

Linux Adoption, Powered by PC Power Users.

For a number of weeks now, I’ve been pondering exactly who chooses to migrate to Linux and perhaps even more importantly, why. Seriously, what is the motivating factor when it comes to making the move to a new OS? Generally speaking, it comes down to a need for a change.

Whether this stems from a desire to try something new, or from the fact that Vista is making people in Windows land very nervous, the fact remains that there is a relative flood of new users coming over to the Linux world hoping to find a more effective alternative to proprietary operating systems.

Linux Adoption, Powered by PC Power Users. We might like to think otherwise, but the "great migration" to Linux is generally being powered by advanced Windows users. These are people who are already comfortable enough with configuring their computers that the idea of opening up a shell prompt doesn't frighten them off easily.

This is not to say that beginners are not working from Linux boxes themselves, mind you. But in the end, most of the migrants will be switching thanks in part to the free ISOs (CD images) that are available for download from the various Linux distribution sites.

I personally believe that Windows users are fed up with the need to continuously upgrade their systems just to keep up with the problems of a proprietary OS. And now that Microsoft has all but shot itself in the foot, with any rogue application promising to send the affected PC into a "bluescreen," many end users need a break from this madness.

Making the Switch: Challenges. One hurdle that I’ve seen with a number of people working to make the switch to Linux is coming to grips with the fact that Linux is, in fact, quite different from what they are used to. Because so much of the Linux world is built on community efforts, the user interface and support for unusual hardware are not always as "plug-n-play" friendly as the migrating user might like.

Unfortunately, I still see many instances of forum posts where a recent "switcher" makes a plea for assistance, only to receive a short reply containing nothing but a URL to another thread. You know something - that was one of my biggest pet peeves when I first tried Red Hat a number of years ago, and from what I’ve seen, it's still happening often enough even to this day.

But in fairness to those posting these short responses on Linux support forums, it reflects a frustration that veteran users feel when beginners do not take the time to look for the answers first. The solution is clear communication: point users to sticky posts along with an explanation of how that post can help them. Everything considered, it makes for a fair compromise.

A Glimmer of Hope. Today, we have looked closely at who specifically is moving to Linux and why. Even though this may not seem too important to advanced users, I’d beg to differ: it matters a great deal to the future of Linux.

Some of these new Linux migrants could one day become a strong voice in the open source and Linux movement. The impressions they have today could very well shape the Linux distributions of tomorrow. I believe in my heart that it’s damaging to dismiss newer users who may not have a firm grasp on what it truly means to be a user of this fantastic operating system.

As long as we are able to maintain a balance between the needs of the user and the needs of the Linux community, I feel very strongly that it will indeed be the newcomer to Linux who decides the operating system's fate in the long run. Just look at the Ubuntu phenomenon - I rest my case.

by Matt Hartley

Monday, November 06, 2006

Happy hard drive failures, no more!

Everyone's worst nightmare: the normal, comforting hum of your computer is disturbed by clicking, pranging, banging... It happens to everyone because it's inevitable (hard drives are mechanical; as surely as a car will break down, your hard drive will fail eventually). However, no matter how often you see it, you never quite get used to it happening - the heartache of all the files you lose forever because you were "just about to back them up, honestly". This is not an article about how you can best avoid data loss or how to protect against your hard drive dying.

I have had a few hard drives die, and it's a painful period when you're without a functional computer. I, like many others, find it impossible to survive without computers and the internet, and the frustration of being forcibly cut off from the e-world is therefore one I endeavour to avoid. This led me on a quest for information, during which it suddenly dawned on me - you can't stop your hard drive dying (everyone knows that), but you can have a solid fallback in place, one that cannot die... So let me start from the beginning:

So, your hard drive shows signs of dying. You spend a few days running chkdsk, kneeling on the floor, cuddling your rig and listening for faults - and then it dies. When you're done crying you have to RMA the hard drive (if it's within warranty; if not, order another) and then wait for the new drive to arrive. Traditionally you would have to leave your computer alone and wander off, scared and confused, into the real world... Well, no longer: Live-CD environments are now so good that you can comfortably survive without a working hard drive if need be, provided you have some way of saving your files.

A Solid Survival Pack:

1) An up-to-date Live-CD environment such as the Knoppix live CD.

2) A secondary hard drive with a FAT32 partition OR a USB flash drive for saving your files (see the sketch after this list)

3) Backups of your work, documents, pictures, music etc for convenience.
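
To give a rough idea of item 2 in practice, here is a minimal sketch of saving files to a USB stick from a live session. It assumes the live environment gives you root access via sudo and that the stick shows up as /dev/sda1 - check with fdisk -l first, as the device name will vary, and ~/important-files is only a placeholder path:

sudo mkdir -p /mnt/usb
sudo mount -t vfat /dev/sda1 /mnt/usb    # FAT32 partitions use the vfat driver
cp -r ~/important-files /mnt/usb/        # copy whatever you can still read off the old drive
sudo umount /mnt/usb                     # unmount before unplugging the stick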

If you have these basic provisions you can manage quite well. You have Firefox for web browsing (the Ubuntu live-CD environment picks up DSL and cable broadband without issue), Evolution for email, GAIM for messaging, OpenOffice for your office needs (word processing, spreadsheets, etc.) and GIMP for image manipulation and editing... more than enough to let you "get by". This coincides nicely with the wealth of applications that run entirely on the web, removing the need to install some applications at all: web FTP clients, Meebo for messaging, and you can even edit images online. There are even Flash-based pseudo online operating systems that give you 1 GB of storage - so to say the world ends when your hard drive dies is a gross exaggeration.

The dark ages of being helpless before the god of hard drive failure are over; Live-CDs are the way to go in an emergency.

You can remain connected and in charge from a Live-CD environment in reasonable comfort - you can even continue work on your essays or reports while listening to music (streamed from the internet or otherwise). This is, of course, the ideal situation - unless the lack of dual-screen support leaves your face twisted in a ball of rage, unable to operate on a single screen. Or, if your computer doesn't matter to you and you can leave it unused for days on end (the very thought of it makes me shudder), then you may as well wait until you get a replacement hard drive. Otherwise, a Live-CD environment such as the ones offered by Ubuntu, Knoppix and Mepis is ideal for keeping you connected while your beloved drive is replaced.

Monday, October 30, 2006

Microsoft lures 'mom and pop' companies.

Microsoft released on Monday free business-management software aimed at either the smallest of small businesses or at painfully late adopters.

Microsoft Office Accounting Express 2007 is for "starting businesses and home-based businesses that currently use pen and paper or spreadsheets" to run their operations, according to the company's online FAQ.

The software is available for free download at Microsoft's IdeaWins Web site. It resembles the Microsoft Outlook e-mail client and integrates with other Microsoft Office software.

Link: http://www.ideawins.com/faq.html#q15

Functions include creating invoices, quotes, receipts and customizable reports, as well as expense tracking, payroll and tax processing, credit reporting, online sales and monitoring employee time. Office Accounting Express 2007 users can also import data from Intuit QuickBooks, Microsoft Money and Microsoft Office Excel, and they can use Office Live to share information with an accountant.

Office Accounting Express 2007 also links to third parties that offer additional fee-based services, including ADP for payroll, eBay for online sales, Equifax for credit checks and PayPal for online payments.

The new software is another example of how Microsoft has been forced to change its business model as more software for small-business owners becomes freely available.

Office Accounting Express 2007 will also be included in the Small Business, Professional and Office Ultimate versions of Microsoft Office 2007.

Microsoft recommends that small businesses "with more complex needs such as inventory management, multicurrency invoicing, multiuser access and fixed asset management," use Office Accounting Professional 2007, which is set for release in early 2007 for $149.

Sunday, October 29, 2006

For Penguin Fans. Happy Feet.

Friday, October 20, 2006

Taking Linux for a Spin.

I think that we are going to see one very serious trend here: Microsoft users giving the easy-to-use distros a serious look. While some of them may cost money, there are still plenty of cool free options, too.

Microsoft's recent announcement that the long-anticipated new version of its Windows operating system, Vista, has been delayed into January 2007 leaves several questions hanging. Will Vista offer enough benefits to make it worth the cost of upgrading? Will those who hang on to the current version of Windows end up locked out of new software and peripherals? A recent report by research firm Gartner said that as many as half of all PCs will not be able to run many of Vista's most sophisticated features. Given that, is upgrading even a smart option?

But if the confusion over Windows is deepening, the opposite is true of Linux. Linux, of course, is the alternative to Windows that comes from the world of open-source software--meaning no company owns it, it's available for free, and it boasts a worldwide network of programmers constantly trying to perfect it. For the most part, Linux has been used by geeks (like me) who enjoy rolling up their sleeves and getting under the hood of their software. Even the mention of Linux is enough to pull most nontechie managers out of their comfort zone. But, on the flip side of that, it gives others wood.

But the notion that Linux is a complicated, alien, experts-only operating system is no longer true. Not only is Linux becoming a mainstream product that can be considered a reasonable alternative to Windows for just about anyone, it's actually easy for nontechies to install. Linux looks pretty much like Windows these days, so you won't face a steep learning curve in putting it to work. And you'll probably never have to worry about a big-bang upgrade to a radically new version, because Linux gets updated routinely every six months or so, incrementally, at little or no cost.

To see which Linux distribution is right for you, check out the Linux Distribution Chooser at: http://www.zegeniestudios.net/ldc/

Saturday, October 14, 2006

Windows, Linux, Cars and Lego's (Part 5 of 5)

Problem #6: Imitation vs. Convergence

An argument people often make when they find that Linux isn't the Windows clone they wanted is to insist that this is what Linux has been (or should have been) attempting to be since it was created, and that people who don't recognize this and help to make Linux more Windows-like are in the wrong. They draw on many arguments for this:

Linux has gone from Command-Line- to Graphics-based interfaces, a clear attempt to copy Windows

Nice theory, but false: The original X windowing system was released in 1984, as the successor to the W windowing system ported to Unix in 1983. Windows 1.0 was released in 1985. Windows didn't really make it big until version 3, released in 1990 - by which time, X windows had for years been at the X11 stage we use today. Linux itself was only started in 1991. So Linux didn't create a GUI to copy Windows: It simply made use of a GUI that existed long before Windows.

Windows 3 gave way to Windows 95 - making a huge level of changes to the UI that Microsoft has never equaled since. It had many new & innovative features: Drag & drop functionality; taskbars, and so on. All of which have since been copied by Linux, of course.

Actually. . . no. All the above existed prior to Microsoft making use of them. NeXTSTEP in particular was a hugely advanced (for the time) GUI, and it predated Win95 significantly - version 1 was released in 1989, and the final version in 1995.

Okay, okay, so Microsoft didn't think up the individual features that we think of as the Windows Look-and-Feel. But it still created a Look-and-Feel, and Linux has been trying to imitate that ever since.

To debunk this, one must discuss the concept of convergent evolution. This is where two completely different and independent systems evolve over time to become very similar. It happens all the time in biology. For example, sharks and dolphins. Both are (typically) fish-eating marine organisms of about the same size. Both have dorsal fins, pectoral fins, tail fins, and similar, streamlined shapes.

However, sharks evolved from fish, while dolphins evolved from a land-based quadrupedal mammal of some sort. The reason they have very similar overall appearances is that they both evolved to be as efficient as possible at living within a marine environment. At no stage did pre-dolphins (the relative newcomers) look at sharks and think "Wow, look at those fins. They work really well. I'll try and evolve some myself!"

Similarly, it's perfectly true to look at early Linux desktops and see FVWM and TWM and a lot of other simplistic GUIs. And then look at modern Linux desktops, and see Gnome & KDE with their taskbars and menus and eye-candy. And yes, it's true to say that they're a lot more like Windows than they used to be.

But then, so is Windows: Windows 3.0 had no taskbar that I remember. And the Start menu? What Start menu?

Linux didn't have a desktop anything like modern Windows. Microsoft didn't either. Now they both do. What does this tell us?

It tells us that developers in both camps looked for ways of improving the GUI, and because there are only a limited number of solutions to a problem, they often used very similar methods. Similarity does not in any way prove or imply imitation. Remembering that will help you avoid straying into problem #6 territory.

Problem #7: That FOSS (Free Open Source Software) thing.

Oh, this causes problems. Not intrinsically: The software being free and open-source is a wonderful and immensely important part of the whole thing. But understanding just how different FOSS is from proprietary software can be too big an adjustment for some people to make.

I've already mentioned some instances of this: People thinking they can demand technical support and the like. But it goes far beyond that.

Microsoft's Mission Statement is "A computer on every desktop" - with the unspoken rider that each computer should be running Windows. Microsoft and Apple both sell operating systems, and both do their utmost to make sure their products get used by the largest number of people: They're businesses, out to make money.

And then there is FOSS. Which, even today, is almost entirely non-commercial.

Before you reach for your email client to tell me about Red Hat, Suse, Linspire and all: Yes, I know they "sell" Linux. I know they'd all love Linux to be adopted universally, especially their own flavor of it. But don't confuse the suppliers with the manufacturers. The Linux kernel was not created by a company, and is not maintained by people out to make a profit with it. The GNU tools were not created by a company, and are not maintained by people out to make a profit with them. The X11 windowing system. . . well, the most popular implementation is xorg right now, and the ".org" part should tell you all you need to know.

Desktop software: Well, you might be able to make a case for KDE being commercial, since it's Qt-based. But Gnome, Fluxbox, Enlightenment, etc. are all non-profit. There are people out to sell Linux, but they are very much the minority.

Increasing the number of end-users of proprietary software leads to a direct financial benefit to the company that makes it. This is simply not the case for FOSS: There is no direct benefit to any FOSS developer in increasing the userbase. Indirect benefits, yes: Personal pride; an increased potential for finding bugs; more likelihood of attracting new developers; possibly a chance of a good job offer; and so on.

But Linus (Linux) Torvalds doesn't make money from increased Linux usage. Richard Stallman doesn't get money from increased GNU usage. All those servers running OpenBSD and OpenSSH don't put a penny into the OpenBSD project's pockets. And so we come to the biggest problem of all when it comes to new users and Linux:

They find out they're not wanted.

New users come to Linux after spending their lives using an OS where the end-user's needs are paramount, and "user friendly" and "customer focus" are considered veritable Holy Grails. And they suddenly find themselves using an OS that still relies on 'man' files, the command-line, hand-edited configuration files, and Google. And when they complain, they don't get coddled or promised better things: They get bluntly shown the door.

That's an exaggeration, of course. But it is how a lot of potential Linux converts perceived things when they tried and failed to make the switch.

In an odd way, FOSS is actually a very selfish development method: People only work on what they want to work on, when they want to work on it. Most people don't see any need to make Linux more attractive to inexperienced end-users: It already does what they want it to do, why should they care if it doesn't work for other people?

FOSS has many parallels with the Internet itself: You don't pay the writer of a webpage/the software to download and read/install it. Ubiquitous broadband/User-friendly interfaces are of no great interest to somebody who already has broadband/knows how to use the software. Bloggers/developers don't need to have lots of readers/users to justify blogging/coding. There are lots of people making lots of money off it, but it's not by the old-fashioned "I own this and you have to pay me if you want some of it" method that most businesses are so enamored of; it's by providing services like tech-support/e-commerce.

Linux is not interested in market share. Linux does not have customers. Linux does not have shareholders, or a responsibility to the bottom line. Linux was not created to make money. Linux does not have the goal of being the most popular and widespread OS on the planet.

All the Linux community wants is to create a really good, fully-featured, free operating system. If that results in Linux becoming a hugely popular OS, then that's great. If that results in Linux having the most intuitive, user-friendly interface ever created, then that's great. If that results in Linux becoming the basis of a multi-billion dollar industry, then that's great.

It's great, but it's not the point. The point is to make Linux the best OS that the community is capable of making. Not for other people: For itself. The oh-so-common threats of "Linux will never take over the desktop unless it does such-and-such" are simply irrelevant: The Linux community isn't trying to take over the desktop. They really don't care if it gets good enough to make it onto your desktop, so long as it stays good enough to remain on theirs. The highly-vocal MS-haters, pro-Linux zealots, and money-making FOSS purveyors might be loud, but they're still minorities.

That's what the Linux community wants: an OS that can be installed by whoever really wants it. So if you're considering switching to Linux, first ask yourself what you really want.

If you want an OS that doesn't chauffeur you around, but hands you the keys, puts you in the driver's seat, and expects you to know what to do: Get Linux. You'll have to devote some time to learning how to use it, but once you've done so, you'll have an OS that you can make sit up and dance.

If you really just want Windows without the malware and security issues: Read up on good security practices (CISSP); install a good firewall, IDS, IPS, malware-detector, and anti-virus; replace IE with a more secure browser like Firefox; and keep yourself up-to-date with security updates. There are people out there (myself included) who've used Windows since the 3.1 days right through to XP without ever being infected with a virus or malware (although I have experienced system crashes): you can do it too. Don't get Linux: It will fail miserably at being what you want it to be.

If you really want the security and performance of a Unix-based OS but with a customer-focused attitude and a world-renowned interface: Buy an Apple Mac. OS X is great. But don't get Linux: It will not do what you want it to do.

It's not just about "Why should I want Linux?". It's also about "Why should Linux want me?"

Tuesday, October 10, 2006

Windows, Linux, Cars and Lego's (Part 4 of 5)

Problem #4: Designed for the designer

In the car industry, you'll very rarely find that the person who designed the engine also designed the car interior: It calls for totally different skills. Nobody wants an engine that only looks like it can go fast, and nobody wants an interior that works superbly but is cramped and ugly. And in the same way, in the software industry, the user interface (UI) is not usually created by the people who wrote the software.

In the Linux world, however, this is not so much the case: Projects frequently start out as someone's toy. He or she does everything himself or herself, and therefore the interface has no need of any kind of "user friendly" features: The user knows everything there is to know about the software, so he or she doesn't need help. Vi (MS-Word like) is a good example of software deliberately created for a user who already knows how it works: It's not unheard of for new users to reboot their computers because they couldn't figure out how else to get out of vi.
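
For the record, getting out of vi takes only a couple of keystrokes once you know them - this is standard vi behaviour on any system:

Esc        (return to command mode)
:q!        (quit without saving changes)
:wq        (or: save the file and quit)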

However, there is an important difference between a Free & Open-Source Software (FOSS) programmer and most commercial software writers: The software a FOSS programmer creates is software that he or she intends to use. So while the end result might not be as 'comfortable' for the novice user, they can draw some comfort from knowing that the software is designed by somebody who knows what the end-user's needs are: He too is an end-user. This is very different from commercial software writers, who are making software for other people to use: They are not knowledgeable end-users.

So while vi has an interface that is hideously unfriendly to new users, it is still in use today because it is such a superb interface once you know how it works. Firefox (IE like) was created by people who regularly browse the Web. The Gimp (Photoshop like) was built by people who use it to manipulate graphics files. And so on.

So Linux interfaces are frequently a bit of a minefield for the novice: Despite its popularity, vi should never be considered by a new user who just wants to quickly make a few changes to a file - use OpenOffice Writer (MS Word like) for that instead. And if you're using software early in its lifecycle, a polished, user-friendly interface is something you're likely to find only in the "To Do" list: Functionality comes first. Nobody designs a killer interface and then tries to add functionality bit by bit. They create functionality, and then improve the interface bit by bit.

So to avoid #4 issues: Look for software that's specifically aimed at being easy for new users to use, or accept that some software has a steeper learning curve than you're used to. To complain that vi isn't friendly enough for new users is to be laughed at for missing the point.

Problem #5: The myth of "user-friendly"

This is a big one. It's a very big term in the computing world, "user-friendly". It's even the name of a particularly good webcomic. But it's a bad term.

The basic concept is good: That software be designed with the needs of the user in mind. But it's always addressed as a single concept, which it isn't. If you spend your entire life processing text files, your ideal software will be fast and powerful, enabling you to do the maximum amount of work for the minimum amount of effort. Simple keyboard shortcuts and mouseless operation will be of vital importance.

But if you very rarely edit text files, and you just want to write an occasional letter, the last thing you want is to struggle with learning keyboard shortcuts. Well-organized menus and clear icons in tool bars will be your ideal.

Clearly, software designed around the needs of the first user will not be suitable for the second, and vice versa. So how can any software be called "user-friendly", if we all have different needs?

The simple answer: User-friendly is a misnomer (much as animal crackers are not crackers but cookies), and one that makes a complex situation seem simple.

What does "user-friendly" really mean? Well, in the context in which it is used, "user friendly" software means "Software that can be used to a reasonable level of competence by a user with no previous experience of the software." This has the unfortunate effect of making lousy-but-familiar interfaces fall into the category of "user-friendly".

Subproblem #5a: Familiar is friendly

So it is that in most "user-friendly" text editors & word processors, you Cut and Paste by using Ctrl-X and Ctrl-V. Totally unintuitive, but everybody's used to these combinations, so they count as a "friendly" combination.

So when somebody comes to vi and finds that it's "d" to cut, and "p" to paste, it's not considered friendly: It's not what anybody is used to.

Is it superior? Well, actually, yes.

With the Ctrl-X approach, how do you cut a word from the document you're currently in? (No using the mouse!) From the start of the word, Ctrl-Shift-Right to select the word. Then Ctrl-X to cut it.

The vi approach? dw deletes the word.

How about cutting five words with a Ctrl-X application? From the start of the words,
Ctrl-Shift-Right
Ctrl-Shift-Right
Ctrl-Shift-Right
Ctrl-Shift-Right
Ctrl-Shift-Right
Ctrl-X

And with vi?

d5w

The vi approach is far more versatile and actually more intuitive: "X" and "V" are not obvious or memorable "Cut" and "Paste" commands, whereas "dw" to delete a word, and "p" to put it back, is perfectly straightforward. But "X" and "V" are what we all know, so whilst vi is clearly superior, it's unfamiliar. Ergo, it is considered unfriendly. Pure familiarity, and nothing else, makes a Windows-like interface seem friendly. And as we learned in problem #1, Linux is necessarily different from Windows. Inescapably, Linux will always appear less "user-friendly" than Windows.

To avoid #5a problems, all you can really do is try and remember that "user-friendly" doesn't mean "What I'm used to": Try doing things your usual way, and if it doesn't work, try and work out what a total novice would do.

Subproblem #5b: Inefficient is friendly

This is a sad but inescapable fact. Paradoxically, the harder you make it to access an application's functionality, the friendlier it can seem to be.

This is because friendliness is added to an interface by using simple, visible 'clues' - the more, the better. After all, if a complete novice to computers is put in front of a WYSIWYG word processor and asked to make a bit of text bold, which is more likely:

* He'll guess that "Ctrl-B" is the usual standard

* He'll look for clues, and try clicking on the "Edit" menu.

Unsuccessful, he'll try the next likely one along the row of menus: "Format". The new menu has a "Font" option, which seems promising. And Hey! There's our "Bold" option. Success!

Next time you do any word processing, try doing every job via the menus: No shortcut keys, and no toolbar icons. Menus all the way. You'll find you slow to a crawl, as every task suddenly demands a multitude of keystrokes/mouse clicks.

Making software "user-friendly" in this fashion is like putting training wheels on a bicycle: It lets you get up & running immediately, without any skill or experience needed. It's perfect for a beginner. But nobody out there thinks that all bicycles should be sold with training wheels: If you were given such a bicycle today, I'll bet the first thing you'd do is remove them for being unnecessary encumbrances: Once you know how to ride a bike, training wheels are unnecessary.

And in the same way, a great deal of Linux software is designed without "training wheels" - it's designed for users who already have some basic skills in place. After all, nobody's a permanent novice: Ignorance is short-lived, and knowledge is forever. So the software is designed with the majority in mind.

This might seem an excuse: After all, MS Word has all the friendly menus, and it has toolbar buttons, and it has shortcut keys. . . Best of all worlds, surely? Friendly and efficient.

However, this has to be put into perspective: Firstly, the practicalities: having menus and toolbars and shortcuts and all would mean a lot of coding, and it's not like Linux developers all get paid for their time. Secondly, it still doesn't really take into account serious power-users: Very few professional wordsmiths use MS Word. Ever meet a coder who used MS Word? Compare that to how many use emacs & vi.

Why is this? Firstly, because some "friendly" behavior rules out efficient behavior: See the "Cut & Copy" example above. And secondly, because most of Word's functionality is buried in menus that you have to use: Only the most common functionality has those handy little buttons in toolbars at the top. The less-used functions that are still vital for serious users just take too long to access.

Something to bear in mind, however, is that "training wheels" are often available as "optional extras" for Linux software: They might not be obvious, but frequently they're available.

Take mplayer. You use it to play a video file by typing mplayer filename in a terminal. You fast forward & rewind using the arrow keys and the Page Up & Page Down keys. This is not overly "user-friendly". However, if you instead type gmplayer filename, you'll get the graphical front end, with all its nice, friendly, familiar buttons.
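
To make that concrete (video.avi is just a placeholder file name):

mplayer video.avi      # command-line playback; arrow keys and PgUp/PgDn to seek
gmplayer video.avi     # the same player wrapped in its graphical front end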

Take ripping a CD to MP3 (or Ogg): Using the command-line, you need to use cdparanoia to rip the audio tracks from the disc to files. Then you need an encoder. . . It's a hassle, even if you know exactly how to use the packages (imho). So download & install something like Grip. This is an easy-to-use graphical front end that uses cdparanoia and encoders behind-the-scenes to make it really easy to rip CDs, and even has CDDB support to name the files automatically for you.
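
For comparison, here is a rough sketch of the manual route that Grip automates; it assumes cdparanoia and the oggenc (or lame) encoders are installed, and the exact output file names may differ on your system:

cdparanoia -B                        # rip every track to track01.cdda.wav, track02.cdda.wav, ...
oggenc *.wav                         # encode each WAV file to Ogg Vorbis
lame track01.cdda.wav track01.mp3    # or encode to MP3, one track at a time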

The same goes for ripping DVDs: The number of options to pass to transcode is a bit of a nightmare. But using dvd::rip to talk to transcode for you makes the whole thing a simple, GUI-based process which anybody can do.

So to avoid #5b issues: Remember that "training wheels" tend to be screwed-on extras in Linux, rather than being automatically supplied with the main product. And sometimes, "training wheels" just can't be part of the design.

Wednesday, October 04, 2006

Windows, Linux, Cars and Lego's (Part 3 of 5)

Problem #3: Culture shock
Subproblem #3a: There is a culture

Windows users are more or less in a customer-supplier relationship: They pay for software, for warranties, for support, and so on. They expect software to have a certain level of usability. They are therefore used to having rights with their software: They have paid for technical support and have every right to demand that they receive it. They are also used to dealing with entities rather than people: Their contracts are with a company, not with a person.

Linux users are in more of a community. They don't have to buy the software, and they don't have to pay for technical support (although fee-based options exist for high-end technical support). They download software for free and use instant messaging (IRC) and web-based forums to get help. They deal with people, not corporations.

A Windows user will not endear himself or herself by bringing his habitual attitudes over to Linux, to put it mildly.

The biggest cause of friction tends to be in the online interactions: A "3a" user new to Linux asks for help with a problem he or she is having. When he doesn't get that help at what he considers an acceptable rate, he starts complaining and demanding more help. Because that's what he's used to doing with paid-for technical support. The problem is that this isn't paid-for support (unless you choose to purchase Linux support from a third party). This is a bunch of volunteers who are willing to help people with problems out of the goodness of their hearts. The new user has no right to demand anything from them, any more than somebody collecting for charity can demand larger donations from contributors.

In much the same way, a Windows user is used to using commercial software. Companies don't release software until it's reliable, functional, and user-friendly enough. So this is what a Windows user tends to expect from software: It starts at version 1.0. Linux software, however, tends to get released almost as soon as it's written: It starts at version 0.1. This way, people who really need the functionality can get it ASAP; interested developers can get involved in helping improve the code; and the community as a whole stays aware of what's going on.

If a "3a" user runs into trouble with Linux, he or she will complain: The software hasn't met his or her standards, and he thinks he has a right to expect that standard. His mood won't be improved when he gets sarcastic replies like "I'd demand a refund if I were you"

So, to avoid problem #3a: Simply remember that you haven't paid the developer who wrote the software or the people online who provide the tech support. They don't owe you anything - unless you decide to purchase Linux technical support from an outside source.

Subproblem #3b: New vs. Old

Linux pretty much started out life as a hacker's hobby. It grew as it attracted more hobbyist hackers. It was quite some time before anybody but a geek stood a chance of getting a usable Linux installation working easily. Linux started out "By us geeks, for us geeks." And even today, the majority of established Linux users are self-confessed geeks; the super-geeks of today are like the rock and roll stars of the '80s (but with much more money), and yes, they're also in the forums to help.

And that's a beautiful thing: If you've got a problem with hardware or software, having a large number of geeks (and sometimes the super-geeks) available to work on the solution is a definite plus.

But Linux has grown up quite a bit since its early days. There are distributions (distros) that almost anybody can install, even distros that live on CDs and detect all your hardware for you without any intervention. It has become attractive to non-hobbyist users who are interested in it simply because it's free of viruses, malware, spyware and badware, and it's also cheap to upgrade. It's not uncommon for there to be friction between the two camps. It's important to bear in mind, however, that there's no real malice on either side: It's lack of understanding that causes the problems.

Firstly, you get the hard-core geeks who still assume that everybody using Linux is a fellow geek. This means they expect a high level of knowledge, and often leads to accusations of arrogance, elitism, and rudeness. And in truth, sometimes that's what it is. But quite often, it's not: It's elitist to say "Everybody ought to know this". It's not elitist to say "Everybody knows this" - quite the opposite.

Secondly, you get the new users who're trying to make the switch after a lifetime of using commercial operating systems. These users are used to software that anybody can sit down & use, out-of-the-box.

The issues arise because group 1 is made up of people who enjoy being able to tear their operating system apart and rebuild it the way they want, while group 2 tends to be indifferent to the way the operating system works, so long as it does work.

A parallel situation that illustrates the problem is Lego's. Picture the following:

New: I wanted a new toy car, and everybody's raving about how great Lego's cars can be. So I bought some Lego's, but when I got home, I just had a load of bricks and cogs and stuff in the box. Where's my car??

Old: You have to build the car out of the bricks. That's the whole point of Lego's.

New: What?? I don't know how to build a car. I'm not a mechanic. How am I supposed to know how to put it all together??

Old: There's a leaflet that came in the box. It tells you exactly how to put the bricks together to get a toy car. You don't need to know how, you just need to follow the instructions.

New: Okay, I found the instructions. It's going to take me hours! Why can't they just sell it as a toy car, instead of making you have to build it??

Old: Because not everybody wants to make a toy car with Lego. It can be made into anything we like. That's the whole point.

New: I still don't see why they can't supply it as a car so people who want a car have got one, and other people can take it apart if they want to. Anyway, I finally got it put together, but some bits come off occasionally. What do I do about this? Can I glue it?
Old: It's Lego's. It's designed to come apart. That's the whole point.

New: But I don't want it to come apart. I just want a toy car!
Old: Then why on Earth did you buy a box of Lego's??

It's clear to just about anybody that Lego's is not really aimed at people who just want a toy car. You don't get conversations like the above in real life. The whole point of Lego's is that you have fun building it and you can make anything you like with it. If you've no interest in building anything, Lego's are not for you. This is quite obvious.

As far as the long-time Linux user is concerned, the same holds true for Linux: It's an open-source, fully-customizable set of software. That's the whole point. If you don't want to hack the components a bit, why bother to use it?

But there's been a lot of effort lately to make Linux more suitable for the non-hackers, a situation that's not a million miles away from selling pre-assembled Lego kits, in order to make it appeal to a wider audience. Hence you get conversations that aren't far away from the ones above: Newcomers complain about the existence of what the established users consider to be fundamental features, and resent having to read a manual to get something working. But complaining that there are too many distros, or that software has too many configuration options, or that it doesn't work perfectly out-of-the-box, is like complaining that Lego's can be made into too many models, and not liking the fact that it can be broken down into bricks and built into many other things.

So, to avoid problem #3b: Just remember that what Linux seems to be now is not what Linux was in the past. The largest and most necessary part of the Linux community, the geeks, hackers and the developers, like Linux because they can fit it together the way they like; they don't like it in spite of having to do all the assembly before they can use it.

Thursday, September 28, 2006

Windows, Linux, Cars and Lego's (Part 2 of 5)

Problem #2: Linux is too different from Windows

This next issue arises when people do expect Linux to be different, but find that some differences are just too radical for their liking. Probably the biggest example of this is the sheer amount of choice available to Linux users. Whereas an out-of-the-box-Windows user has the Classic or XP desktop with Wordpad, Internet Explorer, and Outlook Express installed, an out-of-the-box-Linux user has hundreds of distributions to choose from, then Gnome or KDE or Fluxbox or whatever, with vi or emacs or kate, Konqueror or Opera or Firefox or Mozilla, and so on and so forth.

A Windows user isn't used to making so many choices just to get up & running. The exasperated question "Does there have to be so much choice?" is very common.

Does Linux really have to be so different from Windows? After all, they're both operating systems. They both do the same job: Power your computer & give you something to run applications on. Surely they should be more or less identical?

Look at it this way: Step outside and take a look at all the different vehicles driving along the road. These are all vehicles designed with more or less the same purpose: To get you from A to B via the roads. Note the variety in designs.

But, you may be thinking, car differences are really quite minor: they all have steering wheels, gas pedals, a manual or automatic transmission, brakes, windows & doors, a gas tank. . . If you can drive one car, you can drive any car - although, to some, manual transmissions are a bear.

Quite true. But did you not notice that some people weren't driving cars, but were riding motorcycles (yeah baby) instead. . ?

Switching from one version of Windows to another is like switching from one car to another. Win95 to Win98 to WinMe (which sucked), I honestly couldn't tell the difference. Win98 to WinXP, now that was a bigger change, but still nothing major.

But switching from Windows to Linux is like switching from a car to a motorcycle. They may both be operating systems (OSes)/road vehicles. They may both use the same hardware/roads. They may both provide an environment for you to run applications/transport you from A to B. But they use fundamentally different approaches to do so.

Windows/cars are not safe from viruses/theft unless you install an antivirus/lock the doors. Linux/motorcycles don't have viruses/doors, so they are perfectly safe without you having to install an antivirus/lock any doors.

Or look at it the other way round:

Linux/cars were designed from the ground up for multiple users/passengers. Windows/motorcycles were designed for one user/passenger. Every Windows user/motorcycle driver (Biker) is used to being in full control of his computer/vehicle at all times. A Linux user/car passenger is used to only being in control of his computer/vehicle when logged in as root/sitting in the driver's seat.

Two different approaches to fulfilling the same goal. They differ in fundamental ways. They have different strengths and weaknesses: A car is the clear winner at transporting a family & a lot of cargo from A to B: More seats & more storage space. A motorcycle is the clear winner at getting one person from A to B: Less affected by congestion and uses less gas.

There are many things that don't change when you switch between cars and motorcycles: You still have to put gas in the tank, you still have to drive on the same roads, you still have to obey the traffic lights and stop signs, you still have to indicate before turning, you still have to obey the same speed limits.

But there are also many things that do change: Car drivers don't have to wear helmets; motorcycle drivers (Bikers) don't have to put on a seatbelt. Car drivers have to turn the steering wheel to get around a corner; Bikers have to lean over. Car drivers accelerate by pushing the gas pedal; Bikers accelerate by twisting a hand control (the throttle).

A Biker who tries to corner a car by leaning over is going to run into problems very quickly. And Windows users who try to use their existing skills and habits generally also find themselves having many issues. In fact, Windows "Power Users" frequently have more problems with Linux than people with little or no computer experience at all, for this very reason. Typically, the most vehement "Linux is not ready for the desktop yet" arguments come from ingrained Windows users who reason that if they couldn't make the switch, a less-experienced user has no chance. But this is the exact opposite of the truth.

So, to avoid problem #2: Don't assume that being a knowledgeable Windows user means you're a knowledgeable Linux user: When you first start with Linux, you are a novice.

Thursday, September 21, 2006

Windows, Linux, Cars and Lego's (Part 1 of 5)

New Linux users often run into problems making the switch from Windows to Linux, and many individual issues arise from one underlying cause.

Problem #1: Linux isn't exactly the same as Windows.

You'd be amazed how many people make this complaint. They come to Linux, expecting to find essentially a free, open-source version of Windows. Quite often, this is what they've been told to expect by over-zealous Linux users. However, it's a paradoxical hope.

The specific reasons why people try Linux vary wildly, but the overall reason boils down to one thing: They hope Linux will be better than Windows. Common yardsticks for measuring success are cost, choice, performance, and security. There are many others. But every Windows user who tries Linux does so because they hope it will be better than what they've got.

Herein lies the problem: it is logically impossible for anything to be better than another thing while remaining completely identical to it. A perfect copy may be equal, but it can never surpass. So when you give/gave Linux a try in hopes that it would be better, you were inescapably hoping that it would be different. Too many people ignore this fact, and hold up every difference between the two operating systems (OSes) as a Linux failure.

As a simple example, consider driver upgrades: one typically upgrades a hardware driver on Windows by going to the manufacturer's website and downloading the new driver; whereas in Linux you upgrade the kernel.

This means that a single Linux download & upgrade will give you the newest drivers available for your machine, whereas in Windows you would have to surf to a few sites and download each upgrade individually. It's a very different process, but it's certainly not a bad one. Yet many people complain because it's not what they're used to.
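
As a sketch of how that single upgrade typically looks on a Debian- or Ubuntu-style system (other distributions use their own package tools, such as yum or YaST):

sudo apt-get update          # refresh the package lists
sudo apt-get dist-upgrade    # pulls in the newer kernel package, drivers included
# reboot afterwards to start running the new kernel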

Or, as an example you're more likely to relate to, consider Firefox: One of the biggest open-source success stories. A web browser that took the world by storm. Did it achieve this success by being a perfect imitation of IE, the then-most-popular browser?

No. It was successful because it was better than IE, and it was better because it was different. It had tabbed browsing, live bookmarks, built-in search bar, PNG support, ad block extensions, and other wonderful things. The "Find" functionality appeared in a toolbar at the bottom and looked for matches as you typed, turning red when you had no match. IE had no tabs, no RSS (news feeds) functionality, search bars only via third-party extensions, and a find dialogue that required a click on "OK" to start looking and a click on "OK" to clear the "Not found" error message.

A clear and inarguable demonstration of an open-source application achieving success by being better, and being better by being different. Had Firefox been an IE clone, it would have vanished into obscurity. And had Linux been a Windows clone, the same would have happened.

So the solution to problem #1: Remember that where Linux is familiar and the same as what you're used to, it isn't new & improved. Welcome the places where things are different, because only there does it have a chance to shine. (And you get to shine too, by trying something new.)

Monday, September 11, 2006

Linux Kernel Delayed By Microsoft's Army of Evil Monkeys

Around the World - Linus Torvalds announced yesterday that the Linux Kernel will be delayed. He blamed the delay on interference from Microsoft's Army of Evil Monkeys. The army has been disrupting the lives of key Linux programmers, and in some cases destroying portions of code. Torvalds himself has been a victim of several Evil Monkey attacks.

Steve Ballmer denied any involvement by Microsoft in the matter. "We did receive the Army of Evil Monkeys when we purchased evil from Satan, but those monkeys are only temporary employees and are not actual employees of Microsoft. Whatever they do on their own time is their business."

The Department of Justice said, "If this story is accurate, then this is just one more example of how Microsoft is using its monopoly power to stifle competition."

"It has been horrible, horrible," said Linux programmer Andy (last name withheld)., "the evil monkeys were everywhere. Trashing my computer, having monkey business in my bed. I lost several days worth of work."

Internal Microsoft e-mails obtained tell of a secret monkey training camp where the monkeys are trained to seek out and harass Linux programmers. The only comments from Bill Gates have been, "Fly my pretty, fly!"

Saturday, September 09, 2006

Software: It's a Gas

Nathan Myhrvold, the former CTO of Microsoft, is also a bona-fide physicist. He holds physics degrees from UCLA and Princeton. He even had a postdoctoral fellowship under the famous Stephen Hawking. Thus, as you might expect, his 1997 ACM keynote presentation, The Next Fifty Years of Software, is full of physics and science metaphors.

It starts with Nathan's four Laws of Software:

1. Software is a gas
Software always expands to fit whatever container it is stored in.

2. Software grows until it becomes limited by Moore's Law
The initial growth of software is rapid, like gas expanding, but is inevitably limited by the rate of increase in hardware speed.

3. Software growth makes Moore's Law possible
People buy new hardware because the software requires it.

4. Software is only limited by human ambition and expectation
We'll always find new algorithms, new applications, and new users.

Myhrvold goes on to describe software development as a state of Perpetual Crisis. The size and complexity of software is constantly rising, with no limit in sight. As we develop more advanced software-- and as we develop solutions to manage the ever-increasing complexity of this software-- the benefits of the new software are absorbed by the rising tide of customer expectations.

Software development will never be easy; new software always has to push against the current complexity boundary if it wants to be commercially successful.

This was all written in 1997. Nearly ten years later, are his points still valid? Software is certainly still a gas. Now that we're entering the multi-core era, there is one crucial difference. Historically hardware has gotten more complex because of limitations in the ability of software to scale; now the software needs to get more complex because of limitations in the ability of hardware to scale. The burden of scaling now falls on the software.

Myhrvold then makes an interesting point about the amount of storage required to capture human diversity. If..

* the human Genome is approximately 1 gigabyte of data
* the individual difference between any two humans is 0.25% of their Genome
* we assume a lossless compression rate of 2:1

The individually unique part of the human Genome can be stored in ~1.2 megabytes. Thus, you fit on a 3.5" floppy disk.
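
To spell out the arithmetic: 0.25% of 1 gigabyte is about 2.5 megabytes, and 2:1 lossless compression halves that to roughly 1.25 megabytes - comfortably within the 1.44 megabytes a 3.5" floppy holds.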

In fact, the entirety of human genetic diversity for every living human being could be stored in a 3.7 terabyte drive array. And the entire genetic diversity of every living thing on earth could be stored in roughly the size of the internet circa 2001.

I'm not sure what that means, exactly, but I love the idea that I can fit myself on a 3.5" floppy disk.

Saturday, September 02, 2006

Star Trek Next Generation Meets Microsoft

Picard:
Mr. LaForge, have you had any success with your attempts at finding a weakness in the Borg? And Mr. Data, have you been able to access their command pathways?

Geordi:
Yes, Captain. In fact, we found the answer by searching through our archives on late Twentieth-century computing technology.

Geordi presses a key, and a logo appears on the computer screen.

Riker [puzzled]
What the hell is Microsoft?

Data [turns to explain]
Allow me to explain. We will send this program, for some reason called Windows, through the Borg command pathways. Once inside their root command unit, it will begin consuming system resources at an unstoppable rate.

Picard:
But the Borg have the ability to adapt. Won't they alter their processing systems to increase their storage capacity?

Data:
Yes, Captain. But when Windows detects this, it creates a new version of itself known as an upgrade. The use of resources increases exponentially with each iteration. The Borg will not be able to adapt quickly enough. Eventually all of their processing ability will be taken over and none will be available for their normal operational functions.

Picard:
Excellent work. This is even better than that unsolvable geometric shape idea.

. . . . 120 Minutes Later . . . .

Data:
Captain, we have successfully installed the Windows in the Borg's command unit. As expected, it immediately consumed 85% of all available resources. However, we have not received any confirmation of the expected upgrade.

Geordi:
Our scanners have picked up an increase in Borg storage and CPU capacity, but we still have no indication of an upgrade to compensate for their increase.

Picard:
Data, scan the history banks again and determine if there is something we have missed.

Data:
Sir, I believe there is a reason for the failure in the upgrade. Apparently the Borg have circumvented that part of the plan by not sending in their registration cards.

Riker:
Captain, we have no choice. Requesting permission to begin emergency escape sequence 3F!

Geordi: [excited]
Wait, Captain! Their CPU capacity has suddenly dropped to 0% !

Picard:
Data, what do your scanners show?

Data: [studying displays]
Apparently the Borg have found the internal Windows module named Solitaire, and it has used up all available CPU capacity.

Picard:
Let's wait and see how long this Solitaire can reduce their functionality.

. . . . Two Hours Pass . . .

Riker:
Geordi, what is the status of the Borg?

Geordi:
As expected, the Borg are attempting to re-engineer to compensate for increased CPU and storage demands, but each time they successfully increase resources I have setup our closest deep space monitor beacon to transmit more Windows modules from something called the Microsoft Fun-Pack.

Picard:
How much time will that buy us?

Data:
Current Borg solution rates allow me to predict an interest time span of 6 more hours.

Geordi:
Captain, another vessel has entered our sector.

Picard:
Identify.

Data:
It appears to have markings very similar to the Microsoft logo...

[over the speakers]

This is Admiral Bill Gates of the Microsoft flagship MONOPOLY. We have positive confirmation of unregistered software in this sector. Surrender all assets and we can avoid any trouble. You have 10 seconds to comply.

Data:
The alien ship has just opened its forward hatches and released thousands of humanoid-shaped objects.

Picard:
Magnify forward viewer on the alien craft!

Riker:
My God, captain! Those are human beings floating straight toward the Borg ship - with no life support suits! How can they survive the tortures of deep space?!

Data:
I don't believe that those are humans, sir. If you will look closer, I believe you will see that they are carrying something recognized by twenty-first century man as doeskin leather briefcases, and wearing Armani suits.

Riker and Picard, together [horrified]
Lawyers!!

Geordi:
It can't be. All the Lawyers were rounded up and sent hurtling into the sun in 2017 during the Great Awakening.

Data:
True, but apparently some must have survived.

Riker:
They have surrounded the Borg ship and are covering it with all types of papers.

Data:
I believe that is known in ancient vernacular as red tape. It often proves fatal.

Riker:
They're tearing the Borg to pieces!

Picard:
Turn the monitors off, Data. I can't bear to watch. Even the Borg don't deserve such a gruesome death!

Thursday, August 24, 2006

Your Own Apache Web Server.

Having your own Web server goes beyond the need to put your business' information out on the Internet for all to see. While that certainly won't hurt, there are many more ways you can take advantage of such a server.

Whether you are running a department within a large corporation, or your own small business, having access to an HTTP server can quickly improve the way your employees share knowledge.

Groups within an organization of any size generally need to share a great deal of information, even though they may be working toward different goals. One way to accomplish this is to provide groups and individuals with access to an intranet Web server and allow them to publish their own facts and figures. Everyone could share one web server, and have a common user interface to access each others' work. If more space or segregation of information is needed, individual groups could set up their own web server using surplus hardware and Linux.

In another situation, you--the content provider--know exactly what you want, but are unable to find just the right package of features at the right cost. Many ISPs charge extra for even basic logging information. Given the tight margins that all businesses have to operate within these days, occasionally the right move is to set up your own host hardware.

Another reason to set up a Web server is that some Internet-based applications may require performance that only a dedicated server can offer--tuned to your special needs.

It may seem unlikely that you can provide a better service than experienced ISPs, but ISPs are catering to a mass market and tend to charge a lot for special services. Some web applications just don't fit into the general ISP scheme and installing your own web server on a Linux box, could prove to be a cost effective way of getting the Internet services you need.

Making basic documents available is fairly straightforward. First, install a Linux distribution containing Apache. You can check that the server is up by pointing a browser (Mozilla, Konqueror, etc. on the same machine as the server) at http://localhost/. You should also be able to reach it remotely by the machine's name; for instance, if the machine's domain name is penguin.org, then the URL http://penguin.org should work.

Next, look for the DocumentRoot directive in the server's configuration files (httpd.conf or srm.conf). It shows the server's main document directory, which on SuSE is /usr/local/httpd/htdocs/. A file placed there (say, file.html) will appear at the topmost level, for example http://localhost/file.html
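For reference, the relevant lines in httpd.conf look roughly like this; the path shown is the SuSE default used later in this article, and the Directory options are only an illustrative sketch:

DocumentRoot "/usr/local/httpd/htdocs"

<Directory "/usr/local/httpd/htdocs">
    Options Indexes FollowSymLinks
    AllowOverride None
</Directory>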

By default the server also looks in users' home directories for public HTML directories and makes these available on the web server. For instance, suppose you have a user john with the home directory /home/john. Place some files in /home/john/public_html and they will become available at http://localhost/~john/
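For example, as the user john, something like this is enough for a quick test (the sample page is just a placeholder):

mkdir /home/john/public_html
echo '<html><body>It works!</body></html>' > /home/john/public_html/index.html

If the page does not show up at http://localhost/~john/index.html, check that /home/john and public_html are readable and searchable by the web server's user.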

Next, we will look at a popular method of producing dynamic content based on a web user's input to a form. Normally, web users looking at your site will only see static HTML pages. The web scripting language PHP allows web pages to be built "on the fly" according to user input. PHP is very powerful and, as programming languages go, pretty easy to learn.

You will no doubt be aware of PHP scripts and their typical uses for processing forms and serving up dynamic content from databases. To make PHP scripts work, just make sure PHP is installed, along with Apache, when you set up Linux on your machine. On the server, you will also need to create a web directory to hold the scripts.

Create a "php" directory under your main web server directory (in SuSE 8.0 that's /usr/local/httpd/htdocs). You can also use your ~/public_html directory, if you like. That's what I'll use in my example. In this case it would be /home/rreilly/public_html/php.

Then you can enter the following text for your first PHP script. Call it inputform.php.
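A minimal sketch along these lines should work; the form field name ("username") is my own choice and simply has to match the HTML form in the next step:

<?php
// inputform.php - greet the visitor by the name submitted from form1.html
// the field name "username" is arbitrary, but it must match the form
$name = isset($_POST['username']) ? $_POST['username'] : 'stranger';
echo "Hello World, " . htmlspecialchars($name);
?>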

You'll also need an HTML file to go along with it. Put this file (call it form1.html) in the same directory as your PHP script.
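A bare-bones form like the following will do; the only things that matter are that the action points at the script and the field name matches it:

<html>
<body>
<form action="inputform.php" method="post">
Your name: <input type="text" name="username">
<input type="submit" value="Submit">
</form>
</body>
</html>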

Call up the form1.html file in your browser (http://localhost/~rreilly/php/form1.html in this example, or http://localhost/php/form1.html if you used the main document directory), enter your name and click on "Submit". You should see "Hello World, (your name)" in the browser window.

That's pretty much how PHP scripts work. Nothing complicated.

Web server security, especially with PHP, is another topic that you should investigate. Of course, it's not possible to cover it adequately in this article. Pick up a good book on PHP if you really want to start learning the language. I've found Larry Ullman's Visual Quickstart Guide "PHP For The World Wide Web" to be an easy-to-read and very useful basic text. Then look for some current how-to articles on the Web about security and PHP to fill in the security gaps.

Once you start getting some traffic on your web site, you will begin asking yourself questions. What files are people downloading? Which area of the site is the most popular? How many megabytes were transferred last month?

To answer questions of this type, you need to look at your log files. The server generates at least two of them (you can tell it to split the data into more): there is always an 'error' log and an 'access' log, located in the /var/log/httpd directory.

The error log records failed attempts to retrieve files from the server. For instance, if a user makes a typing mistake, the misspelled URL will show up in the error log.

The access log is a list of the URLs that were successfully retrieved. Both logs contain dates, the number of bytes transferred and some information on where the request was made from. There is a powerful program for analyzing these logs under Linux and other UNIX-derived systems, called analog. The simplest way to use it is to install the package and then type:

analog >report.html

Look at report.html with a web browser, and then play around with analog. This program can be configured to extract information in every conceivable way from the server's log files.
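For example, a few lines in analog's configuration file (commonly /etc/analog.cfg, though the exact path depends on the package) point it at the Apache access log and name the report; the values here are only placeholders:

# read the Apache access log, write the report, label it with the site name
LOGFILE /var/log/httpd/access_log
OUTFILE report.html
HOSTNAME "penguin.org"

Run analog again after editing the file and it will pick up the new settings.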

Apache is a reasonably fast server. It can saturate a 10 Mb/s line using only a low-end Pentium under normal conditions. But it is possible to tweak a little more performance out of it. Here are some things to try:

* In the file httpd.conf, change the value of MaxRequestsPerChild to 10000
* Add Options +FollowSymLinks to your directory sections (see the sketch after this list)
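In httpd.conf, those two tweaks might look something like this (the directory path is just the SuSE document root used earlier):

MaxRequestsPerChild 10000

<Directory "/usr/local/httpd/htdocs">
    Options +FollowSymLinks
</Directory>

Allowing symlinks to be followed spares Apache some extra checks on every request, and a higher MaxRequestsPerChild means child processes are recycled less often.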

Available RAM

Admittedly, this is a no-brainer.

Adding RAM to your machine can have a major effect. RAM is thousands of times faster than even the best hard disk. Because of this, the Linux system uses RAM to cache recently opened files. Apache will then be able to service requests faster. So even without altering any Apache config directives, after you add more RAM to your Linux box, it will be faster.

We have just scratched the surface of using the Apache web server to distribute information via web browsers. If you want to know more, take a look at the excellent documentation that comes with Apache. How far you go is limited only by your imagination and time. Putting together a couple of small web servers in your company is an excellent way to learn the technology.

Tuesday, August 01, 2006

Has Linux patching surpassed Mac and Windows?

This may come as a shock, but is it possible that Linux patching has surpassed that of the Mac and Windows operating systems? Recent vulnerabilities in Flash and Firefox that affect multiple operating systems highlight a weakness in the Mac and Windows auto-update processes, because those are primarily focused on patching Apple- and Microsoft-specific issues.

Most modern Linux distributions, on the other hand, such as Red Hat, SuSE and Ubuntu, have automatic update mechanisms that patch across the entire spectrum of installed software, since Linux by its very nature is made up of a collection of applications from different sources.

Most regular users don't really think about the patching process and can't possibly keep up with all the security advisories. Take the recent report of a critical flaw in Macromedia Flash: I would bet that the average computer user still hasn't patched it, and won't until some mechanism forces the update.

The Windows and Mac update mechanisms will not bother with this particular vulnerability, but Red Hat has already released a patch as part of the regular Linux update process. Microsoft has released patches for Macromedia Flash in the past, but only because it was the version bundled with Internet Explorer. Windows Update will not address this particular Flash vulnerability; technically that isn't Microsoft's fault, but it's still a very serious problem for Windows users, one that can lead to complete system compromise.

Microsoft has made some effort to consolidate the patch process for all Microsoft products with its Microsoft Update site, but this only addresses part of the problem for most Windows users. I'm not necessarily blaming Microsoft and Apple for not dealing with vulnerabilities from third-party software vendors, since they can't be held legally responsible for someone else's software, but the major Linux distributions have already made the effort to consolidate the update process. At the very least, it's an opportunity for Microsoft and Apple to make life easier for their users.

Perhaps what is needed is a centralized location for approved third-party vendors to provide their latest critical updates within the Windows and Mac update systems, which should at the very least include common software such as Macromedia Flash and maybe even Mozilla Firefox. Then let users opt in or out of third-party patches within the regular auto-update mechanism. Even a notification system for third-party vulnerabilities would be better than nothing. Without this, the average Windows and Mac user will simply leave the door to third-party applications wide open for hackers to exploit.

Tuesday, July 25, 2006

Open Source Software (OSS)

Open Source Software (OSS) comes with a license that gives the end user the right to use it freely for private or commercial purposes. You also have the right to inspect and even modify the underlying source code. You can give away or sell the original version you received, or one with your modifications, provided that (under copyleft licenses such as the GNU GPL) you in turn pass on the modified source code so that others can benefit from the changes you have made. This last clause protects the developers' work from unfair exploitation by others, while keeping the source code available to the community. You are not required to pay royalties to previous developers, but you are still permitted to charge money for OSS. A Linux disc, for instance, may have been given to you by a friend or sold to you for a small fee; both are permitted.

If you have no interest in source code, you may wonder why its availability matters. One answer is that releasing the source allows outside observers to inspect how a program really works, which means you can be confident that it treats your private data with respect. A real problem with proprietary software is that your data can be locked into a proprietary file format, which means you may eventually be forced to upgrade to newer versions of that software just to retain access to your own data. This does not happen with OSS: when the source code for opening and saving files is available, a third party can write an import filter for the next generation of software, ensuring that your data will always be accessible.

Finally, the release of source code has in some cases spawned large communities of volunteer developers who have in turn provided the world with highly useful, entirely free software such as Linux, OpenOffice, and Mozilla. These are available free of charge to schools or anyone else who may not have a large budget for software. So you can see that the freedom of software is important for everyone, not just software developers.