Linux has been with us in one form or another for a very long time now. The first ‘real’ distribution of Linux was released over 16 years ago – in dog years that’s 112, and in computer years that’s like … forever.
The first version of Linux that I remember was Yggdrasil, which was released (according to Wikipedia) in December 1992. However, the first one that was really usable was Slackware, released in 1993 (again, according to the ‘Pedia).
Back then, the whole Linux thing was purely for techies.
I was a young programmer who was caught in the horrors of coding for DOS 3.3x and Windows 2, when a friend at work showed me this new operating system that was available for the Atari ST. As someone who was both an Atari ST owner and a total nerd, I thought this MINIX thing sounded fantastic, and got a copy from my friend. Making it boot from my ridiculously expensive 20MB hard disk, rather than booting from floppy and then mounting the hard disk, took me weeks – but once it was done I could settle down and look at how to write device drivers, and how to use Unix-like system calls. There was no internet available to mortals back then, and the ‘man’ system had not yet arrived in Minix-land.
Fast forward a couple of years, and I was a young programmer caught in the horrors of coding for DOS 3.3x and Windows 3.1, as well as venturing forth into the exciting new world of Windows NT. My friend showed me Yggdrasil Linux running on a PC. This had its advantages: PCs could by then be built from parts, and those parts had become inexpensive enough that you could put a system together for relatively little money.
So I built my first Linux machine.
By now it was 1994, and I was using a 486/66 with 4MB of RAM and a 160MB hard disk.
I was a programmer of seven years’ experience then, and had practically taught myself everything I knew. I was fascinated by operating systems, and what made them work. I was enthralled by this new Linux stuff, and subscribed to the Walnut Creek CDROM monthly distributions (by snailmail!).
So Linux was available and usable by tech-nutcases in 1994. Since then, distributions have come and gone: new ones have popped up, old ones have fallen by the wayside, and some of the newer ones have disappeared just as quickly.
The World of Microsoft
“One Microsoft Way. It’s not just a creed, it’s their actual address.”
In 1995, Microsoft released their first “consumer” 32-bit operating system: a horribly bastardised amalgamation of a new-fangled 32-bit presentation layer, modelled on Windows NT’s Win32 API, bolted on top of a crufty 16-bit core. They called it Windows 95, and it largely sucked.
We knew it sucked, and they knew it sucked. But they marketed it well, and it was still WAY better than 16-bit Windows, and so it took off. A few years later they released Windows 98, which was much better, but still had some 16-bit skeletons in the closet.
In parallel, Microsoft’s “professional” operating system was Windows NT, which by 1996 was up to version 4 (they “cheated” by releasing the first version as “3.1”). However, NT4 was a hungry beast: it ran best on the fastest, most expensive CPUs, and its RAM requirements were … excessive for the day.
Microsoft had been working to make Windows NT acceptable to the consumer, and this effort was largely focused on the resource requirements, as well as its rather dismal graphics performance. My memory tells me that the plan was to release NT version 5 to the consumer market at the same time as to businesses – but in the end the graphics (in particular the DirectX layer) were not ready. So they released Windows 2000 (NT v5.0) to businesses only in early 2000, and then released NT v5.1, with DirectX and a new front-end shell, as Windows XP in October 2001.
Windows XP is probably the most popular operating system of all time, and I would hazard a guess that almost everyone in the developed world has used it at some point.
Since then we had the unmitigated disaster that was Windows Vista (NT v6.0), and Microsoft returned to sanity again when they released Windows 7 (NT v6.1 … WTF?) in October 2009.
Windows 7 is a Good OS, and a vast improvement over previous versions. It has come a long, long way since Windows 3.1 and the first version of Windows NT back in 1993.
From Apple’s Point of View
If we backtrack again to 1994, while I was playing around with Linux in my spare time, and programming for Windows 3.1 and Windows NT … those clever people at Apple were releasing the PowerPC version of their own operating system (MacOS as it is called now, but then it was System 7).
Apple were in the midst of switching the hardware on their desktop computers away from the Motorola 680x0 range of CPUs to the IBM/Motorola PowerPC (a 32-bit RISC CPU). For the most part, the average consumer did not notice, apart from having to upgrade some of their software to a version that would run natively on the new hardware.
Things progressed nicely for a while. Then, after the total mess that was Copland, and the rumours that they were going to buy Be (with its innovative BeOS), Apple announced in 1997 that they had bought Steve Jobs’ company NeXT, and that the next big release of their OS would be based on NeXTSTEP – a Mach-based kernel with a BSD Unix layer above it, and a friendly UI at the top, which for the most part is all the consumer ever sees.
Since then, MacOS X (as the new system was called) has gone through a public beta and seven major releases, 10.0 to 10.6. The hardware platform was changed again, this time to Intel, and anyone who buys a Macintosh computer these days has the option of running either MacOS X or Windows (XP, Vista or Windows 7).
I contend that MacOS X is the most advanced consumer OS out there, especially with the new OpenCL stuff and the amazing Grand Central Dispatch, both included in v10.6 of the operating system. As a programmer, I get excited just reading about GCD and how simple it is to use.
MacOS X, along with its sister OS that runs on the iPod Touch, the iPhone and now the iPad (basically a cut-down version of OS X), has come a long way from the co-operative multitasking and limited RAM of System 7 back in the early 90s.
And … back to Linux
So, Windows and MacOS are the two real ‘mainstream’ consumer operating systems, and both have been around for a very long time. The original Macintosh was introduced at the beginning of 1984 and its OS has been growing since. The first version of Windows (actually v1.01) was released at the end of 1985, and it has improved dramatically (albeit with some hiccups along the way – Windows ME or Vista, anyone?) since.
Linux has been around for a very long time, too. Since its inception in 1991, it has turned into a mainstream server OS, with much of the internet’s server infrastructure running on Linux or FreeBSD. Several home appliances have a Linux kernel in them (TiVo runs on a Linux core, as do several home WiFi routers), and some netbooks have limited RAM, a tiny hard disk (or SSD) and run a Linux kernel with a custom UI on top of it.
But let’s be honest: have you ever tried to set up a Linux machine? Have you?
Even the most modern Linux distributions, the ones aimed squarely at consumers, become a command-line and config-file nightmare if you want to do something as esoteric as plugging a WiFi dongle into your machine.
The Macintosh has been pretty much plug-and-play forever, and Windows 95 was heralded as Microsoft’s first foray into the world of plug-and-play (although I contend that they only got it right with Windows 2000). But those operating systems are at least fifteen years old. Where’s Linux’s plug-and-play?
If you change the video card on your Linux box, you are pretty much going to have to delve in yourself to make it work – assuming you can get drivers for it.
Yes, the Apple Mac is a more closed platform, and only certain hardware will work for it – but when it does work, you really do just plug it in and then use it.
Windows is not quite as simple, since the range of supported hardware is vastly greater (i.e. pretty much everything) – but because it is by far the most popular consumer OS out there, you can usually download a driver, plug your hardware in, and it works.
But with Linux, even if the hardware is supported, and there are drivers that work, you can quickly find yourself in dependency hell if the drivers were compiled against a different version of the C runtime (or gettext, or similar) from the one you happen to have on your machine.
Linux has come a long long way since its inception, but right now it is most useful as a server OS, where the people who use it (and have to maintain it) are basically IT professionals, or computer enthusiasts.
Linux for Consumers
Do you know anyone (yourself included) who uses Linux as their main desktop operating system? Chances are you know a few.
Do you know anyone (yourself included) who uses Linux as their main desktop operating system who either does not have a degree in something to do with computers (programming, computer science, or similar) or does not work as a professional programmer (or IT support person or similar)? I am guessing … not.
Do you know anyone (yourself included) who uses Linux as their main desktop operating system, who would recommend to a member of their own family that they should also use Linux as their main operating system? Never in a month of Sundays – because they know they will end up trying to support it over the phone, and eventually, after trying to explain what “vi” is, will have to drive over with their own laptop and spend an afternoon undoing something someone did (which will always be “nothing” when asked) to something somewhere that stopped the machine connecting to the internet.
Despite what most programmers think, the real test of whether a system (any system – be it an OS or a simple program) is really ready for use is not whether they can install and use it every day themselves, but whether or not they would be prepared to give it to a family member to use every day.
In over 16 years, both Windows and MacOS have gone through many generations, and both have been through at least one total rewrite from the ground up. Linux, on the other hand, has steadily progressed with thousands of enthusiastic developers – and in my opinion is still nowhere near ready for use as a mainstream consumer OS.