Original Link: https://www.anandtech.com/show/2816
Revisiting Linux Part 1: A Look at Ubuntu 8.04
by Ryan Smith on August 26, 2009 12:00 AM EST – Posted in: Linux
Back in the early part of 2008 we decided that we wanted to take a fresh look at Linux on the desktop. To do so we would start with a “switcher” article, giving us the chance to start anew and talk about some important topics while gauging the usability of Linux.
That article was supposed to take a month. As I have been continuously reminded, it has been more than a month. So oft delayed but never forgotten, we have finally finished our look at Ubuntu 8.04, and we hope it has been worth the wait.
There are many places I could have started this article, but the best place to start is why this article exists at all. Obviously some consideration comes from the fact that this is my job, but I have been wanting to seriously try a Linux distribution for quite some time. The fact that so much time has transpired between the last desktop Linux article here at AnandTech and my desire to try Linux makes for an excellent opportunity to give it a shot and do something about our Linux coverage at the same time.
After I threw this idea at Anand, the immediate question was what distribution of Linux we should use. As Linux is technically just an operating system kernel – or more colloquially, the combination of the Linux kernel and the GNU toolset (hence the less common name GNU/Linux) – there is a wide variety of actual distributions out there. Each distribution is its own combination of GNU/Linux, applications, window managers, and more, forming a complete operating system.
Since our target was a desktop distribution with a focus on home usage (rather than being exclusively enterprise focused), the decision was Ubuntu, which has established a strong track record of being easy to install, easy to use, and well supported by its user community. The Linux community has a reputation of being hard to get into for new users, particularly when it comes to getting useful help that doesn't involve being told to read some esoteric manual (the RTFM mindset), and this is something I wanted to avoid. Ubuntu also has a reputation for not relying on the CLI (Command-Line Interface) as much as some other distributions, which is another element I was shooting for – I may like the CLI, but only when it easily allows me to do a task faster. Otherwise I'd rather avoid the CLI when a GUI is a better way to go about things.
I should add that while we were fishing for suggestions for the first Linux distro to take a look at, we got a lot of suggestions for PCLinuxOS. On any given day I don't get a lot of email, so I'm still not sure what that was about. Regardless, while the decision was to use Ubuntu, it wasn't made without considering other distributions. Depending on the reception of this article, we may take a look at other distros.
But with that said, this article serves two purposes for us. It's first and foremost a review of Ubuntu 8.04. With 9.04 being out, I'm sure many of you are wondering why we're reviewing anything other than the latest version of Ubuntu. The short answer is that Ubuntu subscribes to the "release early, release often" mantra of development, which means there are many versions, not all of which are necessarily big changes. 8.04 is a Long Term Support release; it's the kind of release most comparable to a Windows or Mac OS X release. This doesn't mean 9.04 is not important (which is why we'll get to it in Part 2), but we wanted to start with a stable release, regardless of age. We'll talk more about this when we discuss support.
The other purpose for this article is that it's also our baseline "introduction to Linux" article. Most components of desktop distributions do not vary wildly between distros, so much of what we talk about here will be applicable in future Linux articles. Linux isn't just Ubuntu, but matters of security, some of the applications, and certain performance elements are going to apply to more than just Ubuntu.
Background
I think it's impossible to offer a purely objective review of an operating system – qualitative matters like the GUI and nebulous concepts like "ease of use" can't be measured. There is a degree of subjectivity in such a review, and I believe it's important to relate that in this article. To that extent, a bit of background on myself will probably be helpful in relating my point of view on matters before jumping into Ubuntu. This section was written prior to my even touching Ubuntu, so that it reflects my expectations rather than my experience.
Based on the computers I have owned and the operating systems I have used, I would best be classified as a Windows user. Like many of our readers (and our editors) I have lived the Microsoft life, starting from DOS and going straight through to Vista. I have clocked far more time on Windows than anything else, and it's fair to say that's where my skills (troubleshooting and otherwise) are strongest.
With that said, I am by no means limited to just a single OS. As was customary for most American schools in the 90s, I had access to the requisite Apple IIs and Macintoshes. But to be frank, I didn't care for Mac OS Classic in the slightest – it was a remarkable OS in 1984, and even in 1993 in the age of Windows 3.1, but by the time Windows 95 rolled around it was more of a nuisance to use than anything else. In a cruel joke, when I started working in IT in 2001 I was tasked with using the newly released Mac OS X 10.0 "Cheetah" full-time to gauge its readiness for use on the organization's Macs.
Apple didn't ship Mac OS X as the default OS on their Macs at that time, which should tell you a lot. Nevertheless, while I abhorred Mac OS Classic, Mac OS X was far more bearable. The interface was better than anything else at the time (if a bit too shiny), application crashes didn't (usually) take out the OS, and the Terminal was a thing of beauty. Sure, Windows has a command line environment, but it doesn't compare to the Terminal. Mac OS X was a mess, but there were nuggets to be found if you could force yourself to use it.
I'll save you the history of Mac OS X, and we'll pick up in 2004, where Apple had improved Mac OS X a great deal with the release of 10.3 “Panther.” At this point I was a perfectly happy Mac user for my day job, and I probably would have used one at home too if it wasn't for the hefty price of a Mac and the fact that it would require having an entirely separate computer next to my gaming PC. A bit later in what was probably a bad idea, I convinced Anand to try a Mac based on the ease of use and productivity features. This resulted in A Month With A Mac, and he hasn't left the platform since.
Finally we'll jump to the present day. I'm still primarily a Windows user since I spend more time on my desktop PC, while my laptop is a PowerBook G4. I would rather be a Mac user, but not a lot has changed in terms of things preventing me from being one. To replace my PC with a Mac would require throwing down money on a workstation-class Mac Pro that is overkill for my processing needs, not to mention my wallet.
I also am not a fan of dual-booting. Time booting is time wasted, and while I am generally not concerned about boot times, dual booting a Mac would involve rebooting my desktop far more often than the occasional software installation or security update currently requires. It also brings about such headaches as instant message logging being split in two places, difficulty accessing documents due to file system/format differences, and of course the inability to simultaneously access my games and my Mac applications. In theory I could game from within Mac OS X, but in reality there are few native games, and solutions like the Parallels virtual machine and the Mac branch of Wine are lacking in features, compatibility, and performance.
I also find the Mac to be a weak multimedia viewing platform. I'll get into this more once we start talking about multimedia viewing under Ubuntu since much of the underlying software is the same, but for now I'll say that libavcodec, the standard building block for virtually all *nix media players, is particularly lacking in H.264 performance because the stable branch is single-threaded.
So while I'm best described as a Windows user, a more realistic description would be a Windows user that wants to be a Mac user, but can't bear to part with Windows' games or media capabilities.
As for my experience with Linux, it is not nearly as comprehensive. The only time I ever touched Linux was in college, where our department labs were Dells running Linux and the shell accounts we used for assignments were running off of a small Linux cluster. I never touched the Red Hat machines beyond quickly firing up Netscape Navigator to check my email; otherwise the rest of my Linux usage was through my shell account, where I already had ample experience with the CLI environment through Mac OS X's terminal.
My expectations for Ubuntu are that it'll be similar to Mac OS X when it comes to CLI matters - and having already seen screenshots of Ubuntu, that the GUI experience will be similar to Windows. I am wondering whether I am going to run into the same problems that I have with Mac OS X today, those being the aforementioned gaming and multimedia issues. I have already decided that I am going to need to dual-boot between Ubuntu and Vista to do everything I want, so the biggest variable here is just how often I'll need to do so.
It's Free – Gratis
When doing the initial research for this article, one of my goals was to try to identify all of the reasons why I would want to use Ubuntu. While there are many reasons, a lot of them are what amount to edge cases. At the risk of being accused of shortchanging Ubuntu here, after using Ubuntu for quite some time the main reasons came down to this: It's free, and it's secure. That's it. Many of the popular Linux applications can be found for Windows, non-gaming performance largely isn't a concern on a high-end desktop such as mine, and no one is making any serious claims about ease of use when compared to Mac OS X. Ubuntu is free and Ubuntu is secure, but that's about it.
We'll start with “free”, since that's one of the fundamental subjects. When we say Ubuntu is free, there are two elements to that. The first is that Ubuntu costs nothing; it is free (gratis). The second is that Ubuntu's source code is open and can be modified by anyone; it is free (libre). This is expressed in the popular and simplified slogans of “free as in beer” and “free as in speech.” Many software products are freeware (e.g. Futuremark's PCMark) but fewer products are open source. The former does not necessitate the latter or vice versa, although practical considerations mean that most open source software is also freeware in some fashion since you can't keep people from compiling the source code for themselves.
There's fairly little to explain with respect to Ubuntu being freeware. It can be downloaded directly from the Ubuntu website in the form of an ISO disc image, and copied, installed, and wiped as many times as anyone would like. Ubuntu's corporate sponsor/developer Canonical also sells it for a nominal price (currently it's listed on Amazon for $12), but there is no difference between the retail version and the download version. It's a free operating system, and free is a very good price.
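For anyone grabbing the ISO, verifying the download is a quick command-line affair; the filename below is illustrative, and the matching MD5SUMS file is published alongside the image on Ubuntu's release servers:

    # Hash the downloaded disc image (filename illustrative)
    md5sum ubuntu-8.04-desktop-i386.iso
    # Compare the output against the MD5SUMS file for the release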
Being free does mean giving up some things that would normally come with purchased software. Official support is the first casualty: since it's a free OS, no one is being paid to support users. We'll dive into support in-depth in a bit, but for now it's enough to remember that Ubuntu does not come with official support. Support options are limited to the Ubuntu Knowledge Base, the forums, and whatever additional help can be found on the internet.
There's more to being able to offer Ubuntu for free than just not offering official support. Incidental expenses of assembling and distributing Ubuntu are covered by Canonical, which expects to eventually make a profit from Ubuntu by selling enterprise support. Development of Ubuntu and the underlying Linux components is done by a variety of volunteers working in their spare time, along with paid employees from companies such as Novell, IBM, and Red Hat, who use Linux in their commercial products and have a vested interest in its development.
However - and this is where we're going to take a bit of a detour - there is also the matter of who is not paid because Ubuntu is free. The United States patent system allows for ideas and methods to be patented, along with the more typical physical devices. What this means is that everything from encryption methods to video codecs to file systems can be and are patented by a variety of companies. As a result a lot of technologies in common use are patented, and those patents must be licensed for use when it comes to the United States (and many other countries with similar patent systems). Ubuntu includes software that uses patented material, but since Ubuntu is free, no one is paying those license fees.
Now I want to be very clear here that the reason I bring this up is because it's interesting, not because it's a problem. The chief example of where patents are an issue is media playback. MP3, MPEG-2, H.264, AAC, and other common formats have paid license requirements. This directly rears its head for the user when you first fire up Ubuntu's movie player and attempt to play a movie using a patented codec. Ubuntu goes to great lengths to point out that it needs to use a patented codec to play the material, and that unless the user has a valid license for the codecs it may be a patent violation to play the material, ultimately giving the user the option to download what Ubuntu calls the "restricted" codec set that is not distributed for legal reasons.
With that said, the legal issues are entirely theoretical for the end user. While using the restricted codecs is technically a patent violation, to our knowledge no individual has ever been sued or otherwise harassed over this, and we don't expect that to ever change. The licensing bodies like MPEG-LA are concerned with commercial products using their property – if someone is making money from their property, they want a piece of it. They are not concerned with home use of their codecs, and quite frankly users have nothing to be concerned about.
It should also be noted that Ubuntu (and other Linux distros) are not alone in this. VLC, Media Player Classic, various Windows codec packs, and many other free media players are also technically in violation of patent law for the same reasons. Even if someone is a Windows user, there's still a good chance they're violating patent law. For all practical purposes it's very hard to avoid being an IP violator, no matter the platform.
Meanwhile, for those that absolutely must stay on the right side of the law, there are options, but it's not pretty. Canonical sells licensed software packages that can play back most media formats: Cyberlink's PowerDVD Linux for DVD playback, and the Fluendo Complete Playback Pack for everything else. However the price may be shocking: being legit is going to cost you $50 for PowerDVD and another $40 for Fluendo. This makes for a small but notable difference from Windows and Mac OS X. It's hard but not impossible to be both free and legitimate on those platforms through legal software that is given away for free – Winamp, QuickTime, DivX, and Flip4Mac all fall under this umbrella. Again, this makes no practical difference – no one who's holding a patent cares – but it's something any Ubuntu user trying to play back media is going to have to pay attention to for a fleeting moment.
Ultimately, the important bits to take away from this are that Ubuntu is free as in beer, and for the price you're only giving up official support. There are some patent issues, but since no one on either side actually cares, it doesn't matter. If nothing else, Ubuntu will be the best-priced operating system you will ever use, and price matters.
It's Free – Libre
While the value of "free as in beer" is easy to describe, the value of "free as in speech" – otherwise known as libre – is harder to relate. Nonetheless, rather large books have been written on the subject, so we'll try to stick with something condensed.
Virtually everything distributed with Ubuntu is an open source program in some manner. Many of the components of Ubuntu, such as the Linux kernel and the GNU toolset, are licensed under the GNU General Public License (GPL), which in a nutshell requires that any software distributed under the GPL either include the source code with the software or provide a way to get it. Other bits of Ubuntu are under slightly different licenses with slightly different legal requirements, but the outcome is effectively the same: Ubuntu is free – you can get the source code to it and modify/distribute it as you see fit.
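To make that concrete, grabbing the source for nearly any package on an Ubuntu system is a single command. A minimal sketch, assuming the deb-src entries in /etc/apt/sources.list are enabled:

    # Download and unpack the source code for a package (here, the Totem media player)
    apt-get source totem
    # The licensing terms ship right alongside the code
    less totem-*/debian/copyright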
But when we're talking about Ubuntu, there's more than just being able to access the source, as most of the development teams that are responsible for the programs included in Ubuntu have their projects open for public participation. So not only can you take the code and modify it, but if your modifications are good enough they can be submitted back to the main project and possibly be included in a future version of the software. The fundamental idea of open source software is that users are empowered to see how software works and to modify it as they see fit. Other lesser benefits also exist, such as protecting authors' rights by preventing people from taking the code and improving it without sharing it (the GPL), and making sure all the authors are properly credited.
This does not always make open source relevant for the user however. The fundamental benefits of open source software are for people that are programmers, but most users are not programmers. Being able to see and edit the code is not necessarily useful if you don't know how to use it. Even with a background in programming, I would be hard pressed to be able to quickly contribute significant code changes to most projects; very few programs are small and simple enough to be able to easily jump into these days.
Still, there are some definite benefits for those of us that can't throw out code like Linux's chief architect Linus Torvalds. The most direct benefit of course is that this software exists at all. Since all of the software in Ubuntu is free as in beer, paid developers do not develop many of the programs. Open source as a default state makes it easier for people to contribute to the development of software, and that means it's easier for such gratis software to be continually developed in the first place.
Open source software is also a benefit for the longevity of software. Since no one person has absolute control over a project, no one can terminate it. This means that someone else can pick up a project and continue it should the original developer(s) quit, as is sometimes the case with old software. It also allows for software to be forked, which is to take the code from a project and create a derivative separate from the original project – the benefit being that a forked project can be taken in a different direction than the original developers may want. As proof of the importance of forking, a number of programs in Ubuntu are forks of older projects, among them X.Org, Ubuntu's base windowing system and an implementation of X11 (otherwise known as just X).
Finally, open source software is beneficial to overall software security. If you can see the source, you can analyze it for possible bugs. If you can edit the source, you can fix those bugs rather than wait for someone else to do so – and we can't overstate the importance of this. The direct relevance to the average user is once again limited here since most people cannot read or write code, but it does filter down through benefits such as rapid patching of security vulnerabilities in some cases. The security benefits of Ubuntu being open source are some of the most important reasons we consider Ubuntu to be secure.
In short: even if you can't code you benefit from Ubuntu being a free (libre) operating system.
It’s Secure
Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.
Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user – if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful to be that way. Ubuntu is just as capable as any other desktop OS out there when it comes to catching malware if the user is dedicated enough. The dancing pigs problem is not solved here.
Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two reasons. The first is for practical reasons, and the second is for technical reasons.
To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer to that is “yes” and yet that’s not quite how things work. Or more simply put: when’s the last time you’ve seen a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?
Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.
It's true that they're soft targets – few machines run anti-virus software and there's no other malware to fend off – but that does not seem to be driving any kind of significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this creates, but other than a few proof-of-concept trojan horses, the only time anyone seems to be making a real effort to break into a Mac is to win one.
So I am going to call Ubuntu, with its smaller-still user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and there's a number of years' worth of history with the similar Mac OS X that says that's not going to change. There just aren't any credible threats to be worried about right now.
With that said, there are plenty of good technical reasons too for why Ubuntu is secure; while it may be practically secure, it would also be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward facing services or daemons, which means there is nothing listening that can be compromised for facilitating a fully automated remote code execution attack. Windows has historically been compromised many times through these attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there that manages to be completely exposed to the internet anyhow, hence not having any outward facing services in the first place is an excellent design decision.
Less encouraging among Ubuntu's design choices, however, is that – in part because there are no services to expose – the OS does not ship with an enabled firewall. The Linux kernel does have built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from how Windows Vista ships, which blocks all incoming connections by default. Worse yet, Ubuntu doesn't ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a 3rd party package or configuring it via the CLI.
Operating System | Inbound | Outbound
Windows Vista | All applications blocked; applications can request an open port | All applications allowed; complex GUI to allow blocking them
Ubuntu 8.04 | All applications allowed; no GUI to change this | All applications allowed; no GUI to change this
Mac OS X 10.5 | All applications allowed; simple GUI to allow blocking them | All applications allowed; no GUI to change this
Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly the way I leave Mac OS X – all incoming connections allowed. Nevertheless, I find myself scratching my head. Host-based firewalls aren't the solution to all that ails computer security, but they're a good idea. I would rather see Ubuntu ship like Vista does, with an active firewall blocking incoming connections.
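For those who do want deny-by-default behavior without hunting down a 3rd party GUI, Hardy does ship with ufw (the "Uncomplicated Firewall"), a CLI front-end to iptables. A minimal sketch using the syntax of that era – consult man ufw for the version at hand:

    sudo ufw enable          # turn the firewall on and keep it on across reboots
    sudo ufw default deny    # drop unsolicited incoming connections
    sudo ufw allow 22/tcp    # punch a hole for SSH, if needed
    sudo ufw status          # review the resulting rules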
Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates modern security in Windows, Ubuntu does not have any such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software should not expect free rein of the system when running, and hence should not be a modern vulnerability. This is more an artifact of previous design than a feature, but it bears mentioning as a pillar of total security.
Moving on, there is an interesting element of Ubuntu's design being more secure, but I hesitate to call it intentional. Earlier I mentioned how an OS that doesn't let a user install software isn't very useful, and Ubuntu falls under this umbrella somewhat. Because the OS is based heavily around a package manager and signed packages, it's not well-geared towards installing software outside of the package manager. Depending on how it's packaged, many downloaded applications need to be manually assigned an executable flag before they can be run, significantly impairing a user's ability to blindly run anything they click on. It's genuinely hard to run non-packaged software on Ubuntu, and in this case that's a security benefit – it's that much harder to coerce a user into running malware, even if the dancing pigs problem isn't solved.
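For the curious, here is what that manual step looks like in practice; the filename is purely illustrative:

    # A freshly downloaded program is not marked executable, so it won't run
    chmod +x ./some-downloaded-app.run   # manually set the executable flag
    ./some-downloaded-app.run            # only now will the program actually start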
Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute bit support helps to prevent buffer overflow attacks, and Address Space Layout Randomization makes targeting specific memory addresses harder. The traditional *nix sudo security mechanism keeps software running with user privileges unless specifically authenticated to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu comes with the AppArmor and SELinux security policy features that enable further locking down the OS, although these are generally overkill for home use.
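The day-to-day face of the sudo mechanism mentioned above is simple: commands run with user privileges unless explicitly prefixed, at which point Ubuntu asks for the user's own password rather than a separate root password. For example:

    less /etc/shadow        # permission denied: the password file is root-only
    sudo less /etc/shadow   # prompts for your password, then runs with root privileges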
There's one last issue I'd like to touch on when it comes to technical security measures, and that's the nature of open source software. There is a well-reasoned argument that open source software is more secure because it allows anyone to check the source code for security vulnerabilities and to fix them. Conversely, being able to see the source code means that such vulnerabilities are visible to attackers too – they cannot be obscured from public view.
It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of potential vulnerabilities in various programs included with Ubuntu – Firefox for example has been patched for vulnerabilities seven times now. There are enough vulnerabilities that I don’t believe just counting them is a good way to decide if Ubuntu being open source has a significant impact on improving its security. Plus this comes full-circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it’s my belief that being open source is a security benefit for Ubuntu here, even if I can’t completely prove it.
Because of the aforementioned ability to see and modify any and every bit of code in Ubuntu and its applications, Ubuntu gains a further security advantage: users can manually patch flaws immediately (assuming they know how), and security updates can be pushed out just about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday, and while Microsoft has a good business reason for doing things that way (IT admins would rather get all their patches at once than test new patches constantly), it's not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.
Finally, looking at Ubuntu there are certainly areas for improvement with security. I’ve already touched on the firewall abilities, but sandboxing is the other notable weakness here. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot get hit with drive-by malware downloads, and it has proven to be effective. Both Internet Explorer and Google’s Chrome implement sandboxes using different methods, with similar results. Meanwhile Chrome is not ready for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.
Ubuntu – Long Term Support
One item of particular interest with Ubuntu is their development schedule. Because a typical Linux distribution is composed of many applications from many different parties, the Ubuntu developers do not directly control or develop a lot of the software included in Ubuntu. Furthermore Ubuntu tries to be a complete desktop environment rather than just an operating system, which means it includes a wider variety of software than what’s found in Windows and Mac OS X.
What this amounts to is that Ubuntu needs to both provide future patch support for included applications and compensate for the fact that they don't develop many of these programs. Coupled with this is the fact that 3rd party application development is not necessarily synchronized to Ubuntu's release schedule, and some applications (and the kernel itself) have a rather rapid development rate.
Trying to deal with all of these factors, Ubuntu has settled on two classes of releases. Every 6 months – in October and April – Ubuntu takes what's ready and releases a new version of the OS. For 1st party material this is tied to some goal for the release (such as replacing the audio daemon), while for 3rd party software this may be as simple as grabbing the latest version. This puts regular Ubuntu versions in an unusual position when trying to classify them – a release is significantly more than a Mac OS X point update, still more than a Windows service pack, and yet generally encompasses less than a new version of either OS. At the same time, there's no guarantee that any given release of Ubuntu won't break software compatibility or binary driver compatibility, which puts it up there with major OS releases.
Furthermore, because of the need to provide security updates for all of these different programs in all of these different versions, Ubuntu has a very short support cycle. In that cycle only bug fixes and security updates are issued; software is not otherwise changed, as each release is intended to represent a stable platform. A regular release is only supported for 1.5 years, which means, for example, that support for 7.10 Gutsy, the immediate predecessor to 8.04 Hardy Heron, expired this past April. This pushes new versions of Ubuntu back towards the idea of them being closer to a service pack or a point release. In essence, it's intended that everyone using regular versions of Ubuntu will stick to a relatively rapid upgrade treadmill.
But this obviously doesn't work for everyone, which results in there being two classes of Ubuntu. What we're looking at today, 8.04, is what Ubuntu calls a long term support (LTS) release. Every 2 years a version of Ubuntu is labeled as an LTS release, which entails a much greater effort on the developers' part to support that edition of the OS. The standard support period is 3 years instead of 1.5 years, and for the server edition of the OS that becomes 5 years.
This makes the LTS releases more comparable to Mac OS X and Windows, both of which have long support periods in excess of 3 years. This is also why we're starting with a review of Hardy, in spite of it being over a year old now: it's the current LTS release. Regular short-support Ubuntu releases have their place, but they are not intended for long-term use. Coming from Windows or Mac OS X, an LTS release is the closest equivalent.
Operating System | Mainstream Support | Extended Support
Windows | 5 years | 5 additional years
Ubuntu | 1.5 years | None
Ubuntu LTS | 3 years | None
Mac OS X | So long as it's the newest OS | So long as it's one version behind
Unfortunately, in spite of the LTS designation, not all of the applications in an LTS release are intended to be used for such a long period of time, nor are their developers willing to support them for that long. Take Firefox for example: the last Ubuntu LTS release, 6.06 Dapper, shipped with Firefox 1.5. Mozilla ended support for the 1.5 series shortly after Firefox 2 shipped, and support for 2.x is gone now that 3.x has been out for quite some time. This leaves the Ubuntu developers in charge of supplying security updates for the older versions of Firefox they still support, which, while better than the alternative (no security patches), isn't necessarily a great solution.
The Ubuntu developers have done a good job of staying on top of the matter (they just published a new 1.5 security patch as recently as last month) but it highlights the fact that the Ubuntu developers do not always have the resources to maintain both a stable platform and the necessary security updates. So while an LTS release is supposed to be supported for 3 years, in reality not every component is going to make it that long.
Digging through the bugs list for Dapper and Hardy, I get the impression that these kinds of cracks only occur in less-used software (particularly software that is not part of the default install, such as VLC). Users who need to stick with the base OS for the entire life of an LTS release, but don't mind upgrading a few applications, can go that route and cover all of their bases. Unfortunately this is easier said than done, and we'll get to why that is when we discuss the package manager.
What this amounts to is that if you’re the kind of person that intends to run a computer and an OS for a very long period of time – say on the scale of XP, which turns 8 this year – Ubuntu likely isn’t a good fit for you.
What’s the Value of Technical Support, Anyhow?
Besides patching bugs and security vulnerabilities, the other aspect of “support” is technical support; help for when things go wrong. As I mentioned earlier, Ubuntu is free, and one of the conditions of this is that there is no official technical support for Ubuntu for the user. To be fair, there are some purchasable support options for larger organizations that can afford a support contract, but for the average desktop user this isn’t accessible. So as far as we’re concerned, Ubuntu doesn’t have any official technical support.
I spent quite some time chewing on the idea of just how valuable technical support is. I have never made a technical support call for desktop software, partly because I'm capable of finding and fixing the issue myself through the magic of Google, and partly because calling for technical support seems to be a futile exercise in being fed canned support scripts. So many things can go wrong with software that the person on the other end of the line may not be able to help you, which makes me question the value of technical support for software.
Trying to come up with a resolution for this matter, I posted a poll last year in our forums to get some user feedback. The difference in skills between the people who inhabit our forums and those who merely read our site means that this is not a scientifically valid poll, nor even a fair one; it's greatly biased towards the techie crowd like myself. Nevertheless, I wanted to know who uses technical support when they have it.
I had theorized that the results of the poll would end up reflecting my own views, and this is exactly what happened. When our forum participants were asked if they had ever called Microsoft for technical support with Windows (excluding activation issues), only 9 out of 52 votes – 17.3% – were a "yes." Clearly, among our techie crowd, the majority of users do not use their technical support options.
Based on this, I do not believe that technical support for a software product is valuable for the people most likely to be installing Ubuntu on their own. Or in other words: So what if Ubuntu doesn’t come with technical support? It’s not like most of us would use it anyhow.
I should take a moment, however, to separate software-only technical support from total technical support. It becomes another matter entirely when you can get support for a complete computer from an OEM. They can support both the hardware and the software, and that means they can (probably) help you solve issues when what looks like a problem with one element is really a problem with the other.
The benchmark here is Apple, since they make both their hardware and their software, which puts them a step above Dell and other PC OEMs that are a bit more separated from the software. What I'm getting at is that even if Ubuntu came with technical support, it would be of limited value since they cannot help you with your hardware. If you need real support, you're better off buying a computer from an OEM who can support all that you need (although we should note that even for computers sold with Ubuntu, the OEM does not usually handle the software support…).
Finally, just to throw out an example of how useless technical support can be even when you have it, let’s take a look at Windows (we’d take a look at the Mac, but OS support is bundled with the hardware). Even for a retail copy of Windows, which Microsoft offers direct support for, you only get free technical support for 90 days after activation. After that you’re out $59 per incident. It’s effectively installation and post-installation support, not support for continuing use.
In the end, not only would technical support likely be of little benefit to most people once they're past the installation process, but there's no real precedent for offering technical support on just the OS. So while there's no technical support for Ubuntu, it ultimately doesn't matter, because no one else provides cheap extended technical support for just their OS either.
A Word on Drivers and Compatibility
As we mentioned earlier, Ubuntu and the Linux kernel are open source projects, licensed primarily under the GPL. In large part due to the philosophies of the GPL, Linux handles drivers in a notably different fashion than Mac OS X and Windows.
In a nutshell, the developers of the Linux kernel believe in the open source movement and wish for all related software to be open source. Furthermore, they do not like the implications of attaching a closed source "binary blob" driver to the Linux kernel, because if something goes wrong inside a driver they do not have the code for, it can be impossible to debug. As such they have moral and technical objections to the Linux kernel supporting external drivers, and actively discourage the creation of such drivers. This is done through mechanisms such as not having a fixed API for external drivers, and by not holding back kernel changes that would break external drivers. Drivers that they do have the code for can usually just be recompiled against the new kernel and are unaffected as a result. The net result is that "binary blob" drivers are systematically opposed.
For the most part, this works fine. Not all hardware is supported under Linux because not everyone is willing to share the specifications and data needed to make a driver, but more than enough device manufacturers are willing to share such data that Linux generally supports non-esoteric hardware quite well. There is one class of notable hold-outs here however, and that’s the GPU manufacturers, namely ATI and NVIDIA.
Compared to other drivers, GPU drivers are different for two reasons. First is the sheer complexity of the drivers: besides interfacing with the hardware, the drivers are responsible for memory management, compiling/optimizing shader code, and providing a great deal of feedback. This in essence makes GPU drivers their own little operating system – one that their developers aren't necessarily willing to share. The second significant difference is that, because of the above, GPU drivers are among the only drivers that have a compelling reason to be updated regularly; they need to be updated to better support newer games and to fix bugs in the complex code that runs through them.
Complicating matters further is that some of the intellectual property in GPUs and their drivers is not the property of the company that makes the GPU. AMD doesn't own everything in their Universal Video Decoder, and just about everyone has some SGI IP in their drivers. Since that IP must be protected, it is difficult to release the code for drivers containing other companies' IP.
Because of all of this, manufacturer-supplied GPU drivers are not always open source. Intel and S3 do well in this respect (largely because they have few tricks to hide, I suspect), but hyper-competitive NVIDIA and AMD do not. AMD has been looking to rectify this, and back in 2007 we discussed their starting work on a new open source driver. Development has been progressing slowly, and for the R6xx and R7xx hardware, the open source driver is not yet complete. Meanwhile NVIDIA has shown no real interest in an open source driver for their current hardware.
So if you want to use a modern, high-performance video card with Linux, you have little choice but to also deal with a binary blob driver for that card, and this becomes problematic since, as we mentioned, Linux is designed to discourage such a thing. Both AMD and NVIDIA have found ways around this, but the cost is that installing a binary driver is neither easy nor bug-free.
The fundamental method that both use for accomplishing this is a kernel shim. Each analyzes the headers of the installed kernel to identify how that kernel is organized, then compiles a shim against it. One end of the shim absorbs the lack of a stable kernel API, while the other end provides the stable API that NVIDIA's and ATI's closed-source code needs.
Ubuntu in particular takes this one step further and, in the interest of promoting greater out-of-the-box hardware compatibility, includes a version of the binary drivers with the distribution. This is unusual for a Linux distribution and has earned Ubuntu some flak since it's not strictly adhering to open source ideals, but it also means that we were not forced to play with driver installers to get Ubuntu fully working. Ubuntu had no issues with either our AMD 2900XT or NVIDIA 8800GTX cards, both of which were picked specifically because they are old enough for Ubuntu to have shipped with support for them. With that said, the drivers Ubuntu includes are understandably old (once again owing to the idea of a stable platform), which means we can't avoid installing drivers if we want better performance and application compatibility.
And this is where “easy” comes to an end. We’ll first start with AMD’s installer, the easier of the two. They have a GUI installer that puts in a driver along with a Linux version of the Catalyst Control Center. It’s Spartan, but it gets the job done.
NVIDIA on the other hand does not have a GUI installer – their installer is a text mode installer that requires shutting down the X server (the GUI) in order to install. It's difficult to overstate just how hard this makes driver installation. Not only is doing all of this completely non-obvious, but it requires interfacing with the CLI in a way we were specifically trying to avoid. It's something that becomes bearable with experience, but I can't call it acceptable.
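For a sense of what's involved, the procedure looks roughly like the following; the installer filename is an example, as versions vary:

    # Switch to a text console with Ctrl+Alt+F1 and log in, then:
    sudo /etc/init.d/gdm stop                      # shut down the X server (and the GUI with it)
    sudo sh NVIDIA-Linux-x86-173.14.12-pkg1.run    # run NVIDIA's text-mode installer
    sudo /etc/init.d/gdm start                     # restart the GUI when finished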
Driver upgrades are an issue on both sides, because the installers are not completely capable of finding and eliminating older versions of the binary drivers. In one instance, for the NVIDIA drivers, we had to track down a rather sizable shell script that automatically deleted the old drivers before installing the new ones, as that was deemed the "right" way to install them. We had less of an issue with ATI's drivers, but to be fair, the primary card I used during my time with Ubuntu was the 8800GTX, so I can't confidently say there aren't other issues I simply never ran into.
The Ubuntu community does supply tools to help with GPU driver installations. One such tool is EnvyNG, which reduces the driver installation process to selecting what driver you want to install; it does the rest. This is a far easier way to install drivers – in the right situation it's even easier than driver installation under Windows. But it suffers from needing to have the latest driver data hardcoded into it, which means you can only use it to install drivers it knows about, and nothing newer. It's not regularly updated (as of this writing the latest driver versions it has are NVIDIA 173.14.12 and ATI Catalyst 8.6), so it's good for installing newer drivers, but not the newest drivers.
The other tool is access to Ubuntu's Personal Package Archives, which are a collection of user-built binaries that can be installed through the Ubuntu package manager (more on this later). PPAs are harder to use than EnvyNG, but anyone can build one, which makes updates more likely. As they're user-generated, however, there still won't always be the latest drivers available, which leaves us back at using ATI's and NVIDIA's installers.
As it stands, installing new GPU drivers on Ubuntu falls somewhere between annoying and unbearable, depending on how many hoops you need to jump through. It's certainly not easy.
The other problem with GPU drivers is that they do not always stay working. Among the issues we encountered was ATI’s driver failing to work after installing an Ubuntu update, and an NVIDIA driver that kept rebooting the system during testing for reasons we never determined (once we wiped the system, all was well).
Our final issue with the state of GPU drivers on Ubuntu is their overall quality. With a bit of digging we can come up with issues on both sides of the aisle, so it's not as if either side is clean here. But with that said, we only ended up experiencing issues with ATI's drivers. We encountered some oddities when moving windows that were eventually fixed in the Catalyst 9.3 drivers. It turns out that the problem was that ATI's drivers lacked support for redirected OpenGL rendering; the Linux site Phoronix has a great article on what this is, including videos, that explains the importance of this change.
Ultimately, we hate to sound like we're beating a dead horse here, but we can't ignore the GPU driver situation on Ubuntu (and really, Linux as a whole). The drivers have too many issues, and installing newer drivers to fix those issues is too hard. Things could be worse – Ubuntu could distribute driver updates only with OS updates, à la Apple – but they could also be better. For the moment it's the weakest point of Ubuntu when it comes to installing it on a high-end system.
The Package Manager – A Love/Hate Relationship
Out of every piece of software in Ubuntu, the package manager is the single most monumental and unique piece of the operating system. I can tell you about Evolution (Ubuntu's email client) or Totem (Ubuntu's media player), and even if you've never used these programs, it would be easy to relate them to things you likely have used. Trying to relate a package manager is a bit harder. The use of a package manager – and going further, completely relying on one – changes the OS experience entirely. Some of these changes are good and some are bad, driving what has become a love/hate relationship with apt, Ubuntu's package manager.
Rather than trying to explain what a package manager is, it would be easier to point out what already uses one. Package managers are more common than most people would think, as several systems use them without it manifesting in an obvious way. My iPhone runs a package manager – two in fact – one being the iTunes App Store and the other being apt (the same as Ubuntu) sitting underneath Cydia. Steam is also a package manager, taking care of its own little microcosm of games, mods, and SDKs. Most people have used a package manager without realizing it.
But none of them take it as far as Ubuntu. Steam only uses package management to install games, and the iPhone via apt takes it a little further to install a wider base of applications and frameworks, but neither integrates package management into the OS like Ubuntu does. Everything in Ubuntu is a package, starting with the kernel and moving on to drivers and applications. The ramifications of this are huge.
When you go to install an Ubuntu application, there is no need to track down an installer for an application, make sure it’s the latest version, make sure it’s not really a Trojan or virus-infected, etc. All of the applications bundled with an Ubuntu release sit on Ubuntu’s servers as a package. Finding software to install (if it didn’t already come on the CD) is as easy as firing up the Add/Remove Applications application, and looking for the application you’d like to install. And if you don’t know what you want to install? Ubuntu will tell you all about whatever application you’re looking at.
Once an application is installed, the package manager will keep track of that application. It can uninstall the application if you need to remove it, or make sure it’s up to date if at some point a newer version (such as a bug fix) is published. The package manager brings everything together.
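While most users will stick to the Add/Remove Applications GUI, the same operations are available from apt directly. As a quick sketch:

    apt-cache search "media player"   # search the repositories for candidate packages
    sudo apt-get install vlc          # download, verify, and install in one step
    sudo apt-get remove vlc           # cleanly uninstall it later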
From an application perspective it's little different than the iTunes App Store, but compared to what other OSes do it's a big deal. How many different applications install their own updater service? Even though Microsoft and Apple consolidate updating their own software into their respective software update systems, they can't do that for everyone else's applications. Chrome, Flash, Java, and the like all have updaters running in the background just to keep their respective applications up to date. And while these updaters are small compared to what they're tasked to monitor, they're nonetheless a waste of resources. Why do you need many applications to do the same job? On Ubuntu, you don't.
On Ubuntu, the package manager is also in charge of keeping the OS itself updated, which is where we see it significantly diverge from our earlier example of the iTunes App Store. Mixed in with application updates are updates to various system components, each one dutifully made into its own package. This makes it very easy for Ubuntu to distribute component updates as needed (rather than bundling them together as larger updates), but it's also a bit daunting – there are a lot of updates, even when starting from Ubuntu 8.04.3. Nevertheless, for the curious types, this allows you to see exactly what's being updated, and usually there's a note attached with a meaningful explanation as to why.
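Because the OS itself is just packages, keeping the whole system current – applications, drivers, and kernel alike – comes down to the same two commands:

    sudo apt-get update     # refresh the package lists from Ubuntu's servers
    sudo apt-get upgrade    # download and apply all pending updates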
Ubuntu's package manager is the most foolproof way to install and maintain software I've ever used on a computer. And that's why I love it.
The package manager is also the outlet for my frustrations with Ubuntu, for many of the same reasons. Everything in Ubuntu is a package. There are no drag-and-drop installs like in Mac OS X, and there are no MSI/NSIS/InstallShield installs like in Windows; there is only the package. The problem is that the package manager is an extremely self-limiting device when combined with Ubuntu's software distribution philosophy, as we mentioned earlier. Ubuntu isn't just distributing an OS on which you run programs; they're distributing the programs themselves, and it's all one stable platform.
You'll first discover how frustrating this can be when you decide that you would like a newer version of some piece of software than what Ubuntu offers. We'll take Wine for example, which develops at a rapid pace. If you want to be able to install the latest version of Wine, rather than version 1.0.1 that comes with Ubuntu, you'll need to follow these instructions, which involve adding new repository entries to apt, then downloading and importing an authentication key into apt so that it will trust the packages. Only then can you go back into the package manager and tell it to install the latest version of Wine.
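Condensed, the process looks something like the following; the repository URL and key file here are placeholders standing in for the ones in Wine's official instructions:

    # Add the third-party repository to apt's sources (URL is a placeholder)
    echo "deb http://example.com/apt hardy main" | sudo tee /etc/apt/sources.list.d/winehq.list
    # Import the repository's signing key so apt will trust its packages
    wget -q http://example.com/apt/signing-key.gpg -O- | sudo apt-key add -
    # Refresh the package lists and pull in the newer Wine
    sudo apt-get update && sudo apt-get install wine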
The Ubuntu project does offer a slightly simpler alternative through the Personal Package Archives, which are packages uploaded by users and hosted by the project. PPA repositories are a bit easier to add than a standard DEB repository, but the primary draw of PPAs is that additional software is made available as a package for easier upgrading and maintenance. However, since PPAs are maintained entirely by users, they're unreliable as a source of updates, and not everything is made available for Hardy.
As a result of all of this, the package manager makes this kind of software installation on Ubuntu a good deal harder than doing the same thing on Mac OS X or Windows. And if you want a piece of software that's neither the default Ubuntu version nor the latest version from another repository, good luck: the package manager is designed to make upgrading easy, not downgrading.
The package manager exists to the detriment of any other way to install software. Technically, software packages can be distributed outside of a repository, but in my own experience that seems very uncommon. Beyond that you have the shell script containing a binary blob (which may or may not be recognized and open correctly) and the more bearable-but-rare compressed folder. You are, for better or worse, stuck with the package manager in most cases.
This is why I hate the package manager. To the credit of its developers, it's more a flaw in the philosophy of Ubuntu than in the technology, but the package manager is the minion enforcing the harsh realities of that philosophy. It's here where the wheels start falling off of Ubuntu. It works well when you want to run software that Ubuntu provides in its main repositories – but only then. Installing any other software is at times a nightmare.
I’ll close out this section reflecting on the iTunes App Store one more time. In spite of being a package manager, I have no qualms with it. Apple doesn’t tie app versions with OS versions, so I can always grab the latest version. Meanwhile if I need an older version it’s not easy, but double-clicking on archived IPA files is still less troubling than trying to pull off something similar with Ubuntu.
True nirvana for software installation and updating lies between Ubuntu’s strict package manager, and Windows’ loose environment of installers. Apple found one solution, though certainly not the only one. Ubuntu would do well to find a similar way to meet in the middle. As much as I love a unified installer and updater, as done by Ubuntu it causes me more frustration than enjoyment. I consider the package manager to be the worst regular experience of Ubuntu.
UI & Usability
To put Ubuntu's GUI in the context of existing operating systems, I'd lump it in with Windows XP. If you can use Windows XP, then you're going to be right at home with Ubuntu. The window layouts are similar, the buttons are the same, and many of the shortcut key combinations are the same. Whether it's intentional or not I can't say, but given the similarities, the transition to Ubuntu from Windows is a very easy one.
But there are some important differences between Ubuntu and XP, and they start to make themselves apparent almost immediately. The taskbar and its conjoined twin the start menu (the Menu Bar in Ubuntu) have been separated – the taskbar gets the bottom of the screen and the menu bar gets the top. Because the menu bar is always visible by default, it looks close to Mac OS X; but because applications do not share the menu bar like they do in Mac OS X, it's functionally much more like XP. Joining it up top are the Ubuntu equivalents of the quick launch toolbar and the system tray. This leaves the taskbar at the bottom, containing running applications along with the controls for Ubuntu's virtual desktops implementation.
This is something I find works quite well on narrow screens, but is a wash on larger screens and widescreens. By putting the menu bar and the taskbar on different physical bars, it leaves more space for active applications in the taskbar while not forcing the menu bar to be compacted. Depending on how cluttered your complete taskbar may have been under Windows, this can buy you enough space to comfortably fit another couple of active applications, which may not be much but can make all the difference in some situations. The cost of this, however, is that you lose additional vertical real estate compared to having everything on one bar. Hiding the bars can get this space back, but it's been my experience that most people hate auto-hiding bars, which may very well be why no OS has them auto-hiding by default.
At first glance, the menu bar is just different enough from XP's start menu to throw some people for a loop. The contents of the start menu have been broken up a bit: Applications is Windows' All Programs, Places is My Recent Documents, and System is Control Panels. Coming from Windows, the two biggest changes are that most applications are organized by functionality rather than each application getting its own subfolder in the Applications menu, and that what would be found in Control Panels is now split between the Preferences and Administration submenus under System, based on whether it adjusts a per-user preference or a system preference (and hence would need administrative access).
Nautilus, the Ubuntu file manager, really drives home the idea that Ubuntu works like Windows. It takes the "file manager is a web browser" concept just as far as Windows ever did, which isn't necessarily a good thing given how old (not to mention dead) the concept is, making Nautilus feel a bit dated. Beyond that, there's little that can be said that differentiates it from Windows XP's Explorer.
Multitasking is also handled in a very XP-like fashion. Beyond the taskbar, alt-tab switches among applications just as it does on Windows (or cmd-tab on Mac OS X). Notably, Ubuntu has copied some of the more interesting quirks of both Windows Vista and Mac OS X. From Windows Vista it inherits the ability to see the contents of a window when alt-tabbing, and from Mac OS X it inherits the ability to close an inactive window without needing to focus on it, allowing you to keep focus on whatever you're working on.
Ubuntu also has one more trick up its sleeve when it comes to multitasking, and that’s virtual desktops. Virtual desktops, or workspaces as they’re called in Ubuntu, allow for the creation of multiple workspaces in a single user session, such that different windows can be in different workspaces, completely hidden when that workspace is not active. It’s been a feature of various *nix operating systems for ages, and Apple added this feature as Spaces in 10.5 Leopard. Windows has no built-in equivalent.
I've tried using this method before as Spaces, and again on Ubuntu with its workspaces, and I fully admit I don't "get it." The idea of being able to move a window completely out of your way by keeping it in another workspace makes sense, but ultimately I find that I have to go chase down a window I need when it's off in another workspace. I know there are plenty of people out there who can make good use of workspaces, so it may well just be a personal flaw. It's a neat concept; I just haven't been able to make it work for me.
Moving on, one thing I find that Ubuntu does well is that it better bridges the look of the OS with and without eye-candy. Windows Vista does a very poor job of this, and it's immediately obvious whether Aero is running or not. The style choices for Vista clearly were based on Aero, so if for any reason Aero is disabled, you get the 2D-only Vista Basic UI that poorly compensates for the lack of transparency. Ubuntu on the other hand looks nearly identical in static screens; only the lack of subtle window shadows gives away when Ubuntu is running without visual effects (Ubuntu's name for 3D-accelerated desktop compositing). Most people will never run Ubuntu with desktop compositing disabled, just as most people will never run Windows Vista with Aero disabled; nevertheless, this is one of those subtle design choices that impressed me.
An example of Ubuntu's hardware compositing. Hardware composited on the left, software on the right.
With desktop compositing enabled the experience is similar to that of Windows Vista or Mac OS X. Windows fade out of view, shrink & grow, etc. just as they do in the other two. I feel like I should be writing more here, but there's just not a lot to say; it's the same desktop compositing everyone else has, including the UI tricks that serve to accelerate user interaction. The one thing in particular that did catch my eye, however, is that Ubuntu includes a UI feature called Scale that is virtually identical to Mac OS X's Exposé. As a self-proclaimed Exposé junkie I find this most welcome, as it's my preferred way to multitask with a large number of windows. There have been a couple of times, as a result, where I have found my workflow under Ubuntu to be smoother than under Vista, though Mac OS X still surpasses both.
However I'm much less enthusiastic about the icons Ubuntu uses, and there's one element in particular that nearly drives me insane: executables/binaries don't even have icons. In Windows, executables can be packed with resources such as icons, and in Mac OS X, app bundles contain icon files that are used to give the bundle an icon. On Ubuntu, however, executables don't have their own icons. Ubuntu can assign custom icons to anything, but apparently this is remembered by the file manager rather than actually attached to the file. By default, the only things with custom icons are the Launchers (a type of shortcut) that Ubuntu automatically creates for installed applications. Everything else is either issued a default icon for its type, or in the case of certain media types (e.g. images), a thumbnail.
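For the curious, a Launcher is just a small text file in the freedesktop.org .desktop format, and the icon lives in the launcher rather than in the binary itself. A minimal sketch, with made-up paths:

    [Desktop Entry]
    Type=Application
    Name=RainSlickEp2
    # The icon is referenced here, not embedded in the executable (paths are made up)
    Icon=/opt/rainslick2/icon.png
    Exec=/opt/rainslick2/RainSlickEp2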
In an ideal world this isn't an issue, because everything is installed to the system and has its own Launcher somewhere in the menu bar, but with software that doesn't directly install (such as programs distributed in compressed folders) this isn't handled automatically. In place of an application-specific icon, executables get a generic executable icon, which worse yet is shared by more than just executables. As an example of this we have a screenshot of the folder for the demo of Penny Arcade Adventures: Episode 2. Can you figure out which item launches the game?
The right answer, the document-like item called RainSlickEp2 (which is actually a shell script), is completely non-obvious. If this were Windows or Mac OS X, there would be an appropriate custom icon on the right item. Meanwhile not only are we lacking a custom icon, but the binary icon is used directly in 3 different places, and as an overlay on top of a document icon in a 4th place, while only 1 item is even an executable binary. And while I had hoped this was an issue with just this game, it extends to everything else; even Firefox's actual executable lacks an icon. As it turns out, the Linux executable format, ELF, has no ability to contain icons.
I hate to harp on this issue, but I am absolutely dumbfounded by it. Usability goes straight down the tubes the moment you need to use non-packaged software because of this – and because the DEB package format is not a Linux-wide standard, there's a lot of software like that. On a GUI, there need to be graphical elements to work with.
On the flip side, I find it interesting that Ubuntu has icons in certain places where Windows and Mac OS X do not. Action buttons such as "open" and "close" have icons embedded in them, while the other two OSs have always left these buttons sparser, containing just the text. The upshot is that with icons in your buttons, you don't necessarily need to be able to read the text to use the OS, so long as you understand the meaning of the icons. It's easily the most drastic difference between the Ubuntu and Windows/Mac OS X GUIs that I have noticed. But at the same time, it's so different that even after a year I still don't know quite what to make of it – it often results in big, silly buttons when something smaller would do. The jury is still out on whether this is a good difference or not.
I would also like to touch on the directory structure of Ubuntu, as it falls under the nebulous umbrella of usability once you have to start traversing it. Because Linux is a spiritual successor to the ancient Unix systems of years past, it has kept the Unix directory structure. This is something I believe to be a poor idea.
I don’t believe I’ve ever seen a perfect directory structure on an operating system, but there are some that are better than others. As an example of this, here’s a list of some of the more important Linux root directories: bin, boot, dev, etc, home, mnt, opt, sbin, usr, and var. And if this were Windows Vista: Boot, Program Files, Program Data, Users, and Windows.
The problem I have with the Ubuntu directory structure is that the locations of very few things are obvious. Firefox, for example, is in /usr/lib/firefox, while on Windows it would be in /Program Files/Firefox. Why /usr/lib/? I have no idea. There's a logical reason for that placement, but there's absolutely nothing intuitive about it. Microsoft is no saint here (how many things are in /Windows and /Windows/System32?) but at least the location of user-installed programs is completely and utterly obvious: Program Files. And on Mac OS X it's even easier: /Applications. This all adheres to a standard, the Filesystem Hierarchy Standard, but that just means the standard is equally confusing.
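The practical workaround is to ask the package manager instead of guessing. A few commands that help, with the caveat that package names can differ from what you expect (on Hardy the Firefox package actually goes by firefox-3.0):

    which firefox               # where the executable lives on the PATH
    dpkg -L firefox             # every file the firefox package installed
    dpkg -S /usr/bin/firefox    # the reverse: which package owns this file?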
Thankfully, and to be fair, there's little reason to be going through the entire contents of the OS partition looking for something, but if you ever need to do so, it can be a frustrating experience. Ubuntu would benefit greatly from a more intuitive structure, something that I'm convinced is possible given that Apple has pulled this off with Darwin, which also has the *nix directory structure but avoids it as much as possible. I'd also like to see user data kept in /users like Windows and Mac OS X rather than /home, but Rome wasn't built in a day… There is much room for improvement here.
Wrapping things up, when I first started with Ubuntu I did not have very high expectations as far as usability was concerned. I expected Ubuntu to be functional, but not necessarily exceptional – GUI design is an ugly and hard job, so just how good could it be on a free OS? While I can't sing the same high praises for Ubuntu's GUI and usability that I do for Mac OS X's, it surpassed my initial expectations. Other than the icon issue, there are no glaring flaws in Ubuntu's GUI or the usability thereof. It's not a revolutionary or even evolutionary GUI, but it does come off as a very solid facsimile of Windows XP with a few unique quirks and the eye-candy of Vista and Mac OS X thrown in, and that's something I'm satisfied with. A satisfactory GUI is not a bad thing; it's quite an accomplishment given just how difficult GUI design is.
As an aside, I'm not a big fan of the default orange/brown color scheme for Hardy. It can be changed easily enough, although I've always thought they could do better for a default scheme. I hear 9.10 may finally do away with orange, so we'll see what we get in October.
Installation
In terms of difficulty, right up there with making a good GUI is making a good installer. History is riddled with bad OS installers, with pre-Vista Windows being the most well-known example. Text mode installers running on severely castrated operating systems reigned for far too long. Microsoft of course improved this with Windows Vista in 2006, but even as late as the end of 2007 they were still releasing new operating systems such as Windows Home Server that used a partial text mode installer.
The reason I bring this up is that good OS installers are still a relatively recent development in the PC space, which is all the more reason I am very impressed with Ubuntu’s installer. It’s the opposite of the above, and more.
Right now Ubuntu is the underdog in a Windows-dominated world, and its installation & distribution strategies have thusly been based on this. It's undoubtedly a smart choice, because if Ubuntu wiped out Windows the way Windows wipes out Ubuntu, it would be nigh impossible to get anyone to try it out, since "try out" and "make it so you can't boot Windows" are mutually exclusive. Ubuntu plays its position very well in a few different ways.
First and foremost, the Ubuntu installation CD is not just an installer, but a live CD. It's a fully bootable and usable copy of Ubuntu that runs off of the CD and does not require any kind of installation. The limitations of this are obvious, since you can't install additional software and CD access times are more than an order of magnitude above those of a hard drive, but nevertheless it enables you to give Ubuntu a cursory glance to see how it works without needing to install anything. Live CDs aren't anything new for Linux as a whole, but it bears mentioning: it's an excellent strategy for letting people try out the OS.
This also gives Ubuntu a backdoor into Windows users' computers, because as a complete CD-bootable OS, it can be used to recover trashed Windows installations when the Windows recovery agent can't get the job done. It can read NTFS drives out of the box, allowing users to back up anything they can read to another drive, such as a USB flash drive. It also has a pretty good graphical partition editor, GParted, for when worst comes to worst and it's time to start formatting. The Ubuntu Live CD is not a complete recovery kit in and of itself (e.g. it can't clean malware infections, so it's more of a tool of last resort), but it's a tool that has a purpose and serves it well.
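If a trashed drive doesn't mount automatically from the Live CD desktop, it can be done by hand. A sketch, assuming the Windows partition is /dev/sda1 (your device name will vary):

    # Mount the NTFS partition with the NTFS-3G driver, then copy files off
    sudo mkdir -p /mnt/windows
    sudo mount -t ntfs-3g /dev/sda1 /mnt/windows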
Better yet, once you decide that you want to try an installed version of Ubuntu, but don't want to take the plunge of messing with partitions, Ubuntu has a solution for that too. Wubi, the Windows-based Ubuntu Installer, allows you to install Ubuntu as a flat file on an existing NTFS partition. Ubuntu can then boot off of the flat file, having never touched a partition or the master boot record (instead inserting an Ubuntu entry into the Windows BCD). This brings all the advantages of moving up from a Live CD to an installed version of Ubuntu, but without the system changes and absolute commitment a full install entails. Wubi installations are also easily removable, which further drives home this point.
Now the catch with a Wubi installation is that it's meant to be a halfway house between a Live CD and a full installation, and it's not necessarily meant for full-time use. As a flat file inside of an NTFS partition, there are performance issues stemming from the NTFS-3G driver being slower than raw hard drive access, along with both external fragmentation of the flat file and internal fragmentation inside of it. An unclean shutdown also runs the slight risk of introducing corruption into the flat file or the NTFS file system, something the Wubi documentation makes sure to point out. As such Wubi is a great way to try out Ubuntu, but a poor way to continue using it.
Finally, once you've decided to go the full distance, there's the complete Ubuntu installation procedure. As we've previously mentioned, Ubuntu is a live CD, so installing Ubuntu first entails booting up the live CD – this is in our experience a bit slower than booting a pared-down installation-only OS environment such as Vista's Windows PE. It should be noted that although you can use GParted at this point to make space to install Ubuntu, this is something better left in the hands of Windows and its own partition-shrinking ability, due to one gotcha: Windows can move files around to make space, while GParted can't.
Once the installation procedure starts, it's just 6 steps to install the OS: Language, Time Zone, Keyboard Layout, Installation Location, and the credentials for the initial account. Notably the installation procedure calls for 7 steps, but I've only ever encountered 6; step 6 is always skipped. This puts it somewhere behind Mac OS X (which amounts to picking a partition and installing; credentials are handled later) and well ahead of Windows, since you don't need a damn key.
The only thing about the Ubuntu installation procedure that ruffles my feathers is that it doesn't do a very good job of simplifying the installation when you want to install to a specific partition that isn't the only empty one. This is an artifact of how Linux handles its swap space – while Windows and Mac OS X create a swapfile on the same partition as the OS, Linux keeps its swap on a separate partition. There are some good reasons for doing this, such as preventing fragmentation of the swap space and always being able to place it after the OS (which puts it further out on the disk, for higher transfer rates), but the cost is ease of installation. Ubuntu's easy installation modes are for when you want to install to a whole drive (wiping its contents in the process) or to the largest empty chunk of unpartitioned space. Otherwise, you must play with GParted as part of the installation procedure.
This means the most efficient way to install Ubuntu, if you aren't installing to an entire disk and don't already have a single free chunk of space that's the largest, is to play with partitions ahead of time so that the area you wish to install to is the largest free area. It's a roundabout way to install Ubuntu, and can be particularly inconvenient if you're setting up a fresh computer and intend to do more than just dual boot.
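To illustrate what "ahead of time" looks like, a dual-boot disk might be arranged so that the free space is the largest area, letting Ubuntu's easy mode take over from there. Device names and sizes below are arbitrary:

    /dev/sda1   NTFS         100GB   Windows
    /dev/sda2   NTFS          50GB   shared data
    (unpartitioned)           60GB   left free for Ubuntu to claim as / and swap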
Once all of the steps are completed, Ubuntu begins installing and is over in a few minutes. Upon completion Ubuntu installs its bootloader of choice, GRUB, and quickly searches for other OS installations (primarily Windows), adding those entries to the GRUB bootloader menu. When this is done, the customary reboot occurs, and when the system comes back up you're faced with the GRUB boot menu – you're ready to use Ubuntu. Ubuntu doesn't treat its first boot as anything special, and there are no welcome or registration screens to deal with (I'm looking at you, Apple). It boots up, and you can begin using it immediately. It's refreshing, to say the least.
The actual amount of time required to install Ubuntu is only on the order of a few minutes, thanks in large part to its dainty size. Ubuntu comes on a completely filled CD, weighing in at 700MB, while Windows Vista is on a DVD-5 at over 3GB, and Mac OS X is on a whopping DVD-9 at nearly 8GB. It's fast to download (not that you can download Windows/Mac OS X) and fast to install.
We'll get to the applications in-depth in a bit, but I'd like to quickly touch on the default installation of Ubuntu. Inside that 700MB are not only the core components of the OS and a web browser, but the complete OpenOffice suite and the Evolution email client too. You can literally install Ubuntu and do most common tasks without ever needing to install anything else beyond security and application updates. Consider the amount of time it takes to install Microsoft Office on a Windows machine or a Mac, and that's that much more time saved. Canonical is getting the most out of the 700MB a CD can hold.
Applications: Web Browsing
Windows Default: Internet Explorer 7
What I use: Firefox 3
Ubuntu Default: Firefox 3
Much to the chagrin of Microsoft, the web browser is turning into a miniature OS of its own, and in the case of anything that's not Internet Explorer, it's a miniature OS that has no allegiance to a real operating system. It's the primary way to retrieve most information from the internet, applications can be created through AJAX and Flash, and video can be watched (see: Hulu). A good cross-platform web browser removes a great deal of the need to use any specific OS, and this is something that works in Ubuntu's favor.
Ubuntu ships with Firefox 3, Internet Explorer's loyal opposition and currently the #2 browser on the market. So long as a site isn't built for IE6, Firefox has great compatibility, good speed, and an army of extensions to add features to it. Since many of you already use it, there's not a lot to say here: it's a very solid browser, and one I find superior to Internet Explorer.
As I already use Firefox under Windows, the transition here was virtually non-existent. Ubuntu doesn’t have any direct Windows to Ubuntu transition tools, but after moving my Firefox profile from Windows to Ubuntu and reconfiguring a few location-sensitive settings, I was up and going. Internet Explorer users are going to have more of a transition obviously, but it’s not much. All of the major browsers’ core behaviors are the same, which makes it easy to switch among them with little fuss.
At the risk of marginalizing the rest of Ubuntu, I consider Firefox to be one of the core components that makes Ubuntu a success story. Because so much computer use these days happens inside a browser, the browser has become a lynchpin of a good OS. If your browser is bad, it hurts the usability of your OS, since users can't do things they could regularly do in a browser elsewhere. One only needs to look at the early versions of Mac OS X to get a good picture of this, as it shipped with the only-bearable Internet Explorer 5.
There are however a few caveats that I’d like to hit on. Something that continues to throw me for a loop is that while it’s the same Firefox I use under Windows and Mac OS X, it doesn’t necessarily look the same. The rendering engine is the same, but OS differences start to play out here. Mac OS X, Windows, and Ubuntu all render text slightly differently, and in the case of Ubuntu come with a significantly different font set. Because Firefox is at the mercy of the OS for fonts, what you get are small but noticeable differences in how the same page looks.
Firefox with default fonts
Firefox with MS Core fonts
Firefox under Windows
Above we have AnandTech rendered in Firefox 3 on Windows and Ubuntu. On Windows Firefox uses Times New Roman and Arial for its default fonts, but these fonts do not exist on Ubuntu; rather, Ubuntu uses what are called "serif" and "sans-serif". This, along with how the two OSs differ in font anti-aliasing, results in the different look of Firefox under Ubuntu. Having used Windows for a number of years, I have never gotten past the idea of Ubuntu looking "wrong", even though the right look is entirely subjective.
Ultimately I ended up adding the missing fonts by installing the msttcorefonts package, which contains Times New Roman, Arial, and the other “big name” standard fonts. With those installed and Firefox configured to use them, text looks much closer, although not quite the same. It’s a shame that Ubuntu can’t include these fonts by default.
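For reference, pulling the fonts in is a single command:

    sudo apt-get install msttcorefonts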
The second caveat is one of performance. When using Javascript-heavy sites in particular, Firefox on Ubuntu seems just a bit slower than under Windows. I had never been able to figure out why until I saw this Slashdot article. Firefox for Linux is not compiled with profile-guided optimization, a method of improving the performance of binaries by looking at how they're used; while Ubuntu compiles its own releases of Firefox, it skips this optimization too. As a result there's a speed difference in Firefox – it's the same code, but the Windows version is compiled in such a way that it's faster. As I wrote at the start of this article, I'm not concerned with the performance of Ubuntu or its applications for the most part, and this falls under that notion. Firefox is slower, but not to the point that I care. It's interesting enough that it bears mentioning, however.
Just to give you an idea of what the speed difference is, here’s a breakout of one of our Firefox benchmarks from the benchmarking section later in this article:
As you can see, in this Javascript-heavy test Firefox on Ubuntu is upwards of 17% slower than it is under Windows. This performance gap manifests itself largely in Javascript-heavy situations; regular browsing doesn't show nearly the same difference. Flash is also slower, but this has nothing to do with Firefox and everything to do with Flash's mediocre performance under any OS that isn't Windows.
The last caveat is how Ubuntu's distribution model becomes strained when it comes to Firefox. Ubuntu Hardy shipped nearly 2 months before Firefox 3 did. But because Ubuntu is meant to be a stable platform, they still needed to package Firefox 3 with the OS, so Firefox 3 Beta 5 was included. If we had done this article a month after Hardy launched, as originally intended, I'd have had few nice things to say: Firefox 3 Beta 5 combined with Adobe Flash 9 was buggy, unstable junk. Canonical made the right decision, as the final version of Firefox 3 turned out well, but it highlights the pitfalls of including 3rd party software with the OS.
The flip side of this caveat is that Firefox 3.5.x has superseded 3.0.x as the newest Firefox branch, which means that only 3.0.x versions are being pushed out to Hardy. This means if you want to take advantage of any of Firefox’s newest features such as the new javascript engine, you’ll need to install a 3.5.x build separately, ideally through a PPA package so that it cleanly replaces the default version of Firefox.
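Since Hardy predates the one-command PPA tools of later Ubuntu releases, this again means adding a repository line by hand. A sketch, with the PPA address and package name standing in for whichever PPA you settle on:

    # Add the PPA's Hardy repository (address is a placeholder)
    sudo sh -c 'echo "deb http://ppa.launchpad.net/example/ubuntu hardy main" >> /etc/apt/sources.list'
    sudo apt-get update
    sudo apt-get install firefox-3.5    # package name varies by PPA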
But even with those caveats, none of them are serious issues. Firefox 3 is still a fantastic browser and there’s nothing else I’d rather have on Ubuntu.
Final Verdict: Meets My Needs
Applications: Communication - Email & Instant Messaging
Windows Default: Windows Mail/Outlook
What I Use: Outlook
Ubuntu Default: Evolution
Separate communication suites are a bit of a dying breed these days, largely due to the aforementioned rise of the web browser. Thanks to services like Gmail, web-based email has put a massive dent in the need to use an email client, and new services are popping up that are starting to do the same for instant messaging. But they're not dead yet, and more importantly I'm too old-fashioned to give up my dedicated email and instant messaging clients, so this is a matter I consider important.
The default email client on Ubuntu is Evolution, a clone of Outlook. As Outlook is my default email client under Windows anyhow, this worked out quite well for me. Because Evolution is an Outlook clone, it features not just email, but contact lists and calendaring too, supplanting the need for separate applications for those under Ubuntu.
At the same time, because Evolution is an Outlook clone, there's not a lot I can say about it – it's a clone; there's not much unique to it. What it is, however, is a good clone when it comes to my needs. Credit is due where deserved in cloning the monster that is Outlook, because Evolution handled email, contacts, and calendaring for me just as well as Outlook does.
The only notable issue I had with Evolution is that it does not have a way to import Outlook PST files. It’s possible to do it, but it involves using Mozilla’s Thunderbird email client as an import/export mechanism. To be clear I’m not faulting Evolution here since PST is a closed Microsoft format.
Users coming from Windows Mail will be a bit less at home, but at the same time Evolution is likely an improvement for them for all the same reasons that Outlook is a better client than (and the de-facto Windows email client in place of) Windows Mail. Perhaps a more direct benefit is that since Evolution is pre-installed with the base OS installation, you don’t need to go hunt down a real email application after the OS installation. Never underestimate the annoyance of having to install more software.
If this description seems short, it's not for lack of effort or a dislike of Evolution. In fact I'm plenty happy with it, but as I use it, it's just Outlook with a different GUI. So far as I'm concerned this is a good – if unexciting – thing when coming from Windows and Microsoft Office.
Final Verdict: Meets My Needs
Windows Default: None/Windows Live Messenger
What I use: Trillian
Ubuntu Default: Pidgin
Somewhere along the way to Vista, Microsoft decided to decouple some applications from the OS, and MSN Messenger was one of them. As a result Vista does not come with an instant messaging client of any kind; rather it comes with a link to go download the latest version of Windows Live Messenger. Not that it would necessarily be of much use: the last time I saw any statistics on instant messaging network usage, the vast majority of North American users were on AOL's AIM network. In this case Windows may as well not have an official instant messaging client, because unless you use the MSN network (or Yahoo network) there's no practical difference.
So it’s a nice change of pace when we note that Ubuntu comes with a multi-protocol instant messaging client as part of the base OS install. Pidgin (née GAIM) supports AIM, Yahoo, ICQ, MSN, and a boatload of smaller networks, thoroughly eliminating any possible problem of not being able to connect to your network of choice. Like Firefox, Pidgin is another significant multi-platform application, and is found on Windows and Mac OS X too.
Moving on to features, Pidgin hits all of the checkboxes as far as requisite features are concerned. Buddy lists, chat logging, file transfers, emoticons, end-to-end encryption, etc. are all supported. What helps to set Pidgin apart from other clients – once again drawing a parallel to Firefox – is its support for plugins. Plugins aren't new as far as instant messaging clients are concerned, but many clients don't support them.
Pidgin comes with 30 such plugins, ranging from tools to integrate Pidgin with Evolution, to support for mouse gestures. Some of these are standard features in other IM clients, so not all 30 plugins are unique by any means. I'm also going to throw built-in spell-checking in this category – not new, but sorely missed in a lot of clients.
Coming from Trillian, Pidgin is effectively a drop-in replacement. The two don't have feature parity (Trillian has more features, specifically audio/video chat), but as far as I use either client, I don't use anything that makes the two notably different once a few plugins are installed. Much like Evolution there are undoubtedly some missing features once you dig deeper that would concern hardcore users, but nothing that rears its head for me.
Final Verdict: Meets My Needs
Applications: Audio Organization/Playback
Windows Default: Windows Media Player
What I use: iTunes/Winamp
Ubuntu Default: Rhythmbox/Totem
There are 3 things people will never agree on in this world: Politics, the Yankees versus the Red Sox, and what multimedia player to use. It doesn’t take much effort to find someone who hates any given player and has their own idea of what the best player is, so looking at the players included with an OS is somewhat academic. No matter what OS it is, a number of users are going to replace the default with something else. So for our discussion on multimedia playback, I’m going to preface this with a thought: the Ubuntu defaults aren’t the only options, there are other programs out there if the defaults aren’t satisfactory.
With that said, when it comes to audio organization and playback Ubuntu comes with two programs: Rhythmbox and Totem. Rhythmbox is Ubuntu's dedicated audio organization and playback suite – analogous to iTunes – while Totem is a combined audio/video player, similar to VLC or the classic versions of Windows Media Player. In spite of the fact that Rhythmbox is the dedicated audio suite, I mention both of these since Ubuntu will in fact use both. Attempting to open an audio file from the file browser will default to Totem, while Ubuntu's application menu calls Totem "Movie Player", leaving the "Music Player" distinction to Rhythmbox. As a result Ubuntu is a bit schizophrenic about its audio software – it's inconsistent throughout the OS.
Since Totem is an audio/video player, we'll save it for our Video section and focus on Rhythmbox. As I mentioned previously, Rhythmbox is analogous to iTunes; even the manual specifically mentions that the program was "originally inspired by Apple's iTunes." In fact there's not a lot to be said about Rhythmbox: it looks mostly like iTunes, it acts mostly like iTunes, and it does most of what iTunes does. Consider it iTunes-lite, and that's Rhythmbox in a nutshell.
As iTunes-lite, Rhythmbox carries both the benefits and the downsides of such a design. Chief among the benefits is the fact that Rhythmbox isn't nearly as bloated as iTunes can be: it gives you the basic iTunes experience while eating less than half the memory and loading in less than half the time it takes iTunes to load. iTunes may have a lot of features, but you're paying for them somewhere. For most people the complete iTunes feature set is overkill, and they would be better served by a lighter program like Rhythmbox.
The price of that lightness, however, is the feature set that Rhythmbox doesn't implement. Among other things it lacks its own ability to extract audio from CDs, instead relying on another Ubuntu program, Audio CD Extractor (Sound Juicer), to accomplish this. Similarly, it lacks the ability to quickly encode existing songs into another format. Last, for purchasing music it doesn't have access to a full-featured store – the included interfaces are for Magnatune and Jamendo, which are best described as indie stores. Purchasers looking for mainstream music would be limited to Amazon's store, which is accessed through a regular web interface and may be a curse or a blessing depending on how much you like the iTunes Music Store being integrated into iTunes.
Rhythmbox does have the ability to synchronize music with portable media players, however since Apple actively blocks their iPhoneOS based devices from syncing with anything besides iTunes, Rhythmbox can’t actually sync with the portable media players most people have. This meant that I was unable to sync my iPhone with Rhythmbox, and had to dual boot instead. We don’t have a legacy iPod on hand, but it sounds like the latest Classic/Nano models won’t work either. Users with legacy iPods would need to seek out something like GTKPod, which is designed specifically for iPod synchronization and should do the job.
Ultimately the usefulness of Rhythmbox depends on how well you know iTunes and how many of its deeper features you use. For basic music organization and playback it does just fine – you may as well be using iTunes. But power users will probably be unsatisfied. Meanwhile Windows Media Player users will find it a tossup; it still has fewer features than WMP, but WMP has always needed to take a hint or two from iTunes when it comes to layout.
Final Verdict: Satisfactory/Only Meets Some of My Needs
Applications: Video Playback
Windows Default: Windows Media Player
What I use: Media Player Classic – Home Cinema
Ubuntu Default: Totem (Used: VLC)
Moving on to video, we have Totem, Ubuntu’s other media player. As we previously mentioned it’s already the default for audio files opened via the file browser, and along with that it’s also Ubuntu’s only video player. In concept it’s close to VLC or Media Player Classic, as it’s a solitary program that has a single window to play whatever the currently opened file is.
The single biggest strength of Totem is that once the restricted codec pack is installed, it can play anything and everything under the sun. MP3, AAC, MKV, H.264, MPEG-4 ASP, FLAC, and more are all available. This makes both Mac OS X and Windows Vista pale in comparison – the former can play about half of that, the latter even less. Codec hell has always been a nuisance under Windows and Mac OS X, but Ubuntu gets things right and avoids it altogether. I really can’t overstate this; from a fresh install it’s much, much easier to play media out of the box with Totem on Ubuntu than it is any other OS. This is the experience everyone else should be shooting for.
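Getting there is painless, too. If memory serves, on Hardy the whole collection of restricted codecs (along with Flash and the Microsoft fonts mentioned earlier) hangs off a single metapackage:

    sudo apt-get install ubuntu-restricted-extras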
The key to Totem’s ease of use stems from the fact that the restricted codec pack includes the FFmpeg project’s libavcodec library of audio/video codecs. As the project seeks to offer playback support for every significant codec in existence, this gives Totem a clear advantage over Windows and Mac OS X, neither of which use libavcodec. This does mean, however, that Totem is not unique. Its playback abilities can be found in any other application that implements libavcodec, such as Media Player Classic, MPlayer, VLC, and others. As such the real magic is that Totem is the only default media player to include these abilities, rather than that it’s a completely superior media player.
As it stands there are two big kinks in Totem. The first of which is that it’s an extremely simple media player that lacks any kind of advanced features. It offers a single deinterlacing mode, no control over post-processing, and no audio/video filters. As such advanced users are going to find it unsatisfactory, and accordingly it’s one of the only default Ubuntu programs I specifically replaced when using Ubuntu. Instead I ended up using VLC, which has the advanced features I was looking for and I was already familiar with it since it’s a cross-platform media player.
The other kink in Totem is that it’s only as good as libavcodec, which in turn is only as good as the version of libavcodec that came with Hardy due to Ubuntu’s software update policy. As it stands the version of libavcodec that comes with Hardy has issues playing back a small number of Windows Media Video files, something which newer versions correct.
Furthermore it suffers from libavcodec's continuing weakness: H.264 playback. Only the single-threaded H.264 decoder is considered stable, and as such all libavcodec players using it will run into problems when decoding high-bitrate material – our 30Mbps test clip won't play back correctly under Totem or VLC 1.0.1, for example. There is a multithreaded H.264 decoder available for libavcodec, but as it's not stable (on players I have that include it, it crashes from time to time) it's not suitable for general distribution. All of this is compounded by the fact that there's no other H.264 decoder that can be installed on Ubuntu (e.g. CoreAVC), which means Ubuntu is limited to the best that libavcodec can do. For this reason none of the regular Ubuntu media players are well suited for material such as full-quality Blu-ray rips.
Now we have yet to touch on hardware accelerated playback, which is something we're going to hold off on until we take a look at Ubuntu 9.04. Linux does not have a common media framework like Windows and Mac OS X have in DirectShow/DXVA and QuickTime respectively. Rather, the desktop environment that Ubuntu is based on (GNOME) includes a lesser framework called GStreamer, which is closer to a basic collection of codecs and an interface to them. As such, hardware accelerated playback is not as easy to do under Ubuntu as it is under Windows and Mac OS X. We'll take a look at the APIs and the software for this in our look at Ubuntu 9.04.
But so long as you don’t need hardware accelerated playback, then Totem or another libavcodec based player will do the job nicely. Compared to the other applications in Ubuntu, I would put Totem/VLC up there with Firefox in terms of being a jewel of the OS. Like Firefox they may not be OS-exclusive applications that can be used to drive users towards Ubuntu, but they help solidify Ubuntu by giving it the ability to do a common task just as well as (or better than) any other operating system. At least until Windows 7 hits the shelves, no one has a better default media player.
Final Verdict: Meets My Needs
Applications: CD Burning/Image Editing
Windows Default: Drag & Drop
What I use: Nero
Ubuntu Default: Drag & Drop / Brasero
One of my minor annoyances with Mac OS X and Windows is that their default disc burning abilities are insubstantial. Both offer drag-and-drop file burning and audio CD burning through their respective audio suites, and in Mac OS X's case ISO burning too, but that's it. As a long-time Nero user, I would rather have the finer level of control a disc authoring suite such as Nero or Toast offers when it comes to building and burning discs.
As it turns out, this is something that Ubuntu gets right. Not only does it include drag-and-drop burning abilities like the other OSes, but it includes a disc authoring suite: Brasero. In a nutshell, Brasero is a clone of Nero Burning ROM, much like Rhythmbox is a clone of iTunes. It sports UI elements similar to Nero's, including the handy disc capacity meter toward the bottom of the window. As such, for anyone used to Nero it's an easy transition to make.
Where Nero users will feel left out is that it’s only a clone of Nero Burning ROM, and little else. It can burn audio CDs, data discs, ISOs, and copy whole discs, but that’s it. It doesn’t have any audio/video mastering abilities like Nero does. In fact there’s nothing on the default Ubuntu install like this – Windows Movie Maker and iMovie have no direct counterpart on Ubuntu. This makes Ubuntu more capable than Windows or Mac OS X for data CDs, but underpowered for most kinds of media disc creation. Depending on how you use Ubuntu, this may or may not be an issue.
Meanwhile for users accustomed to drag-and-drop burning, you’ll find the drag-and-drop CD/DVD Creator satisfactory. As CD/DVD Creator doesn’t support packet writing, you’re limited to traditional building & burning via the GUI. CD/DVD Creator doesn’t support writing multisession discs either, so it’s entirely a write-once operation. Whether this is a problem or not depends on if you use packet writing – a quick check around here didn’t turn up anyone that uses it, so I’m not sure there are all that many people that are going to miss it.
For power users there are other options. The Ubuntu repositories contain other disc authoring suites, and a cut-down version of Nero is available too. We haven’t had a chance to check out either of these, but it looks like neither option provides a solid audio/video authoring package. Anyone needing such abilities may need to look elsewhere. For daily use however, it has everything I need.
Final Verdict: Meets My Needs
Windows Default: Paint
What I use: Adobe Photoshop CS3
Ubuntu Default: GIMP
In doing research for this section of our look at Linux, one thing became abundantly clear: Image editors on Linux have the potential to be a holy war. For Windows and Mac OS X the gold standard for image editing programs is Adobe Photoshop, but as Adobe does not offer it for Linux it’s up to the Linux community to fend for itself. In doing so they came up with the GNU Image Manipulation Program (GIMP) which strives to match Photoshop’s abilities on *nix for free. What results is an interesting situation.
In spite of the fact that I can barely make sense of Photoshop, it’s clear that GIMP is not just a Photoshop clone, for better or for worse. For someone looking for what would amount to a Linux version of Photoshop, they’re clearly going to be disappointed, as GIMP is not Photoshop or even Photoshop-lite. It’s an advanced image editor that is in a class of its own.
As far as default programs are concerned, GIMP is clearly miles and miles ahead of Windows’ Paint, and Mac OS X’s complete lack of a freestyle image editor (iPhoto being the next-closest thing). Even if it’s not up to Photoshop’s level of abilities, it’s a very capable image editor that comes with Ubuntu, rather than needing to be a separate program download on Windows or Mac OS X. This leaves me in a somewhat odd position.
Paint is anything but powerful, but it's also simple. GIMP and Photoshop have at least one thing in common: they're both capable of being complex beasts. As such I'm not convinced that it's a good thing that GIMP is the default image editor on Ubuntu. For a beginner, it may be too powerful for its own good. For those reasons, while it's the most powerful default image editor compared to anything on Windows or Mac OS X, I'm not sure it's the "best" if we consider what fits user needs.
At any rate, when it comes to my own uses, I've previously mentioned that I'm not particularly competent with Photoshop. For preparing images for our articles GIMP does the job nicely, even if it's clearly overkill for the task. For what little I do in Photoshop, GIMP works quite well, giving it the distinction of being the only default image editor that does what I need. So while it's not a perfect replacement for Photoshop overall, it more than meets my needs for day-to-day use.
Final Verdict: Meets My Needs
Applications: Office Suite
Windows Default: None
What I use: Office 2007
Ubuntu Default: OpenOffice
Another thing that sets Ubuntu apart from Windows and Mac OS X is that the default install (and again, it fits on a CD) includes an office suite in the form of OpenOffice 2.4. Unfortunately OpenOffice 2.x is rather awful, which makes for a bad first impression. Admittedly this is the age of Ubuntu 8.04 showing, since OpenOffice 3 wasn't ready for nearly another year, but nevertheless I really, really wish that Ubuntu would inform new users about OpenOffice 3, stable application platform policies be damned.
This is going to be one of the few cases where I’m going to skip right past the Ubuntu default and move to something else. If you install Hardy, don’t waste your time on OpenOffice 2.4, go straight for OpenOffice 3.1.
With that out of the way, OpenOffice 3.1 is the latest version in the long line of the OpenOffice series. As has been the case for a number of years now, OpenOffice is the largest competitor to Microsoft Office, with Apple's iWork and, as of late, Google Docs as the other mainstream competitors in the office suite business. Like Firefox it's an important cross-platform open source application, and is available on just about everything that has an operating system. Furthermore, since Microsoft Office is not available for Linux like it is for Windows or Mac OS X, OpenOffice is the de-facto office suite for Linux.
In spite of its de-facto status, OpenOffice doesn't have a particularly glorious history. Prior versions have a reputation for being slow, and development has been equally glacial at times (e.g. it wasn't until 3.0 that there was a native Mac OS X version). Furthermore, as Microsoft Office's loyal opposition, the OpenOffice developers have had to play catch-up whenever Microsoft does something new – such as introducing the Office Open XML format – which has limited the compatibility of OpenOffice and hence its suitability as a replacement.
With OpenOffice 3.x a lot of that has changed. Right off the bat one of the biggest changes has been much better support for Microsoft’s formats, with better reading and writing ability of the “old” 2003 binary formats, and the ability to read (but not write) the new OOXML format. In our informal testing we had no problem opening up a number of our old DOC/XLS and newer DOCX/XLSX files, with all of them presenting themselves correctly. We do have some 3rd party Excel files though (Intel’s Monte Carlo simulation) that would not open correctly under OpenOffice.
Like Microsoft Office, the core applications of OpenOffice include a word processor (Writer), a spreadsheet (Calc), and a presentation program (Impress). Backing those up are a database program (Base), an equation editor (Math), and, unique to OpenOffice, a vector graphics editor (Draw). OpenOffice does not include an email client; in the case of Ubuntu that task is covered by Evolution.
From a features standpoint OpenOffice fits somewhere between Microsoft Office 2003 and 2007; this is a testament to the developers of OpenOffice, given that it's free and Microsoft Office is bloated. As for what OpenOffice can't do, I suspect you would need to be a hardcore Microsoft Office user to truly appreciate the difference. At this point OpenOffice is well beyond the feature set most home users would need, or even many corporate users.
From a visual standpoint OpenOffice isn't quite as advanced, however. Visually it's still largely a clone of Microsoft Office 2000 or so. By no means do visuals make an office suite when it comes to word processing or spreadsheets, but it means certain conventions that have gone out of style for Windows programs are still in use in OpenOffice. Users of Microsoft Office 2007's Ribbon UI will be particularly hard-pressed to move back down.
The lack of visual splendor does put OpenOffice at a notable disadvantage when it comes to Impress, though. Presentations often place a great deal of focus on such matters, and OpenOffice doesn't have the library of art and templates to match PowerPoint. It's by no means bad, but if I had a Pointy-Haired Boss that loved eye candy, Impress would probably not impress them.
Otherwise Writer and Calc are competent versions of their Microsoft Office counterparts. There are no surprises here, as both do what they're supposed to, but nothing more. This article was written almost entirely using Writer, with no outstanding issues to report. That may not sound impressive, but Microsoft Office is a hard act to follow, and doing so for free when Microsoft Office is $150 or more is even more impressive. You'll never forget that you're using a clone of Microsoft Office, but for the price tag you can excuse the lack of flair.
On a note about flair, like Firefox the experience is improved if some of Microsoft’s font sets are installed, particularly if you have documents written using them or are accustomed to writing in them. These font sets do not include Cambria, so Word 2007 documents are still going to look off.
Overall, I must admit that I generally did not use OpenOffice for my day-to-day work – the bulk of my use of it was writing this very article. Outside of the inability to write OOXML files I didn't run into any specific problems, but I am accustomed to Microsoft Office's Ribbon UI. Since I already have a copy of Microsoft Office, there was nothing stopping me from using it beyond what Wine could manage; and as Wine is able to run Microsoft Office 2007 well enough that it met my needs, I didn't have any strong reason to stick with OpenOffice besides experimentation and research for this article. If I didn't have a copy of Microsoft Office 2007 (only having 2003, for example) I would have stuck with OpenOffice, but as it was I was not prepared to take the efficiency hit in moving away from the Ribbon UI. This says more about the user than the program, but it's also a subtle hint that OpenOffice could benefit from moving in the same direction.
Final Verdict: Meets My Needs, But I Didn’t Use It
Applications: Everything Else
Sometimes there is an advantage to not being a large, profit-generating target. If you are Microsoft or Apple, there are some things you just can't take a risk on doing; the consequences of it backfiring are too great. In this case, you would never see either of those operating systems include a BitTorrent client. While BitTorrent is legal, it can be used for many illegal things, making it an enemy of groups like the RIAA and MPAA – both of which Apple and Microsoft need to get along with for business reasons (imagine Windows without a DVD player), and because both have deep pockets should a fight erupt.
Canonical (the company backing Ubuntu) is not a large, profit-generating target and as a result they can get away with more here. We’ve already talked about legality issues encompassing codecs, but BitTorrent is another area where their size lets them get away with more. Ubuntu includes Transmission, a full-featured BitTorrent client, making it wholly unique (at least when compared to Windows/Mac OS X) for doing so. As a regular BitTorrent user, this is a most welcome type of application to include.
With there being so many BitTorrent clients I’m not going to get in-depth with features here other than to say that Transmission is a full-featured BitTorrent client. Instead the fact that it’s included at all is a big deal. Although it’s going to be a slight exaggeration, I would put the inclusion of a BitTorrent client up there with a web browser, an email client, or a media player. I consider BitTorrent an essential function, so a proper client is something that ideally would be included with every operating system. It’s that important.
There is one thing I'd like to add about Transmission in particular, though. In my time using it, I'm not convinced that other BitTorrent clients are treating it properly. I've noticed that some other clients appear to be ignoring or blocking it, and while it doesn't appear to be shunned by a large number of clients, it's enough that in my completely unscientific testing Transmission looks slightly slower compared to something like Azureus. Wikipedia notes that a version released over 2 years ago was commonly blocked for not being completely compliant with the BitTorrent specification, but I don't know if this is related.
Moving on, there's one other thing in Ubuntu that caught my eye, and that's the inclusion of a Remote Desktop Protocol (RDP) client, going under the name of Terminal Services Client. Not to be confused with VNC, the open source remote desktop system commonly used on *nix systems, RDP is Microsoft's proprietary remote desktop protocol. While I had expected Ubuntu to include a VNC client, I had not been expecting an RDP client.
As I have a Windows Home Server for file storage and backing up my Windows machines, I need an RDP client to administer it and the rest of my Windows machines. By "playing nice" and including an RDP client, in spite of the fact that the protocol is proprietary and Ubuntu does not use RDP itself, Ubuntu made itself much more useful for me straight out of the box. Among other things, with it I was able to immediately connect to my server and diagnose why I was having so much trouble connecting to my SMB shares, something I'll explain in greater detail in a moment.
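As an aside, Terminal Services Client is a graphical front-end to the rdesktop command-line tool, which can also be invoked directly. A minimal example, with a made-up hostname:

    # Open an RDP session to a Windows machine at 1280x800
    rdesktop -u Administrator -g 1280x800 homeserver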
Really the only downside is that it's not as well built a client as Microsoft's own Windows client, which is to be expected. Even on a gigabit LAN, Terminal Services Client lags a bit compared to the real thing, but then again so does Microsoft's official RDP client for the Mac; Windows machines seem to have an inherent advantage as RDP clients. Nevertheless it's fully usable, just a bit slower.
Things That Went Terribly, Terribly Wrong
One concern I’ve had for some time when writing this article is that it runs the risk of coming off as too negative. I don’t want to knock Ubuntu just for being different, but at the same time I’m not going to temper my expectations much as far as usability, stability, and security are concerned. If something went wrong, then I intend to mention it, as these are things that can hopefully be resolved in a future version of Ubuntu.
This section is reserved for those things that went terribly, terribly wrong. Things so wrong that it made me give up on using Ubuntu for the rest of the day and go back to Windows. This isn’t intended to be a list of all the problems (or even just the big problems) I encountered using Ubuntu, but rather the most severe.
We’ll start with mounting file servers. I have a Windows Home Server box that I use to store my common files, along with hosting backups of my Macs and PCs. I needed to be able to access the SMB shares on that server, which immediately puts Linux at a bit of a disadvantage since it’s yet another non-native Microsoft protocol that Linux has to deal with, with protocol details that were largely reverse engineered. My Macs have no issue with this, so I was not expecting any real problems here, other than that the network throughput would likely be lower than from Windows.
For whatever reason, Ubuntu cannot see the shares on my WHS box, which is not a big deal since neither can my Macs. What went wrong, however, is that manually mounting these shares is far harder than it needs to be. Again using the Mac as a comparison, mounting shares there is as easy as telling Finder to connect to an SMB server and supplying credentials, at which point it gives you a list of shares to mount.
Ubuntu, as it turns out, is not capable of mounting a share based on just the server name and credentials. It requires the share name along with the above information, at which point it will mount that share. Browsing shares based on just a user name and password is right out. Worse yet, if you don’t know this and attempt to do it Mac-style, you’ll get one of the most cryptic error messages I have ever seen: “Can't display location "smb://<removed>/", No application is registered as handling this file.” This tells you nothing about what the problem actually is. It’s poor design from a usability standpoint, and even worse error handling.
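To illustrate the difference (the server and share names here are placeholders, not my actual setup): pointing Nautilus’ Connect to Server dialog at a bare server fails with the error above, while spelling out the full share path works.

smb://homeserver/         (fails with the cryptic error)
smb://homeserver/music    (mounts the share)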
Unfortunately the story doesn’t end here. Ideally all applications would work as well with files on a network share as they do with files on a local drive, but that’s not always the case – often it’s harder to browse for a file on a network share than a local file from inside an application. For this reason I have all of my common shares mapped as drives on Windows (this also saves effort on logging in), and Mac OS X takes this even further by immediately mapping all mounted shares as drives. So I wanted to do the same for Ubuntu, and have my common shares automount as drives.
Nautilus, which transparently accesses SMB shares, is of no help here, because by transparently accessing SMB shares it doesn’t mount them in a standard way. The mount point it uses is inside of a hidden directory (.gvfs) that some applications will ignore. The ramification is that most non-GTK applications cannot see shares mounted by Nautilus: they don’t receive the mount information GTK passes to its own applications, nor can they see the hidden mount point. The chief concern in my case was anything running under Wine, along with VLC.
The solution is not for the faint of heart. Highlights include additional software installations, manually backing up files, and a boatload of archaic terminal commands – and that’s just if everything goes right the first time. I love the terminal, but this is ridiculous. Once it’s finished and set up correctly it gets the job done, but it’s an unreasonable amount of effort for something that can be accomplished in a matter of seconds on Windows or Mac OS X. This was easily the lowest point I reached while using Ubuntu.
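For reference, here is a rough sketch of the core of that workaround rather than the full procedure: the share gets mounted the traditional way so that it has a standard mount point every application can see. This assumes the smbfs package is installed, and the server, share, and user names are placeholders:

sudo mkdir -p /media/music
sudo mount -t cifs //homeserver/music /media/music -o username=ryan,uid=1000

To have the share automount at boot, a matching line can be added to /etc/fstab, something along these lines:

//homeserver/music /media/music cifs credentials=/home/ryan/.smbcredentials,uid=1000 0 0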
The other thing I am going to throw in this category is mounting ISO images. I keep ISOs of all of my software for easy access. Interestingly enough, Ubuntu has the file system driver necessary to mount ISOs, but no GUI application to do it. While it would be nice to have all of that built in (à la Mac OS X), that’s not the flaw here – I’m perfectly content downloading a utility like I do for Windows (Daemon Tools). The flaw is that the Ubuntu GUI application for this, Gmount-ISO, can’t mount ISOs off of an SMB share. Worse yet, it doesn’t tell you this either.
The first time around, the only solution I was able to find was another archaic CLI command that involved running the mount command by hand, in the style of “mount file.iso /cdrom -t iso9660 -o loop”. This was a terrible solution.
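For anyone who wants the full incantation anyhow, it amounts to creating a mount point and then loop-mounting the image (the paths here are examples):

sudo mkdir -p /media/iso
sudo mount -o loop -t iso9660 /path/to/file.iso /media/iso

And of course the image has to be unmounted by hand afterwards with “sudo umount /media/iso”.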
It wasn’t until some time later that I finally found a better solution. An application that wasn’t in the Ubuntu repository, AcetoneISO, can properly mount files off of SMB shares. Better yet it’s a bit closer to Daemon Tools functionality, since it can mount BIN/CUE, NRG (Nero Image), and MDF images.
I throw this in the “terribly, terribly wrong” column because the solution was completely non-obvious. If you search for “Ubuntu Hardy mount iso” or something similar, AcetoneISO is nowhere near the top of the results, and the Ubuntu package repository is of no help. What’s in the repository is the aforementioned useless Gmount-ISO, and what’s at the top of Google’s results are Gmount-ISO and instructions for mounting the image via the CLI. It’s a success story in the end, but it was uncomfortably painful getting there.
If there’s any consolation in these matters, it’s that these were the only two issues that made me outright stop using Ubuntu, and go back to Windows for the day. Any other problems I had were significantly less severe than this.
Things That Went Right
On the flip side of the things that went wrong, we have the things that went right. Most of the Ubuntu experience went right and has been covered previously, so this is going to be a catch-all for other things about Ubuntu that impressed me, but don’t necessarily fit anywhere else.
One of the nicer features of Mac OS X that you don’t see mentioned very much is the Keychain, a credential management framework that applications can use to securely store passwords and the like. Such systems aren’t rare – even Windows has something similar in its Credentials Manager – but Mac OS X is unusual in that its implementation actually gets used, at least some of the time.
I had not been expecting something similar in Ubuntu, so it caught my eye when a Mac OS-like password box came up when I was logging in to my file server. As it turns out, Ubuntu has similar functionality through the Passwords and Encryption Keys application. And since this application is part of GNOME, the desktop environment Ubuntu is built around, a number of GNOME applications are built against the keyring and use it.
It’s not quite as tightly woven in as Keychain is under Mac OS X, but it’s better utilized than its Windows counterpart, and used enough that it makes sense to visit the keyring application. The biggest holdout with a stock install is Firefox, which uses its own password manager regardless of what platform it’s on.
Another thing that caught my eye was Ubuntu’s archive manager, called File Roller here. As we’ve lamented many, many times before, Windows’ archive management abilities are terrible. Files are slow to compress, files are slow to decompress, and supporting just Zip files isn’t quite enough. Mac OS X does a bit better by being faster, but it has absolutely no support for browsing Zip archives; it just packs and unpacks them. Most power users I know will have something like WinRAR or BetterZip installed to get a proper archive browser and wider format support.
File Roller is a complete archive manager, and it supports slightly more exotic archive formats like RAR along with the customary Zip and the *nix standard, GZip. The biggest knock against it is that it can read more formats than it can write, RAR again being the example here.
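A side note on that RAR support: as best I can tell, File Roller relies on external helper programs for RAR files, so if they won’t open out of the box, installing the unrar package from the multiverse repository is the usual fix:

sudo apt-get install unrar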
This also brings up an interesting quirk with archives under *nix that you don’t see under Windows. The Zip format is both a container for multiple files and a compressor for those files. GZip, on the other hand, can only compress a single file – so when it comes time to compress multiple files, they must first be packed into an uncompressed tarball (TAR), and then the tarball is compressed, resulting in .tar.gz. The quirk is that the Zip format compresses each file separately, while .tar.gz by its very nature compresses all the files together at once; the latter is commonly known as solid archiving.
Depending on the files being compressed, solid archives can have significant space advantages over individually compressed files by taking advantage of redundancy between the files themselves, and not just the redundancy in individual files. This is also why WinRAR is so common on Windows machines, since the RAR format supports solid and individual archiving.
Now the downside to solid archiving is that it takes longer to pull a file out of a solid archive than an individually compressed archive, since everything ahead of the file must be decompressed first in order to retrieve the data needed to recreate the desired file. So solid archiving isn’t necessarily the best way to go.
Ultimately, with the wider support for archive formats under Ubuntu, in some situations it can achieve much better compression ratios than what can be done under Windows. Windows isn’t entirely helpless, since installers can use the MSI format (which uses solid compression), but as far as plain archives are concerned the only built-in option is individual compression. It’s a small benefit that can pay off nicely from time to time for Ubuntu.
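To make the distinction concrete, here’s how the two approaches look from the command line; with many similar files, the solid .tar.gz will usually come out smaller than the individually compressed .zip (the file and directory names are examples):

zip -r logs.zip logs/        # each file compressed separately
tar -czf logs.tar.gz logs/   # packed into one tarball, then compressed as a whole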
Wine
As I mentioned previously, Ubuntu doesn’t always have an application that fits my needs. Sometimes what I really need is a Windows application, but I don’t want to have to boot back into Windows to get it. The surefire solution to this kind of dilemma is to set up a Windows installation in a virtual machine (Parallels, VMware, or VirtualBox), but virtual machines are slow to boot and consume fair amounts of both disk space and RAM. As it turns out, there’s a better solution: Wine.
“Wine is a translation layer (a program loader) capable of running Windows applications on Linux and other POSIX compatible operating systems.” Unlike a virtual machine, Wine doesn’t install or run a complete copy of Windows in the background; rather, it’s an implementation of the Win32 API designed to sit on top of *nix operating systems. The compatibility isn’t nearly as good as a virtual machine’s, but the trade-offs of lower resource usage and faster loading times are worth it. If I can use Wine rather than a virtual machine, then that’s the way I want to go.
I should note that Wine is anything but new (it’s some 16 years old now) and it’s pretty common too. Fully supported versions of it are sold as a product focusing on business applications (CrossOver), and quite a number of not-quite-native Mac games are really Windows games with a Wine-based wrapper (Cider). But it’s definitely new to me. And I should note (having learned this the hard way) that Wine is not an emulator – the Ubuntu community really hates having it called that.
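Getting started with Wine on Hardy is simple enough; it’s in the repositories, and running a Windows program is just a matter of pointing Wine at the executable (the installer name here is a placeholder):

sudo apt-get install wine
wine setup.exe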
I originally intended to use Wine for 3 things:
- iTunes, so that I could sync my iPhone
- Games, in order to avoid the primary reason I dual boot
- Microsoft Office
iTunes was a long shot in the first place, and it should not come as a surprise that it didn’t work. I had to settle for dual booting whenever I needed to sync (a virtual machine would have also worked, but I didn’t want to have to deal with two sync databases).
Gaming was a crap-shoot. I’m actually not going to spend too much time talking about this because we’re going to go much more in depth on this in our next piece, but I’ll mention it quickly to discuss usability. The two games I had a particular interest in were Supreme Commander: Forged Alliance, and Team Fortress 2.
The performance of both games was below what I saw on Windows. In the case of Team Fortress 2, running it with DirectX 9 graphics (Shader Model 2/3) was unbearably slow, and with DirectX 8.1 graphics it was unbearably ugly (a product of TF2 simply looking a great deal worse without DX9 functionality). Technically I could play TF2, but it was going to be rougher than I could settle for.
As for Supreme Commander, the speed issue was particularly problematic. The game is a CPU-eating monster, and it takes nearly everything it can get for its intricate simulations and AI routines. For whatever reason, when bogged down, the simulation speed was noticeably slower than under Windows; while not technically unplayable, this can make a game so slow that it’s not practical to finish it. The other issue was minor graphical corruption with the icons; this was not a game breaker, but it was another sign to go back to Windows.
Now, to the credit of the Wine development community, there are a number of games that apparently work well under Wine according to its application database; however, the games I wanted, on the hardware I had, were not functioning as well as I’d like. Wine wasn’t going to meet my gaming needs. When we do Part 2 of our Ubuntu series and take a look at 9.04 Jaunty Jackalope, we’ll take a more concentrated look at gaming inside and outside of Wine.
Finally, we have the success story in my use of Wine: Microsoft Office. As I stated previously when discussing OpenOffice, in spite of its abilities I missed Microsoft Office’s Ribbon UI. As Wine supports Word and Excel well enough to meet my needs, I was able to install those applications and use them as I would regularly use them under Windows. Their behavior under Wine isn’t perfect, as Wine’s application database will attest, but the problems are not something I encountered on a day-to-day basis. The biggest difference is that Wine + Ubuntu doesn’t have the same fine level of font anti-aliasing that Vista does, which makes things look slightly different. Outlook, meanwhile, wasn’t as well behaved, but I already had Evolution, which covered my needs in the first place.
Along with Microsoft Office, I also threw a few other assorted applications at Wine, which it handled without an issue. This includes some .NET 2.0 applications, which worked after installing the .NET Framework into Wine, something I was not seriously expecting. Although I wasn’t able to use Wine for everything I needed, it had a lot to do with keeping me in Ubuntu more often.
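For anyone attempting the same, one approach worth mentioning is the winetricks helper script, which can automate installing the .NET Framework (and other common redistributables) into Wine. A sketch of how that looks, assuming the script is fetched from its home at kegel.com:

wget http://www.kegel.com/wine/winetricks
sh winetricks dotnet20

I won’t vouch for every component it can install, but it takes a lot of the guesswork out of the process.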
The User Experience
Now that we’ve had a chance to go over the various features of Ubuntu and its included applications, we can get to the burning question: how is it?
In a nutshell, my own experience with Ubuntu has been that it’s capable of meeting 95% of my daily needs, and 75% of my weekly needs. Outside of being unable to sync my iPhone (which again is Apple’s fault), on any given day I did not need to boot up Windows. However, in any given week I would need to boot into Windows several times to run various Windows programs that neither work under Wine nor have an Ubuntu counterpart, not counting Windows games, which also required booting back into Windows. The result was more dual booting than I would have liked, but it was acceptable.
What worked best for me under Ubuntu were the most common tasks, which makes sense given Ubuntu’s focus. We’ve already hit on how great Firefox is under Ubuntu, but music playback, email, and word processing also worked well. There was never any point where I felt like I absolutely could not accomplish something related to these tasks when using Ubuntu. With that said, I’ll add the disclaimer that I didn’t find Ubuntu to be significantly better at any of these tasks either – it was merely good enough.
If this sounds boring, it is. There’s not a lot to be said about otherwise mundane things that work well. Windows and Mac OS X could do these things, and so could Ubuntu. The distinguishing factor here really isn’t functionality; it’s that all of this was free.
In many situations Windows still offers a better experience than Ubuntu. Sometimes this is a matter of a more polished GUI; as it stands, Ubuntu often looks like an orange version of Mac OS 9 (the bad Mac OS). Other times it comes down to professionally developed programs having an extra feature or two that, while not critical, are nice to have. There are numerous little things like these that still keep Ubuntu well separated from Windows and Mac OS X.
One item where I feel Ubuntu failed in particular is CLI use, which was a condition I outlined earlier. I wasn’t able to avoid using the CLI under Ubuntu; in fact, I didn’t even come close. Some of this comes down to the fact that user-generated support often gives CLI commands in lieu of instructions for dealing with the GUI, and in other cases, such as mounting ISOs and installing video card drivers, the CLI was completely unavoidable. These are correctable problems.
Along those lines, the default configuration of Ubuntu leaves me scratching my head. For example, Ubuntu has a file indexer and search system à la Windows Search and Spotlight. For whatever reason this indexer is not enabled by default, and as a result it’s quite easy to miss. By the same token, Compiz defaults to not using v-sync, which means windows will tear when moved – something hardware accelerated compositing specifically exists to solve. Fixing these items, along with finding a way (any way) to install the Microsoft Core Fonts by default so that the font disparity no longer exists, would make the initial experience a better one.
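The fonts, at least, are a one-line fix once you know it, with licensing being the reason Ubuntu can’t ship them by default. Installing the msttcorefonts package (from the multiverse repository) pulls down the Microsoft Core Fonts:

sudo apt-get install msttcorefonts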
The biggest negative influences on the Ubuntu experience were the items we listed under Things That Went Terribly, Terribly Wrong. It’s easy to pick at things that don’t work, but these also happen to be the things that drove me out of Ubuntu for the moment. Meanwhile, the biggest positive influences come down to Firefox and Totem. Neither is perfect, but as I discussed in their respective sections, they’re great programs that are much better than the default programs found on Windows and Mac OS X.
Overall I found the Ubuntu experience to be decent, but not spectacular. Beyond the issues listed above, there’s a general lack of killer applications. As a result, unless you specifically value the fact that it’s free (in either sense of the word) or the security benefits of it not being Windows, there’s really nothing that makes Ubuntu compelling compared to Windows or Mac OS X.
Test Setup
Since this first part is more about the applications and the user experience than the performance, we’re going to keep the benchmarks short. Along with looking at 9.04, part 2 will focus on a greater level of benchmarking, particularly graphics benchmarking.
For now we’re going to be taking a look at general situations – encoding, compression, file operations, etc. Because there’s no solid benchmarking suite compatible with both Windows and Linux (i.e. a cross-platform equivalent of PCMark), we don’t currently have a way to benchmark multitasking scenarios. So this is largely a look at single application performance.
Due to some initial issues with the 64-bit version of Hardy, all of the following testing was done on the 32-bit versions of Hardy and Windows Vista respectively. Future articles will be done with 64-bit operating systems.
Our test setup was as follows:
Software Test Bed
Processor: Intel Core 2 Quad QX6850 (3.00GHz/1333MHz)
RAM: G.Skill DDR2-800 (2x2GB)
Motherboard: Gigabyte GA-P35-DR3R (Intel P35)
System Platform Drivers: Intel 8.1.1.1012
Hard Drive: Seagate 7200.11 500GB SATA
Video Cards: 1 x GeForce 8800GTX
Video Drivers: NV ForceWare 186.18
Power Supply: OCZ GameXStream 700W
Desktop Resolution: 1600x1200
Operating Systems: Windows Vista Ultimate SP2 32-Bit / Ubuntu 8.04 "Hardy Heron"
CPU Benchmarks
We’ll start our short look at Ubuntu’s performance with our CPU intensive benchmarks. Up first is SuperPi, a single-threaded pi-calculating benchmark. Here we time how long it takes to calculate Pi to 1 million digits.
We ran this test several times more than usual just to make sure we weren’t seeing some kind of error. The Linux version of SuperPi really is about 30% faster than the Vista version. Keep this in mind; it will be an important point later.
Meanwhile, the situation with LAME is inverted: the Vista version outscores the Linux version by nearly 20%.
Using the cross-platform, x264-based HandBrake for our video encoding test, Vista once again pulls ahead of Linux.
Once more Vista is ahead by a large margin.
From what we can tell, there’s little-if-any innate performance advantage to Vista or Linux in these benchmarks. Our working theory is that the performance difference comes down to the compiler used. Many Linux applications are compiled with GCC, while for Windows it’s either the Visual Studio compiler or Intel’s own compiler (which is also available for Linux). There’s also the matter of compiler settings, as we saw in our quick breakout of Firefox benchmarks.
Meanwhile, SuperPi uses a lot of hand-rolled code, although we’re still not sure why the Linux version outperforms the Vista version by as much as it does.
To shed a little more light on this idea of compiler performance, we have a few benchmarks of Windows application performance under Ubuntu through Wine.
Here we see a most amazing thing: Ubuntu outperforming Windows at running Windows applications! As we’ve removed the influence of compilers, the Photoshop results are particularly interesting. From what we can tell, it’s normally as fast under Linux as it is under Vista; however, there seems to be a short period of low CPU usage when running it under Vista that doesn’t occur when running it under Ubuntu. As a result, Ubuntu finishes a few seconds earlier.
There are a number of caveats which mean that applications running under Wine don’t always match or beat Windows performance, but in our tests there was no performance hit to using Wine to run Windows applications.
These results also lend a great deal of support to the idea that there’s a significant difference in performance between the two operating systems due to their compilers. This goes particularly for the LAME benchmark, where the performance gap melts away under Wine. This is something we’re going to have to look in to in the future.
Browser & Video Benchmarks
Next up we have our full suite of benchmarks for Firefox, along with a look at video playback performance.
As we discussed in our look at Firefox, the Linux version of Firefox is not compiled with profile guided optimizations, and as a result it underperforms the Windows version in CPU-heavy tasks such as Google’s V8 Javascript benchmark. Running the Windows version under Wine closes this gap, however it’s a limited solution since there are other performance problems (mainly with Flash) in that configuration.
Speaking of Flash, we mentioned previously that it’s slower under Ubuntu (and virtually every other OS) than it is under Windows. This is one of the worst-case scenarios, and as GUIMark is capped at 60fps, the gap might actually be larger if we could go higher.
For another look, here’s the CPU usage of Firefox while watching an HD YouTube video. Ubuntu once again underperforms compared to Windows, but not by nearly as much as the worst-case scenario.
In our page loading tests however, this difference melts away. The total loading time of our 4 pages is 12 seconds under both Ubuntu and Vista.
Finally we have VLC as our Linux video playback benchmark. While VLC is not the default media player for Ubuntu, we’re using it instead of Totem due to the fact that it’s cross-platform. Here we’re taking a look at a 30 second section of a 720p H.264 encoded movie.
There’s an interesting phenomenon here with respect to CPU usage. VLC uses roughly the same amount of CPU time on both operating systems; however, we caught Ubuntu’s X server eating up additional CPU time, while Windows’ Desktop Window Manager did not budge. We’re not entirely sure what’s going on, but it looks like X needs to burn extra CPU time on video playback.
File/Networking Performance
Finally we have file and networking performance. As Ubuntu uses a different file system (ext3 versus Windows’ NTFS) there’s the potential for some significant differences here.
Starting with file performance, we will be using a collection of roughly 1500 files, totaling 380MB.
Even after SP1, file performance has long been a thorn in the side of Vista. Here it manages an embarrassing loss to Ubuntu, taking over 50% longer to make a copy of the same folder.
Looking at ZIP compression times, there’s an even larger gap: Vista needs 75% longer to compress the same folder. While compression can be CPU bound, looking at our data this specific test is largely I/O bound. We’ve already established that Windows’ built-in ZIP abilities are pretty bad, but we’ve never figured out why this is.
Decompression is even worse for Windows. It takes nearly 4 times as long to decompress the same archive. It’s not even a contest – Ubuntu wins, if only because it’s the only competent operating system out of the two.
Meanwhile, in our network copy tests we are copying that folder to a server running Windows Server 2003. This gives Windows an advantage since we’re using SMB, but as SMB is the predominant protocol for consumer file server gear, it’s a fair test of such use.
Here it’s nearly a dead heat. Ubuntu and Vista need about the same amount of time to copy our file collection to the server, while Windows needs a bit less time to copy that same folder off of the server.
Switching gears, using a 2.6GB ISO we see a clear performance difference. In both copying to and from the server, Ubuntu needs at least 50% longer. Since this test isn’t using much CPU time, our best guess is that Windows is doing some buffering that Ubuntu isn’t; the transfer rates for Linux are below what the hard drives on either end can manage.
Ultimately for users with lots of local storage, Ubuntu appears to outshine Vista. But for users with lots of remote storage (e.g. a NAS), Vista outshines Ubuntu.
Finally we have the amount of time it takes to start up each operating system, another disk-bound test. Vista is not something I would consider particularly speedy, so I’m a bit surprised that Ubuntu did not manage to outperform it here. The 2 second difference is measurable, but small enough that it won’t make any real impact.
First Thoughts
Since this is Part 1 of a 2 part series, rather than ending on a conclusion, we’ll end on some first thoughts.
In searching for an answer to our question of whether Ubuntu is good enough to convince me to switch, I ultimately failed to find enough compelling reasons to entice me, as a user, to switch to Ubuntu for my day-to-day operations. I should make it clear that this is not taking price into consideration – this is only taking into account my current situation as a Windows Vista user. Ubuntu does plenty of things well, and I could certainly use it for my day-to-day operations, but there are few things it does better and more things it does worse compared to Vista, such that using Ubuntu would likely hurt my productivity even after I adapted to the differences. It’s hard to fully compete with commercially developed software when you’re giving yours away for free, so I don’t consider this a surprise.
From a performance standpoint, there’s little reason to switch in either direction. As I stated early in this article performance was never a serious condition for evaluation anyhow, and the results don’t change that. Ubuntu outperforms Vista at times, but at other times it looks to be held back by compiler differences and the disadvantage of needing to play nicely with proprietary products that don’t return the favor (e.g. SMB performance). As far as I am concerned, Ubuntu performed no worse than Windows for my day-to-day needs.
Now there are some situations where performance is important enough that it can’t be ignored, and the gap wide enough to make a significant difference. In Part 2 we will be looking for these situations.
I do think there are some niches in which Ubuntu works well, where the operating system itself is the killer app. One such situation is (or rather was) the Netbook market. It’s a market that used to be dominated by Linux operating systems, including Ubuntu’s Netbook Remix. On such devices where you don’t have the resources to do anything fancy, Ubuntu’s weaknesses become less important. Meanwhile price becomes more important. However cheap copies of Windows XP specifically for the Netbook market appear to have killed this idea for now.
For what it’s worth I do have an older laptop (for guest use) that currently runs XP. For the same reason as the Netbooks, I’m considering replacing XP with Ubuntu 9.04 for the security benefits of it not being Windows. I’ve already had to wipe the machine once due to a guest getting it infected with malware.
As I haven’t gone too much in depth on it yet, let’s talk about user-to-user support. In spite of its user-friendly label, I have not been particularly impressed with the Ubuntu support structure. A lot of this comes down to the difficulty of finding help for existing issues, in spite of colorful release names like Hardy Heron that should help weed out irrelevant results. Ubuntu’s wiki, package archives, and forums all contain a great deal of old information that turns up when searching. Results for 7.10 Gutsy Gibbon, for example, are now a historical curiosity – support for Gutsy ended back in April. Those pages and threads are largely unhelpful, and yet they clutter the search results of Google and the Ubuntu site’s search engine, pushing down more relevant information. Meanwhile the opposite is also true: results for versions newer than the one you’re running are just as unhelpful.
The source of the problem comes down to 3 things:
1) Old information still exists and apparently doesn’t go away very easily.
2) Ubuntu’s forums in particular are divided up by topic, but not by version.
3) New versions of Ubuntu are published too often.
Now #3 is probably going to be a bit of a touchy subject, but it goes back to why we started with 8.04 in the first place. Either you’re on the upgrade treadmill or you’re not. Ubuntu moves so fast that it’s hard to jump on board. This is good from a development perspective since it allows Ubuntu to improve itself and get feedback sooner, but I don’t believe it’s good for users. A working user-to-user support system needs a lot of knowledgeable users, and the Ubuntu community is clearly full of them, but they seem to be spread out all over the place with respect to what versions they have experience with.
It’s to the advantage of less-knowledgeable users that they stick with a well-tested LTS release rather than be on the bleeding edge, but that’s not where the most knowledgeable users are. Compared to the Mac community where everyone is in sync on Leopard, or the Windows community where everyone is hating Vista and lusting over Windows 7, there’s a lack of cohesion. User-to-user support would be better served by having the community less spread out.
I have mentioned this previously, but the driver and packaging situation needs to be reiterated. While I don’t think the Linux kernel developers’ positions are unreasonable, I do think they’re hurting Ubuntu as a user-friendly operating system. The driver hell I had to go through shouldn’t have occurred, and if there were a stable API for “binary blob” drivers, perhaps it wouldn’t have. The pragmatic position is that users don’t care if their drivers are open source or not; they would rather things just work. Ideals can only take you so far.
Along these lines, the packaging/repository system and the focus on it needs some kind of similar overhaul. I like how it allows updating software so easily and how easy it is to install software that is in Ubuntu’s repositories. But software that is not in a repository suffers for it. Installing software shouldn’t be so hard.
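The gap is easy to see from the command line. Anything in the repositories is one command away, while anything else means hunting down a .deb and sorting out its dependencies by hand (the package names here are examples):

sudo apt-get install vlc            # in the repository: dependencies handled automatically
sudo dpkg -i someapp_1.0_i386.deb   # not in the repository: may fail on missing dependencies
sudo apt-get -f install             # then pull in whatever dpkg complained about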
Finally, there’s the value of free as in gratis. Ubuntu may not be perfect, but I am still amazed by what it does for the price of $0.00. It’s a complete operating system, entirely for free. This is something that needs to be recognized as a credit to the developers, even if it doesn’t encourage anyone to switch.
Looking forward, coming up in the next couple of months will be the launches of Windows 7, Mac OS X 10.6 Snow Leopard, and Ubuntu 9.10. Compared to where Ubuntu stands with 8.04, there’s a year and a half of time for improvements, along with another LTS release due inside of a year. I think the new releases of Windows and Mac OS X are going to tip the scales away from Ubuntu in the immediate future, but given the lifetimes of those operating systems it’s going to give Ubuntu plenty of time to improve. This is something we’ll take a look at first-hand with Part 2 of this series when we look at 9.04 and more.
As a parting thought, we’d like to hear back from you, our readers, on the subject of Ubuntu and Linux in general. We’d like to know what you would like to see in future articles, both on the hardware and software side. Including some form of Linux in some of our hardware tests is something we’re certainly looking at, but we would like specifics. Would you like Linux-focused hardware roundups? What benchmarks would you like to see in Part 2 of this series (and beyond)? We can’t make any promises, but good feedback from you is going to help us determine what is going to be worth the time to try.