I’d be very ungrateful if I started this article with anything but thanks to everyone who read or commented on my previous post. I also appreciate all the websites that embraced my effort. At first, I was completely overwhelmed by the huge number of visits and comments, whether here on this blog or on other websites, especially www.osnews.com. Later, when the ecstasy subsided, I started feeling a little ashamed of myself for not doing a better job. In fact, I didn’t expect my article to receive that much attention; it was just a one-man effort with very little knowledge behind it. I would also like to thank my friend Eng. Ahmad Bakdash for his generous support.
I thoroughly read all comments and suggestions, but I was a little undecided about repeating the test. I was afraid of being misjudged by readers who tend to take things very seriously. I’ve always been aware that my limited experience and resources will always keep me from doing a conclusive benchmark. I stated that clearly in my previous article, but many readers missed it. Eventually, I couldn’t resist doing another test with some mistakes corrected to see how it would go. But again, it is nearly impossible for me to do a perfect benchmark, and it is absolutely impossible to satisfy every reader. I can only do my best.
Before I start with the updated results, I have to clarify a few things that drew so many misplaced comments on my previous article.
The goal of this test is exclusively to compare Ubuntu Linux to Windows XP from a non-experienced user’s point of view. I only thought about writing this article because Desktop Linux has been growing with some acceleration over the last couple of years, and it looks as if most of this expansion has been on low-end PCs and sub-notebooks, mainly for home users. Businesses and professional individuals have also shown interest in Linux, but software availability limitations would impede any large-scale adoption.
Microsoft realised that low-end PCs, even with their slim margin of profit, make up a significant part of the market, and they also protect the profitability of higher-end PCs. Moreover, allowing new players to compete for market share is always dangerous. Of course, it is good for the consumer, because competition always produces better and cheaper products, but in Microsoft’s eyes, it is very bad. With Windows Vista automatically disqualified (due to high system requirements), Microsoft’s only choice was to keep Windows XP alive and to promote it as the best OS for low-cost computers. Windows XP has a very good reputation and 7 years of bug-fixing behind it, not to mention its unparalleled compatibility.
On the Linux side, desktop-oriented distributions are growing by the day. Ubuntu Linux is the best example of a user-oriented OS that has gained huge popularity even though it is only 4 years old.
Therefore, I think it is important to compare the two operating systems from a normal user’s perspective. In most cases, what matters for a home computer user is mainly the out-of-the-box experience, which comes down to application performance and OS responsiveness while doing the most common things.
Here is a sum-up of this article’s scope:
- This test is NOT to compare Linux to Windows in general. It is to compare the most popular Desktop Linux to the most popular Windows edition.
- This test is NOT to see how fast Linux (or Ubuntu) or Windows XP CAN BE. It is to see how Ubuntu 8.04 and Windows XP SP3 handle some of the most common tasks at their near-default setups.
- This test is NOT a pure platform test. It simply compares some specific applications on specific editions of both Linux and Windows.
- This test is NOT for experienced users who can tweak, customise or even build their own systems. Such users can do their own tests and probably don’t need any benchmarks.
- This test is NOT conclusive. The results are certainly reproducible on the test machine and probably indicative, but it is only one machine, and there are plenty of variables that can alter the results. I personally think the results may vary on other machines but will point in the same direction.
All applications in this test (except RAR) are free, and they represent some of the best free software available today; so even though most of them seem Linux-related, they actually offer great alternatives to paid proprietary software on Windows, and a legal refuge from software piracy.
After I read so many comments on my previous post, I realised that it was a big mistake to guess the reasons behind the results. Many people with much greater knowledge contributed there, and I really hope that they will do the same once again. I’ll simply present the results and let the readers add their own insights. One point, though: for a normal user, the reasons don’t mean that much. It doesn’t matter to the consumer why a product is better or worse than another product. What matters is the current status and sometimes the value (the price-to-quality ratio). Independent reviews and benchmarks can help developers improve their products, and user feedback should be the most important consideration for any developer.
As in my previous test, I started with some command-line tools.
The first was ClamAV. This time I changed the test folder to include a wider variety of file types and file sizes, and this actually changed the outcome of the test. I discovered that ClamAV’s speed is very sensitive to the scanned content, so it is very difficult to get a conclusive result. In this test, though, the Windows version of ClamAV was slightly faster than its Ubuntu cousin.
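For readers who want to try this at home, a simple way to time a recursive scan is shown below. This is a sketch, not the exact command used in the article; it assumes clamscan is on the PATH, and the scanned folder here is a throwaway temp directory rather than my test set.

```shell
#!/bin/sh
# Hypothetical timing harness for a ClamAV scan; the scanned folder is
# a throwaway temp directory standing in for the real test folder.
TESTDIR="${1:-$(mktemp -d)}"
echo "harmless sample" > "$TESTDIR/sample.txt"

if command -v clamscan >/dev/null 2>&1; then
    # -r recurses into subdirectories; --quiet keeps per-file output
    # from dominating the measurement.
    time clamscan -r --quiet "$TESTDIR"
else
    echo "clamscan not installed; skipping"
fi
```

Running the scan twice and keeping the second timing also helps, since the first run pays the cost of loading the virus database from disk.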
I also changed the RAR test folder. I included more file types and sizes and some deep directory structures. However, the results were largely the same: Windows XP SP3 was faster in compression, while Ubuntu 8.04 LTS was consistently better in decompression. Later, I discovered that the 7-Zip compression utility was nearly twice as fast as RAR on Ubuntu and only slightly faster than RAR on Windows. It made me think that, for some reason, RAR might be using only one core of my dual-core Athlon in Ubuntu while using both cores in Windows XP. I monitored my CPU and I was right! I thought it might be because I used a trial version of RAR, but I was using a trial version on Windows XP too. I read the help file and there was nothing about such a limitation. Considering this new information, it is possible to say that RAR would be much faster on Ubuntu if it had used both cores; in other words, RAR compression was faster on Windows XP, but the speed per core was in Ubuntu’s favour.
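Instead of watching a CPU monitor, the same single-core-versus-dual-core check can be read off the output of time itself: on a dual-core machine, if the reported "user" time is close to twice the "real" (wall-clock) time, both cores were busy; if user and real are about equal, only one core did the work. A sketch, where the rar invocation and folder are placeholders:

```shell
#!/bin/sh
# Infer core usage from time(1) output: on a dual-core CPU,
# user ≈ 2 × real means both cores were used; user ≈ real means one.
# The rar command and the folder contents are placeholders.
FOLDER="${1:-$(mktemp -d)}"
echo "sample data" > "$FOLDER/sample.txt"

if command -v rar >/dev/null 2>&1; then
    # -inul suppresses RAR's progress messages during timing.
    time rar a -inul test.rar "$FOLDER"
else
    echo "rar not installed; skipping"
fi
```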
Some readers suggested including p7zip as an open-source replacement for RAR. I used the same folder from the RAR test, which also made it helpful for comparing p7zip to RAR. The test showed that the Windows XP and Ubuntu versions of p7zip were very close in performance; against RAR, p7zip was faster in compression and tangibly slower in decompression. I also verified that it used both processor cores.
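For completeness, the p7zip runs can be timed the same way. Again a sketch under assumptions: the 7za binary (part of the p7zip package), the archive name and the folder are all placeholders.

```shell
#!/bin/sh
# Timing sketch for the p7zip test; 7za, the archive name and the
# folder are placeholders, not the article's actual test set.
FOLDER="${1:-$(mktemp -d)}"
echo "sample data" > "$FOLDER/sample.txt"

if command -v 7za >/dev/null 2>&1; then
    time 7za a -bd test.7z "$FOLDER"          # compression (-bd: no progress bar)
    time 7za x -bd -y -o./p7zip-out test.7z   # decompression into ./p7zip-out
else
    echo "7za not installed; skipping"
fi
```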
Next was the GIMP. I did the very same test again and the result was almost the same. GIMP was always faster on Linux, and the difference became massive when the script included difficult selections or required excessive screen re-drawing. It was also noteworthy that both Windows XP and Ubuntu 8.04 were slightly faster than in the original test, perhaps because of the updated kernel in Ubuntu and the updated nVIDIA driver in Windows XP.
I also used the same sample files for Blender (in addition to a new one), but I used the latest version (2.46) on both systems. I downloaded the binaries from the official website and did not use the ones in the Ubuntu repository. Some comments mentioned that the new version of Blender was faster than the older one, and yes it was. Actually, I was very impressed with the huge improvement in rendering times on both Windows XP and Linux. The Ubuntu-supplied version of Blender 2.45 was 20% slower than its Windows counterpart, but the new 2.46 official versions were closer in this test, and Windows XP’s advantage was cut to less than 10%.
Then I moved to digital media tests, starting with the usual Avidemux.
The encoding process priority was set to “above normal” and the sample files were the same as in my previous attempt. The results were very similar too. Windows XP SP3 easily won, with just about the same 20% speed margin over Ubuntu 8.04. For some reason, MJPEG encoding to x264 was slightly slower on Ubuntu than in my original test, even after 5 additional runs, while converting a standard-definition DV stream to x264 was somewhat faster.
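On Linux, there is no exact equivalent of Windows’ “above normal” class; the closest approximation is a small negative niceness, which requires root. A sketch of how one might launch the job (the encode-job.sh script is a placeholder for the actual Avidemux job; on Windows the rough equivalent is `start /abovenormal <program>`):

```shell
#!/bin/sh
# Sketch: raise a job's priority on Linux. "Above normal" roughly maps
# to a small negative niceness; negative values require root.
# encode-job.sh is a placeholder, not a real Avidemux command.
JOB="${1:-./encode-job.sh}"

if [ "$(id -u)" -eq 0 ]; then
    nice -n -5 "$JOB"
else
    echo "not root; would run: nice -n -5 $JOB"
fi
```

This mismatch between the two priority models matters later, in the multi-tasking results.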
In the LAME MP3 benchmarks, encoding a PCM WAV stream to high-quality MP3 was faster on Windows XP, as was down-sampling a high-quality MP3 file to CD-quality audio. On the other hand, Ubuntu was the clear winner in decoding MP3 files to WAV format: LAME decoding was almost 3 times faster on Ubuntu than on Windows XP.
LAME is a built-in component of much MP3-capable software, including Windows Media Player, iTunes and jetAudio, in addition to almost all MP3-player management programs.
I wanted to do my test with a newer version of LAME. I downloaded LAME 3.98b8 binaries for Windows but I couldn’t find any for Ubuntu. I tried to compile my own using the sources, but I failed to make the ASM optimisations work. I tried everything that I knew, but to no avail. I later learned that it was a known bug in LAME 3.98 beta8 on both Linux and Mac OS, so I gave up and went back to LAME 3.97.
AAC is an audio format that was intended to replace MP3. It is the standard audio format in the iTunes store and on iPods. AAC is capable of much better sound quality per kilobyte, and it is growing fast in popularity. FAAC is a command-line AAC encoder very similar to LAME (it is also embedded in other software). In this test, it was slightly slower than LAME at the same audio quality. FAAC encoding speed was almost identical on Windows XP and Ubuntu.
Also new in this test was the OpenOffice.org 2.4 suite, the excellent free replacement for Microsoft Office. OpenOffice.org is the perfect example of free-and-good software for open-source advocates. It does most of what MS Office does, and it has been widely adopted on Windows and Mac OS in addition to its usual Linux stronghold.
There are plenty of ways to benchmark OpenOffice.org, but I only did 4 simple things: load the OpenOffice.org programs, open some files, export them as PDF, and convert some MS Office files into native OpenOffice.org formats. I was surprised how evenly matched Windows XP and Ubuntu 8.04 were in this test. OpenOffice.org performed almost the same on both, with only one exception: cold start-up. When loading OpenOffice.org for the first time after a reboot, Ubuntu would load it much faster. I tried it with Writer and Calc and the result was the same. However, after that first load, Windows XP would load it again just as fast as Ubuntu. I should also mention that, for this test, the timer resolution was 0.1 s, so it was not very accurate for such short times, but I simply thought that any smaller difference wouldn’t mean a lot to normal users.
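For anyone reproducing the cold-start observation on Linux without rebooting between runs, the kernel’s page cache can be flushed via /proc/sys/vm/drop_caches (root required). This is my own suggestion rather than the method used in the article; soffice is the OpenOffice.org launcher, and test.odt is a placeholder document.

```shell
#!/bin/sh
# Sketch: approximate a "cold" start without rebooting by flushing the
# Linux page cache first (needs root). test.odt is a placeholder file.
DOC="${1:-test.odt}"

if [ "$(id -u)" -eq 0 ] && [ -w /proc/sys/vm/drop_caches ]; then
    sync
    echo 3 > /proc/sys/vm/drop_caches   # drop page cache, dentries, inodes
    time soffice -writer "$DOC"         # "cold" load
    time soffice -writer "$DOC"         # "hot" load for comparison
else
    echo "needs root on Linux; skipping cache drop"
fi
```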
In Ubuntu 8.04, OpenOffice.org Writer was a little faster at converting MS .doc files into .odt and at exporting PDF files than it was in Windows XP, but opening a long document (a few hundred pages) was always faster in Windows XP.
OpenOffice.org Calc was slightly faster in Windows XP, but I couldn’t find any difficult-to-open files to make the test harder. The only note here was that if a spreadsheet had unrecognised formats or even just some heavy formatting, it would open much faster in Windows XP than in Ubuntu 8.04, especially for the first time.
On the other hand, OpenOffice.org Impress (MS PowerPoint counterpart) was faster in Ubuntu, and the difference was somewhat clear in converting MS PowerPoint slideshows into Impress’s native format (ODP).
The last individual test was Firefox 3.0 RC1. I got the RC1 Windows binaries from the official website, while on Ubuntu 8.04 I updated Firefox 3.0 beta5 using an unofficial PPA repository.
In the loading-time benchmark, Ubuntu again was clearly faster than Windows XP for the first start-up after boot, and they were again tied in the hot start-up test.
WebKit’s SunSpider benchmark gave Windows XP a slender 4% advantage over Ubuntu 8.04 using the very same Firefox 3.0 RC1. Both runs reported a 2-2.5% margin of error, which makes the difference even less significant.
Finally, the time came for the multi-tasking test. As always, I like to stress the importance of multi-tasking benchmarks because they come closest to representing real life.
Multi-tasking is inevitable, and it goes on all the time on every computer and every OS. Even my previous benchmarks were actually run in a multi-tasking environment, since several system processes and services were running in the background. There is no pure mono-tasking; the real issue is the size and level of the multi-tasking workload. In the benchmarks above, I tried to minimise the multi-tasking impact on my tests by closing all other applications and turning off all unnecessary system services.
In my previous article, I wrote a script to start all the tests at the same time. That was very intensive multi-tasking and practically impossible for users to reproduce. Therefore, I decided to lighten the test by creating 6 batches of work. All would start at the same time, and each batch consisted of one or more tasks. For example, instead of having LAME encode and decode at the same time, it would encode first, then decode, then transcode the MP3 file into a lower-quality format; and rather than telling RAR and UnRAR to start simultaneously, one would wait for the other to finish before starting.
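The batching idea can be sketched in a few lines of shell: tasks inside a batch run one after another, while the batches themselves run in parallel. Here sleep commands stand in for the real tasks (LAME, RAR and so on) so the sketch is runnable anywhere; this is the shape of the script, not my actual test script.

```shell
#!/bin/sh
# Sketch of the batching scheme: sequential tasks inside each batch,
# batches launched in parallel. sleep stands in for the real workloads.
batch_audio() {
    sleep 1   # encode WAV -> MP3
    sleep 1   # decode MP3 -> WAV
    sleep 1   # transcode MP3 -> lower-quality MP3
}
batch_archive() {
    sleep 1   # RAR compression
    sleep 1   # UnRAR decompression
}

start=$(date +%s)
batch_audio &
batch_archive &
wait                                   # block until every batch finishes
elapsed=$(( $(date +%s) - start ))
echo "all batches done in ${elapsed}s"
```

Because the two batches overlap, the total wall-clock time is set by the longest batch, not by the sum of all tasks.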
This graph shows the multi-tasking impact on test times. One can see the percentage increase in each task’s duration compared to the single-task benchmark. Each of my 6 batches was run separately, and the times were recorded and then compared to the durations reported in the multi-tasking test. Only Avidemux was run at a higher priority in all tests, and it showed up very clearly here as the least affected test. It is also important to note that while Windows XP gave Avidemux most of the CPU time and forced all other tasks to wait, Ubuntu seemed less biased and nearly ignored Avidemux’s “above normal” priority setting. Other than that, both Windows XP and Ubuntu 8.04 suffered under the multi-tasking pressure, especially in disk-intensive tasks.
The overall result of the multi-tasking test was, once again, a big win for Ubuntu. Windows XP showed a positive response to the lighter workload, as it was ‘only’ 5 times slower (it was 7 times slower with my previous script), but it ultimately lost to Ubuntu 8.04, which saw only a 350% increase in test times compared to the single-task benchmarks. Despite winning this round, Ubuntu 8.04’s results were a bit worse than in the original test, which may reflect some changes in the test setup. I switched the EXT3 mount options back to the default “data=ordered, relatime” mode instead of “data=writeback, noatime”. I knew for sure that it would cost Ubuntu a few points in I/O tests, but I had to respect the “out-of-the-box” test conditions. The changes in the ClamAV and RAR test folders could also have affected the results.
Looking at the details, it is clear that the Windows XP kernel scheduler respected process priority absolutely. The Avidemux batch was barely slower than in the single benchmark, and that was the main reason why the RAR and LAME batches were held up for so long. Disk-intensive tasks like RAR (decompression) and ClamAV suffered the most, and GIMP almost suffocated under the pressure on the GUI, struggling to get any CPU time.
On the Ubuntu Linux front, the multi-tasking impact on the various tests was more even. GIMP and ClamAV were relatively the biggest losers, even though they had the shortest durations in absolute numbers. Blender was the last to finish, but compared to the single-task test it was the second-best performer. Linux definitely handles a multi-tasking workload differently from Windows. I don’t have the knowledge to say whether it is better or worse, but, looking at the test results, Linux’s approach seemed smarter and more efficient.
My last benchmark was not an application. It was a simple read/write test from/to a USB flash key. It is one of the most common daily tasks, with all those camera memory cards, USB keys and mobile phone cards around. I prepared a test folder of 1GB of files and folders that varied from 1KB to 500MB in size. I also made sure to include a large number of files, because I already knew that file-copy performance generally depends on the number of items as well as their size.
Reading from my Sony Microvault 4GB USB2 stick was practically the same on both Windows XP and Ubuntu 8.04. However, copying files onto the stick was considerably faster on Ubuntu. The test was done without any real-time virus scanner installed, which means that, in the real world, Windows XP would be even slower, since Ubuntu doesn’t need any on-access anti-virus agents.
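One detail worth noting for anyone repeating the flash-drive test: without a sync at the end, the timer mostly measures copying into the OS write cache, not onto the device. A sketch of an honest write timing follows; the mount point of a real USB stick would replace the temp directory used here so the sketch runs anywhere.

```shell
#!/bin/sh
# Sketch of the flash-drive write test. sync forces buffered data to
# the device, so the timing reflects the real transfer. A temp dir
# stands in for the USB stick's mount point (e.g. /media/usbkey).
SRC=$(mktemp -d); DST="${1:-$(mktemp -d)}"
dd if=/dev/zero of="$SRC/sample.bin" bs=1024 count=256 2>/dev/null

start=$(date +%s)
cp -r "$SRC"/. "$DST"/
sync                                   # flush write buffers before stopping the clock
elapsed=$(( $(date +%s) - start ))
echo "copied in ${elapsed}s"
```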
That’s the end of my test. I won’t rush to any conclusions, and I will let the blog readers judge for themselves. I can only reiterate that your contribution is welcome and much appreciated. I am really looking forward to your comments, which I am sure will enrich my article and help turn this benchmark into something useful.
- CPU: AMD Athlon64 X2 5600+
- Board: MSI K9N Neo V3 nForce560
- RAM: 2GB Dual Channel DDR2-6400 (800)
- HDD: Western Digital, WD1600JD, Capacity: 160GB, Cache: 8 MB, SATA150, 7200rpm (Dual boot, Windows XP was on the first primary partition, Ubuntu’s EXT3 partition was the second).
- VGA: GeForce 8600GT
- Windows XP Professional SP2 (32bit) with SP3 on top and all updates up to 01-June-2008
- Ubuntu 8.04 LTS 32bit (Hardy Heron) with all security and recommended updates up to 01-June-2008
Test technical details:
- ClamAV:
- Engine version: 0.92.1
- Known viruses: 301908
- Single Task:
- Scanned directories: 187
- Scanned files: 1638
- Data scanned: 3518 MB
- Multi Task:
- Scanned directories: 14
- Scanned files: 171
- Data scanned: 1387.55 MB
- Avidemux 2.4.1
- Test1: Converting 03:17 DV (MPEG2) clip to MP4 (x264 (2-pass, 1024kbps) /AAC)
- Test2: Converting 05:32 MJPEG clip to MP4 (x264 (2-pass, 1024kbps) /AAC)
- LAME 3.97
- Encoding Test: 44:58 495MB WAV to VBR MP3
- lame -V 0 -m j -q 0 -s 48 --vbr-new 1.wav 1.mp3
- Decoding Test: 44:58 72MB MP3 256Kbps VBR to WAV
- lame --decode b.mp3 decode.wav
- Transcoding Test: 44:58 72MB MP3 256Kbps VBR to standard MP3
- lame -h b.mp3 b2.mp3
- FAAC 0.26.1
- Encoding 44:58 495MB WAV to Standard m4a stream
- faac -o 1.m4a 1.wav
- RAR 3.71
- Compression: Compressing 172 folders, 1469 files, totaling 660 MB into a RAR archive.
- Decompression: UnRAR the result archive from the RAR compression test.
- p7zip 4.57
- Compression: Compressing 172 folders, 1469 files, totaling 660 MB into a 7z archive.
- Decompression: Extracting the result archive from the p7zip compression test.
- Blender 2.46:
- Single Task: Rendering 25 frames of each of the sample files.
- Multi Task: Rendering 25 frames of sample file FPSTemplateLightMap.blend
- GIMP 2.4.5:
- Source Image: 2816×2112 JPEG image.
- Single Task: Applying these filters with the default settings:
- Filters>render>nature>IFS Fractal
- Filters>render>Line Nova
- Multi-Task: Applying this filter with the default settings:
- Filters>render>nature>IFS Fractal
- OpenOffice.org 2.4 Writer:
- Test file: Thinking in C#, 798 pages, 3.31 MB in size.
- OpenOffice.org 2.4 Calc:
- 8 files, 4 MB in size.
- OpenOffice.org 2.4 Impress:
- 2 files, 85MB in size.
- USB2 I/O test:
- Copied Folder: 620 files, 43 folders, 1GB in size
- USB Stick: Sony Microvault 4GB (model: USM4GJ) USB2.
- Tests were run on fresh installs of Ubuntu 8.04 and Windows XP SP2 with SP3 on top, with only the latest updates, media codecs, hardware drivers and test programs installed.
- Desktop effects on both Windows XP and Ubuntu were OFF.
- nVIDIA-supplied VGA drivers on both (v175.16 for Windows XP and v173.14 on Ubuntu)
- Disabled Swapping on Ubuntu.
- Windows XP caching optimisations enabled (through the TweakNow PowerPack utility).
- Disabled the last-access time stamp on the NTFS hard drive.
- All tests read from/wrote to the same hard disk partition: EXT3 for Linux and NTFS for Windows.
- Each test was repeated 3 times and the average time was calculated.
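The repeat-and-average procedure can be scripted in a few lines; in this sketch, sleep stands in for whichever benchmark command is being measured.

```shell
#!/bin/sh
# Sketch of "run 3 times and average". sleep 1 is a placeholder
# for the benchmarked command.
total=0
for i in 1 2 3; do
    t0=$(date +%s)
    sleep 1                      # <- benchmark command goes here
    t1=$(date +%s)
    total=$(( total + t1 - t0 ))
done
avg=$(( total / 3 ))
echo "average: ${avg}s"
```

Note that integer seconds are too coarse for the short OpenOffice.org timings; a stopwatch with 0.1 s resolution, as used in the article, is the finer option there.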