A while ago, I was doing my usual “preaching” about the merits of Linux in general and Ubuntu in particular. I was telling a friend about the modest system requirements of Linux and how efficiently it preserves system resources. My friend is a professional who used to work on powerful workstations, so he quickly interrupted my “sermon” with a simple yet meaningful question:
So Linux does not need a high-spec machine to run, but what if you have got one? Does Linux utilize the extra resources to get the best out of such a machine? Or do the modest requirements mean less efficiency in terms of full-throttle performance?
Of course, I am fully aware of Linux's modularity, and I could easily have answered “yes” if my friend had been asking about data centers or even supercomputers. However, in the desktop PC realm the answer is not so straightforward (at least for me). Desktop Linux distributions are built on a specific incarnation of the Linux kernel, usually patched by each distribution's development team, and the other system components are assembled around that kernel to work on a wide variety of machines. So I truly wondered whether the same setup scales well enough to drive any machine at full throttle.