Is the Linux Desktop getting slower and more bloated?

In his "failure of logic" post, K. Mandla writes that the Linux Desktop is not getting any faster even though all the hardware around it is, pointing to an article that made the same argument 10 years ago. The ensuing discussion was much too abstract for my taste. ("Why are clouds?" "What clouds, where?") But if you split up the question, you get answers:

1) Application startup
2) Applications processing data
3) Linux boot
4) Desktop "boot"
5) System responsiveness
6) Power efficiency

1) This is mostly due to hardware bottlenecks and to applications pulling in more and more libraries as they grow in size and complexity.
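
To make the library part of that less abstract, here is a rough sketch (mine, not from the original discussion; the binary paths are just examples): ldd lists the shared libraries a binary drags in at startup, so counting its output lines gives a feel for how much has to be loaded and linked before the first window appears.

```python
# Rough sketch: count how many shared libraries a binary links against, via ldd.
# The binary paths are only examples; point it at whatever you have installed.
import subprocess

def shared_lib_count(binary):
    result = subprocess.run(["ldd", binary], capture_output=True, text=True)
    # ldd prints one dependency per line.
    return sum(1 for line in result.stdout.splitlines() if line.strip())

for binary in ["/bin/ls", "/usr/bin/gedit"]:
    print(binary, shared_lib_count(binary))
```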

2) This is actually not the case. Video or audio encoding jobs from back then would be a laugh on a modern computer, and I bet a modern libav/ffmpeg uses less CPU to decode a given file than it did back then.
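
If you want to test that claim on your own files, ffmpeg's -benchmark flag prints the CPU time of a pure decode run; a tiny sketch (the file name sample.avi is just a placeholder):

```python
# Sketch: pure decode benchmark with ffmpeg. "-f null -" throws the decoded
# frames away, "-benchmark" prints CPU/real time when the run finishes.
# "sample.avi" is a placeholder file name.
import subprocess

subprocess.run(["ffmpeg", "-benchmark", "-i", "sample.avi", "-f", "null", "-"])
```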

3) Linux boot times have actually improved significantly since then, I'm pretty sure. I don't remember a 10-second boot even being discussed back then.

4) Desktop environments keep getting larger, so point 1) applies even more here. Init systems like Upstart, systemd etc. make a lot of difference here.
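
For numbers instead of a feeling, systemd ships a small tool for exactly this; a sketch of how I would use it (it only applies if your distribution already boots with systemd):

```python
# Sketch: show where the boot time goes on a systemd-based system.
import subprocess

subprocess.run(["systemd-analyze"])           # total kernel + userspace boot time
subprocess.run(["systemd-analyze", "blame"])  # per-service startup times, slowest first
```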

5) System responsiveness under Linux improved a *lot*. Especially since the new schedulers and the 1000 Hz tick were introduced, the big kernel lock was removed and the "wonderpatch" was applied, everything became a lot smoother. Linux 2.6 is much snappier than 2.4; compare the two. I still remember getting a new computer with a CPU twice as fast. Some time later I upgraded the Linux installation on the old machine, and suddenly I could hardly feel the speed difference any more: applications started about equally fast, because the old computer had become much faster than before. I was thinking to myself: OK, why did I get that new PC again? If I could have, I would have returned it.

That said, there is a slow, gradual downward trend in some areas; check the Phoronix kernel benchmarks.

6) Power efficiency is also important to keep in mind. 10 years ago people didn't care too much about it except in notebooks. Nowadays CPUs, GPUs and other components like sound cards and network cards support all kinds of low-power modes. Of course this means that some raw performance is traded for efficiency. Just check my power management tag for more.
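
You can see this trade-off directly in sysfs; a small sketch (the paths are the standard cpufreq locations and only exist if the kernel's cpufreq support is enabled):

```python
# Sketch: print the frequency scaling governor of each CPU from sysfs.
# These files only exist if cpufreq support is enabled in the kernel.
import glob

for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor")):
    cpu = path.split("/")[5]                # e.g. "cpu0"
    with open(path) as f:
        print(cpu, f.read().strip())        # e.g. "cpu0 ondemand"
```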

But...
yes, e.g. Android shows that with minimalism a lot is possible, even in a virtual runtime environment. In my programming experience I found that it is often really better to "reinvent the wheel" (if it's a small one) and parse a few very simple blocks of XML yourself instead of using a library for it. It's smaller, leaner, less code and faster. I think library dependencies are a big issue: libraries are used by tons of programs and thus carry tons of code to cover all cases. You are often better off just taking what you need. Why are they open source in the first place? (But as always, watch your licenses for compatibility!)
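
As a toy example of what I mean (a sketch with made-up tag names, not code from any real project): for a known, trivially simple snippet, a few lines of string handling replace a whole XML library.

```python
# Sketch: pull values out of a known, very simple XML snippet without any XML library.
# The tag names and the config snippet are made up for illustration.

def extract_tag(text, tag):
    """Return the text between <tag> and </tag>, or None if the tag is missing."""
    start = text.find("<%s>" % tag)
    if start == -1:
        return None
    start += len(tag) + 2                      # skip past "<tag>"
    end = text.find("</%s>" % tag, start)
    return text[start:end] if end != -1 else None

config = "<settings><host>localhost</host><port>8080</port></settings>"
print(extract_tag(config, "host"))   # -> localhost
print(extract_tag(config, "port"))   # -> 8080
```

Of course this only works as long as the input really stays that simple; the moment attributes, escaping or nesting show up, the library starts earning its size.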

Physics
And in some areas physics just isn't changing quickly enough for us. An FM signal will always take a certain time to travel a certain distance, etc. New technologies of that kind are often more difficult to develop and take more time, but they are required for an increase in speed. This is a major issue with hard disks, though SSDs are now being rolled out as the new technology.

Noticeability
I think a major problem is also noticeability: developers usually get new computers regularly to wait less for compilation, to try out new stuff, etc. The problem is that the faster your hardware, the less you'll notice speed problems. I bet most programmers don't extensively profile their software and won't optimize code without seeing the need. If they all had an old computer and had to use their software on it, I think the situation would be different: they would be more aware, and would see the need and use of optimizing the code for better speed, even though each tiny bit might not be noticeable on modern computers at all. Of course those tiny bits add up and up and then become noticeable, but then there's no single issue left to point at. So in part human perception is to blame as well.
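
Profiling doesn't have to be heavyweight either; a minimal sketch with Python's built-in cProfile (the workload function is a made-up placeholder):

```python
# Sketch: profile a function with the standard-library cProfile and show the
# ten most expensive call paths by cumulative time.
import cProfile
import pstats

def build_report(n=100_000):
    # Placeholder workload standing in for whatever your application really does.
    return sum(str(i).count("9") for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
build_report()
profiler.disable()

pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```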

But all in all, the problem of keeping it "free, even if you don't need all of it or can't afford a pretty fast machine" was recognized and mostly solved by a combination of the hints above, e.g. with XFCE. And if you prefer, you can still use KDE3 or other alternatives. The nice thing is that most old, lean software should run flawlessly on a modern system and kernel.


Update:
Phoronix confirms in numbers that recent Linux kernels actually boot much faster (roughly 30% less time).

If you like this post, share it and subscribe to the RSS feed so you don't miss the next one. In any case, check the related posts section below. (Because maybe I'm just having a really bad day and normally I write much more interesting articles about these subjects! Or maybe you'll only understand what I meant here once you've read all my other posts on the topic. ;) )
