KDE's KWin Compositor Sees Near-Total Rewrite Of Compositing Code
I've read the comments on that merge request, and the latest was "it's a good idea to look into mutter or kwin-lowlatency to see how they avoided stuttering". So no changes for me; I will continue using kwin-lowlatency.
You know, I've never had a single issue with KWin, if I'm honest. Is it just certain hardware that does? I recall that when I had a GTX 1080, the desktop experience was that of FGLRX in the mid-2000s.
So why not just write it properly the first time? People say the Linux world is full of duplicate work. This is the duplication of duplicate work.
Why didn't Ethereum introduce proof of stake the first time around, way back when?
Why didn't KDE release KDE 4 at 4.10, when it really got good, rather than waste our time with all the pre-.10 releases?
Why did Elon bother making those piddly Falcon rockets when his engineers could have begun stacking Starship/Super Heavy sections in Boca Chica years earlier?
So why not just write it properly the first time? People say the Linux world is full of duplicate work.
The concept of duplicate work is true in some contexts; the various desktop/GUI systems can be an example here. However, this has nothing to do with improving an existing code base.
This is the duplication of duplicate work.
Nope! It is taking an existing bit of code and improving it. There is a huge difference here. There is almost always room for improvement in non-trivial software.
My system runs code from a different camp (Fedora/GNOME), which could be considered duplicate effort. However, all the effort that goes into my distro every year is not duplicate effort. It is pretty much standard operating procedure in the world of software engineering. The key thing to grasp here is that there is always room for improvement.
So why not just write it properly the first time? People say the Linux world is full of duplicate work. This is the duplication of duplicate work.
Because originally it was based on a single-threaded X11 architecture, which is completely different from how both Wayland and modern hardware work (i.e. almost all modern desktops/laptops are dual-core at a bare minimum now).
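To make the single-thread point concrete, here is a minimal, generic C++ sketch (not KWin's actual code) of the difference it implies: when event handling and frame rendering share one thread, a slow frame stalls everything, whereas handing the rendering work to a dedicated worker thread keeps the event loop responsive on a multi-core machine. The frame queue and the sleep durations are placeholders for illustration only.
Code:
#include <chrono>
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

int main() {
    std::queue<int> frames;        // pending frame numbers to "render"
    std::mutex m;
    std::condition_variable cv;
    bool done = false;

    // Worker thread: simulates expensive compositing work off the main thread.
    std::thread renderer([&] {
        for (;;) {
            std::unique_lock<std::mutex> lock(m);
            cv.wait(lock, [&] { return !frames.empty() || done; });
            if (frames.empty()) return;          // done and nothing left to draw
            int frame = frames.front();
            frames.pop();
            lock.unlock();
            // Stand-in for the actual rendering / GPU submission work.
            std::this_thread::sleep_for(std::chrono::milliseconds(5));
            std::cout << "rendered frame " << frame << '\n';
        }
    });

    // "Main" thread: stays free to handle input while frames render elsewhere.
    for (int frame = 0; frame < 10; ++frame) {
        {
            std::lock_guard<std::mutex> lock(m);
            frames.push(frame);
        }
        cv.notify_one();
        // Stand-in for processing input events, IPC, etc.
        std::this_thread::sleep_for(std::chrono::milliseconds(2));
    }
    {
        std::lock_guard<std::mutex> lock(m);
        done = true;
    }
    cv.notify_one();
    renderer.join();
    return 0;
}
In a single-threaded design, the two sleep calls would have to run back to back on the same thread, so any slow frame would also delay input handling; that is the kind of constraint the comment above is pointing at.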
Two more things, and the next release could be near perfect for me...
1. Right-clicking on ISO files should also show a "compress" menu entry, not only "extract" as it does now.
2. Make Ark very fast when opening big archive files (several gigabytes, thousands of files). It seems that every time you open such an archive to see what's inside, it decompresses it just to show you the basic structure, even though listing the contents is very fast from the command line (see the sketch after this list).
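For reference, a minimal sketch of why a plain listing can be fast: it only walks the archive headers and skips over the file data instead of extracting it. The sketch uses libarchive (one of the backends Ark can use); the archive path is just a placeholder, and this illustrates the general approach, not Ark's actual code.
Code:
#include <archive.h>
#include <archive_entry.h>
#include <cstdio>

int main() {
    struct archive *a = archive_read_new();
    archive_read_support_filter_all(a);   // gzip, xz, zstd, ...
    archive_read_support_format_all(a);   // tar, zip, iso9660, ...

    // Placeholder path; any large archive is handled the same way.
    if (archive_read_open_filename(a, "big-archive.tar.gz", 10240) != ARCHIVE_OK) {
        std::fprintf(stderr, "open failed: %s\n", archive_error_string(a));
        archive_read_free(a);
        return 1;
    }

    struct archive_entry *entry;
    while (archive_read_next_header(a, &entry) == ARCHIVE_OK) {
        std::printf("%s\n", archive_entry_pathname(entry));   // just the name
        archive_read_data_skip(a);   // jump past the data, never extract it
    }

    archive_read_free(a);
    return 0;
}
For compressed tar streams the data still has to be decompressed in memory to find the next header, but nothing is written out, which is why a command-line listing finishes so much faster than a full extraction.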
Meanwhile, the news about the next release is very, very exciting; this is a release I am looking forward to. Thank you very much!
So you refused to do the work and now you're complaining that the other guys didn't get it right the first time. Maybe you should be thankful instead.
Why do people on this forum always bring up the nonsensical argument of "do the programming yourself"? It's obvious that you guys understand so little about computers that there's zero chance you're even Linux users.
Now this is the kind of stuff that makes me tempted to give KDE another try. I remember having a lot of issues with its compositor in the past; it was one of the key reasons I could never stick with it.