GNOME's Window Rendering Culling Was Broken Leading To Wasted Performance

• #41
  Every time I try out GNOME, I keep trying to make it look like Plasma... lmao, I just can't NOT hate tablet desktop designs.

• #42
  Originally posted by theriddick:
    Every time I try out GNOME, I keep trying to make it look like Plasma... lmao, I just can't NOT hate tablet desktop designs.

  I just can't use a distracting, non-simplistic look... I need to focus on work, and a fancy look just distracts me from it. KDE before version 4 was usable too, but since version 4 it's all way too fancy, too distracting.

• #43
  Originally posted by mos87:
    Tell this to us in 2001, gasping at Windows XP's whopping 128 MB absolute minimum RAM requirement.
    Yeah, the same WinXP that could draw fade-ins/outs, shadows, and a bunch of other stuff to boot.
    Not many had something called "a 3D accelerator" installed back then, let alone heard of such a thing as "compositing".
    Pretty mind-boggling, eh?

  Actually, in 2001, most people had a 3D accelerator of some sort, even if it was just a typical S3 with 2MB of VRAM. Your memory fails you. Windows XP required an SVGA GPU, and those included many 2D hardware functions. And the Windows XP flashy effects, even though not 3D, were very expensive for the day. And weren't that impressive. Yes, 2D has always been expensive.

  It is just that by 2020 hardware had become much more powerful thanks to Moore's Law, while screen resolutions didn't increase in the same way. We went from a typical 1024x768 to 1920x1080 and now to 3840x2160. That is about ten times the pixels at most, when we are talking 4K, while CPUs, GPUs, and RAM were upgraded far more than tenfold in the last 20 years. That is why 2D does not seem "expensive" these days, and why GPU vendors removed much of their dedicated 2D functionality.

  Still, modern GUIs are inefficient, because they don't fully exploit the 3D pipeline to render the desktop. They should be using the GPU to actually draw everything, not just for compositing. In fact, compositing shouldn't exist at all: "compositing" means taking 2D windows and compositing them onto a 3D-accelerated screen, mixing 2D graphics with 3D graphics. We should advance to processing literally everything on the GPU; the OS should handle the screen like a 3D game pipeline and just send rendering instructions to the GPU. I don't think we are doing that now. Hell, even just ditching X11 has taken ages.
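
  For what it's worth, the culling in the headline is exactly this kind of cheap win: skip painting any window that sits entirely beneath an opaque window above it. Here is a minimal sketch of that idea in C (the Win struct and the rectangle-only containment test are made up for illustration; a real compositor like Mutter tracks full opaque regions, not single rectangles):

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical window record; real compositors keep far more state. */
    typedef struct {
        int x, y, w, h;   /* position and size on screen */
        bool opaque;      /* no alpha anywhere in the window */
        const char *name;
    } Win;

    /* True if rectangle a lies entirely inside rectangle b. */
    static bool contained(const Win *a, const Win *b)
    {
        return a->x >= b->x && a->y >= b->y &&
               a->x + a->w <= b->x + b->w &&
               a->y + a->h <= b->y + b->h;
    }

    /* Walk the stack from topmost (index 0) down: any window fully
     * covered by an opaque window above it is skipped outright, so it
     * costs the GPU nothing. */
    static void paint_stack(const Win *stack, int n)
    {
        for (int i = 0; i < n; i++) {
            bool culled = false;
            for (int j = 0; j < i; j++) {
                if (stack[j].opaque && contained(&stack[i], &stack[j])) {
                    culled = true;
                    break;
                }
            }
            printf("%-9s %s\n", stack[i].name, culled ? "culled" : "painted");
        }
    }

    int main(void)
    {
        /* Topmost first. A fullscreen opaque window at 3840x2160 hides
         * 8,294,400 pixels per window below it -- about 10.5x the
         * 786,432 pixels of a 1024x768 screen. */
        const Win stack[] = {
            { 0, 0, 3840, 2160, true, "browser" },
            { 100, 100, 800, 600, true, "terminal" },
            { 0, 0, 3840, 2160, true, "wallpaper" },
        };
        paint_stack(stack, (int)(sizeof stack / sizeof stack[0]));
        return 0;
    }

  With a maximized opaque browser on top, the terminal and the wallpaper never get painted at all. When that test is broken, every one of those hidden pixels gets shaded and composited for nothing, which is the "wasted performance" in the headline.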

• #44
  Okay, well, the shit is real. I've built Mutter and GNOME Shell from the gnome-3-36 branch and it works better than Windows 10 did on my X1C6 with a 4K external display (especially with multiple Firefox windows open).

• #45
  Originally posted by TemplarGR:
    Actually, in 2001, most people had a 3D accelerator of some sort, even if it was just a typical S3 with 2MB of VRAM.

  Yes, I was going to say the same thing... 3D accelerator cards first started appearing in the mid-90s, from names like 3dfx and Matrox, and a little later ATI and NVIDIA. They were luxuries for gamers to start with, of course, but by the time Windows XP came along, some form of hardware acceleration was pretty much standard on any PC...

• #46
  Originally posted by Delgarde:
    Yes, I was going to say the same thing... 3D accelerator cards first started appearing in the mid-90s, from names like 3dfx and Matrox, and a little later ATI and NVIDIA. They were luxuries for gamers to start with, of course, but by the time Windows XP came along, some form of hardware acceleration was pretty much standard on any PC...

  I might still have a PCI Voodoo3 2000 around here somewhere... Maybe time to try it out again... ;-p

• #47
  Originally posted by kravemir:
    Budgie is buggy, at least it was the last time I tried it (half a year ago)... However, Budgie goes for a more elegant, simplistic design than GNOME 3. So, if it weren't buggy, I would have stayed with it.

  It doesn't feel that buggy to me; a couple of things here and there, but barely more than GNOME. Although we might have different workflows, of course.

  What I like is that you can configure it in several different ways to fit your workflow, because we all have a different one.

  And unlike GNOME, which assumes you are a dimwit and decides what's supposedly good for you, Budgie empowers you to decide on many things; it gives you responsibility, and it's up to you to adapt it to your workflow (within the limits of its small user base and the developer pool that comes with it). Also, you can decide where to put your applets on each panel, a very simple thing that should be the default in GNOME. They acknowledge the variety of use cases and are flexible enough to offer a solution for a decent number of them, and that's a good thing in my book.

• #48
  Originally posted by intelfx:
    Okay, well, the shit is real. I've built Mutter and GNOME Shell from the gnome-3-36 branch and it works better than Windows 10 did on my X1C6 with a 4K external display (especially with multiple Firefox windows open).

  About time GNOME won at something.

• #49
  Is the UI rendering capped at 60 FPS? Because 60 is the new 30 these days.

• #50
  Just imagine if Daniel van Vugt had bought a 4K laptop two years ago instead of about a month ago.
