
GNOME's Mutter Flips On Its New Monitor Config Manager By Default


  • GNOME's Mutter Flips On Its New Monitor Config Manager By Default

    Phoronix: GNOME's Mutter Flips On Its New Monitor Config Manager By Default

    GNOME's Mutter has flipped on its new "Monitor Configuration Manager" by default as it seeks to improve the multi-monitor and multi-GPU experience...

    http://www.phoronix.com/scan.php?pag...Config-Manager

  • #2
    Typo:

    Originally posted by phoronix View Post
    which Rui Matos says is needed in order to allwo for better HiDPI and multi-GPU support.

    Comment


    • #3
      Owning a 4K and a low-res screen, I'm really looking forward to GNOME 3.26's non-integer scaling!

      Comment


      • #4
        > multi-GPU support

        Why would you want to use multiple GPUs for desktop output? Is that intended for something like laptops, where the internal display might use an iGPU and external displays the dGPU? I think I read something about laptops with Pascal and above no longer supporting Optimus or something?

        Comment


        • #5
          Originally posted by debianxfce View Post

          Gnome3 is so shit that probably they run other shit desktop (win virus hoover) in a low level kernel virtualization and that needs 2 gpus.
          I guess Gnome3 is so shit that I currently run it with two [email protected] mixed in with an old Dell [email protected], and it handles all three monitors just fine. The only issues I've really had are with Qt apps (which on my laptop like to try to rescale the fonts and fail miserably, so they come out huge), and the other is power management, which I fault on the two new monitors, since my set of old 1920x1080 monitors worked fine with waking back up after I moved my mouse around.

          These two new ones have to be powered off and powered back on to be detected again. It's like they go to sleep a little too deeply... I need to test that in KDE or XFCE or something to see whether it's Gnome, the monitors, or the nVidia driver at fault.

          The last time I used KDE, the HiDPI support was definitely complete shit.

          As for mixing the iGPU with another card: technically my motherboard (desktop) could run that way, and I'd have 6 video outputs.

          Comment


          • #6
            Originally posted by debianxfce View Post

            Gnome3 is so shit that probably they run other shit desktop (win virus hoover) in a low level kernel virtualization and that needs 2 gpus.
            I guess this kind of nonsense response is to be expected from an Xfce fanboy like yourself... You fanboys are pretty funny in how you always turn useful features into flaws and general negatives just because your beloved competitor lacks said feature.
            "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

            Comment


            • #7
              Originally posted by debianxfce View Post

              Gnome3 is so shit that users do not know what is the difference between gpu monitor outputs and separate gpus.
              You are so shit that you keep shitting shit. Go back to your "precious" shitty Xfce already instead of bashing other DE's all of the time.

              Comment


              • #8
                Originally posted by debianxfce View Post
                Gnome3 is so shit that probably they run other shit desktop (win virus hoover) in a low level kernel virtualization and that needs 2 gpus.
                Uhh? Are you referring to hardware passthrough like what's talked about on r/VFIO? This would have nothing to do with that, and you don't need 2 GPUs for it either. I was asking if the feature in mutter is intended for something like running a single Gnome session on two separate GPUs at once, where both GPUs output a desktop.

                Other than that, I don't quite understand what was meant by multi-GPU support; maybe hybrid graphics switching, since that's something Red Hat has been focusing on with Fedora/Gnome to make a more seamless experience: using a low-power iGPU for display and a headless dGPU for applications/games that need the extra oomph, which is what Bumblebee has been doing with an additional X server in the past.
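                For what it's worth, with Mesa drivers you don't even need Bumblebee for that kind of render offload anymore. A rough sketch, assuming a hybrid iGPU/dGPU machine with open-source drivers (output will obviously depend on your hardware):

                ```shell
                # List the GPUs the X server knows about; on hybrid laptops
                # the provider driving the displays is usually the iGPU
                xrandr --listproviders

                # Ask Mesa to render one application on the discrete GPU while
                # the iGPU keeps scanning out the desktop
                DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
                DRI_PRIME=1 glxgears
                ```

                No second X server involved; the dGPU's frames are copied over to the GPU that owns the outputs.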


                Originally posted by leech View Post
                Only issues I've really had are with Qt apps (which on my laptop like to try to rescale the fonts and fail miserably so they come out huge)

                Definitely the last time i used KDE, the HiDPI was complete shit.

                As for the iGPU mixed with another card. Technically my motherboard (desktop) could run that way, and I'd have 6 video outputs.
                What do you mean, technically? How do you go about that now? As far as I understood, the desktop session was only output to a single GPU and its monitors. I know of things like Bumblebee, but there the dGPU runs a second, headless X server and forwards its rendering to the GPU that outputs the desktop. On a laptop, dGPUs tend not to be wired like on a desktop; their output is routed through the iGPU's framebuffer. See my prior response to the quote above about how the multi-GPU support is probably intended to be used.

                You're not going to use KDE because of a bad HiDPI experience? That's still something being worked out across DEs, isn't it? Sure, Gnome seems to be in the lead at the moment, I guess? Hasn't it for quite some time had an issue with integer-only scaling (1x, 2x, 4x; no 1.5x, 2.3x, etc.)? That's caused issues for quite a few people; I think fractional scaling is arriving in the next release though? It's great that it's working well for you even without that. KDE will no doubt get good support too; as far as I understood it's already there with 5.10, though I'm not sure if it's enabled by default, something to do with settings and waiting on Qt I think.
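                If you want to try the fractional scaling early, it's hidden behind a mutter experimental feature on Wayland; the flag name below is the one currently in mutter's git, so it may still change before release:

                ```shell
                # Enable mutter's experimental non-integer scaling (Wayland only);
                # per-monitor scale factors can then be set in Settings -> Displays
                gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
                ```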

                Qt issues will no doubt be resolved without GTK devs having to do anything about it. On the KDE side of things, GTK apps often have problems integrating into DEs that aren't built primarily around GTK like Gnome is, as the GTK devs are against KDE devs contributing code to support other platforms. That lack of support is unfortunate, since I really prefer KDE over my Gnome experience.


                Comment


                • #9
                  Originally posted by oleid View Post
                  Owning a 4k and a low res screen, I'm very looking forward to gnome 3.26's non-integer scaling!
                  That is not enough; we need per-monitor scaling.

                  Comment


                  • #10
                    Originally posted by cen1 View Post
                    That is not enough, we need per monitor scaling.
                    That is already possible in 3.24 when you use Wayland.
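                    And the new monitor configuration manager from the article persists exactly that kind of setup to ~/.config/monitors.xml (version 2 of the format). An abbreviated sketch of a mixed-DPI layout; the connector names, modes, and scales here are hypothetical examples, and a real file also carries vendor/product/serial in each monitorspec:

                    ```xml
                    <monitors version="2">
                      <configuration>
                        <!-- 4K panel scanned out at 2x -->
                        <logicalmonitor>
                          <x>0</x>
                          <y>0</y>
                          <scale>2</scale>
                          <primary>yes</primary>
                          <monitor>
                            <monitorspec>
                              <connector>DP-1</connector> <!-- hypothetical connector -->
                            </monitorspec>
                            <mode>
                              <width>3840</width>
                              <height>2160</height>
                              <rate>60</rate>
                            </mode>
                          </monitor>
                        </logicalmonitor>
                        <!-- 1080p side monitor at 1x, placed to the right in logical pixels -->
                        <logicalmonitor>
                          <x>1920</x>
                          <y>0</y>
                          <scale>1</scale>
                          <monitor>
                            <monitorspec>
                              <connector>HDMI-1</connector> <!-- hypothetical connector -->
                            </monitorspec>
                            <mode>
                              <width>1920</width>
                              <height>1080</height>
                              <rate>60</rate>
                            </mode>
                          </monitor>
                        </logicalmonitor>
                      </configuration>
                    </monitors>
                    ```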

                    Comment
