QEMU's EDID Support Coming Together, Allowing For Eventual HiDPI Support

  • QEMU's EDID Support Coming Together, Allowing For Eventual HiDPI Support

    Phoronix: QEMU's EDID Support Coming Together, Allowing For Eventual HiDPI Support

    Support is coming together within the Linux kernel and QEMU for this important piece of the open-source Linux virtualization stack to handle Extended Display Identification Data (EDID) for virtual displays, bringing some practical improvements moving forward...


  • #2
    Can this tech also allow for dynamic resolutions (automatically changing the resolution based on the size of the VM window)?

    • #3
      I don't think HiDPI should be supported at all, as it's the opposite of how to sanely handle high resolutions. If a screen is too dense, it's a waste of resources and should not be used. I won't buy a display where I can't tell individual pixels apart, it's that simple.

      • #4
        Originally posted by CuriousTommy View Post
        Can this tech also allow for dynamic resolutions (automatically changing the resolution based on the size of the VM window)?
        EDID is used when the monitor is connected so the OS can understand what the native resolution is. It could be used to provide hints to the guest OS to switch to another resolution by sending display hotplug events, but that won't create capabilities in the guest OS that it doesn't already have... i.e. if the guest only supports VGA/SVGA/XGA/WXGA then switching to a 3840x2160 window won't give the guest the ability to resize to that resolution. But it does mean that, for example, macOS, which is notorious for not following window resizes, will now have an opportunity to do the right thing.
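
        For illustration, here is a hedged sketch of what the QEMU side of this looks like, assuming the edid/xres/yres device properties that recent QEMU versions expose on virtio-vga (check "qemu-system-x86_64 -device virtio-vga,help" for your build):

          # Advertise an EDID to the guest with 2560x1440 as the preferred mode.
          qemu-system-x86_64 \
            -device virtio-vga,edid=on,xres=2560,yres=1440 \
            -display gtk

        As noted above, the property only changes what the virtual monitor reports; the guest still needs a driver that reads the EDID and can actually set that mode.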

        • #5
          Originally posted by DoMiNeLa10 View Post
          I don't think HiDPI should be supported at all, as it's the opposite of how to sanely handle high resolutions. If a screen is too dense, it's a waste of resources and should not be used. I won't buy a display where I can't tell individual pixels apart, it's that simple.
          I understand your feeling, but with time the cost of having 'retina' displays and scaling applications to a comfortable reading size will pale to insignificance. I remember feeling that switching to 24-bit display depth was a terrible waste of resources because with dithering and a decent resolution it's easy to fool our eyes into perceiving 24-bit depth with only 8 bits. But now that memory is astronomically cheaper than it was back then (8G costs less now than 8M did then) and now that we have so much SSE/GPU bandwidth available and using it to do compositing with 24-bit 'truecolour' is actually cheaper than doing it with 8-bit 'metacolor' ... well... the times they are a-changin.

          But yeah, if you want decent battery life in a notebook you should still opt for an FHD model, today.

          There's one key benefit to 'retina' displays - font resolution. With a typical FHD display and sub-pixel RGB rendering you effectively get 5760x1080 resolution for your fonts and that makes a huge difference that you can actually see. A 'retina' display gives you that kind of resolution in both axes without needing sub-pixel rendering. That's important in publishing because if you take a screenshot on an RGB display and cut something out of it and paste it into a PDF, the hard-coded phase shifts of the fonts are saved in that image. When they're displayed on a BGR display they look terrible. When they're displayed on a vertical BGR display they look fuzzy. When they're printed on paper they can have colour fringing. These are things you can actually see, today, if you're into those workloads.
          Last edited by linuxgeex; 11 March 2019, 08:56 AM.

          • #6
            Originally posted by linuxgeex View Post
            but that won't create capabilities in the guest OS that it doesn't already have... i.e. if the guest only supports VGA/SVGA/XGA/WXGA then switching to a 3840x2160 window won't give the guest the ability to resize to that resolution.
            I am personally okay with that. i915-GVTg_V5_4 provides a maximum resolution of 1920x1200, which is large enough to fill my laptop screen nicely. What I am mostly concerned about is being able to shrink the window (so instead of it taking the full screen, I would want it to take half of the screen).

            Originally posted by linuxgeex View Post
            EDID is used when the monitor is connected so the OS can understand what the native resolution is. It could be used to provide hints to the guest OS to switch to another resolution by sending display hotplug events
            Can the OS handle EDIDs with abnormal resolutions (assuming that the EDID does not go over the max resolution limit)?
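
            For what it's worth, one way to check which EDID the guest actually received, assuming a Linux guest using the virtio-gpu driver and the edid-decode utility (the connector name varies; Virtual-1 is typical for virtio):

              # Inside the guest: dump the EDID exposed by the virtual connector and decode it.
              cat /sys/class/drm/card0-Virtual-1/edid | edid-decode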

            • #7
              Originally posted by linuxgeex View Post
              I understand your feeling, but with time the cost of having 'retina' displays and scaling applications to a comfortable reading size will pale to insignificance. I remember feeling that switching to 24-bit display depth was a terrible waste of resources because with dithering and a decent resolution it's easy to fool our eyes into perceiving 24-bit depth with only 8 bits. But now that memory is astronomically cheaper than it was back then (8G costs less now than 8M did then) and now that we have so much SSE/GPU bandwidth available and using it to do compositing with 24-bit 'truecolour' is actually cheaper than doing it with 8-bit 'metacolor' ... well... the times they are a-changin.

              But yeah, if you want decent battery life in a notebook you should still opt for an FHD model, today.

              There's one key benefit to 'retina' displays - font resolution. With a typical FHD display and sub-pixel RGB rendering you effectively get 5760x1080 resolution for your fonts and that makes a huge difference that you can actually see. A 'retina' display gives you that kind of resolution in both axes without needing sub-pixel rendering. That's important in publishing because if you take a screenshot on an RGB display and cut something out of it and paste it into a PDF, the hard-coded phase shifts of the fonts are saved in that image. When they're displayed on a BGR display they look terrible. When they're displayed on a vertical BGR display they look fuzzy. When they're printed on paper they can have colour fringing. These are things you can actually see, today, if you're into those workloads.
              I don't think having more resources means you can get lazy and waste them to get the same thing done in a less efficient manner. When I upgrade, I want to be able to do things I couldn't before rather than being more or less on the same level with a new bloated Electron "app". I refuse to use a compositor, as I can't find a case where it makes things better, and that was the case even back in the day when I used a stacking window manager.

              As for font rendering, I like having it crisp (heavy hinting, font size adjusted to render as well as it can, but with antialiasing to make round edges look nice) and being able to see single pixels. There's a certain DPI where fonts just start looking nice and beyond which they don't get that much better (the bottom of that range seems to be ~1600x900 at 14 inches; 1080p panels look about as nice, and the only benefit is that the pixels are large enough to see them and you get more real estate on your screen). I've seen retina displays, and with macOS font rendering they simply look awful, but that's because of how Apple approaches font rendering, which just so happens to be the opposite of the approach I prefer. Even if hardware is sufficient to handle these high resolution displays without much effort, it would use even fewer resources and be capable of even more at lower resolutions.

              When it comes to printing, when you have to care about how things look, you probably won't just take screenshots. There are ways to use dummy X servers to render at high resolutions (and probably optimize font rendering for print) if you want a bitmap, and there's a utility to take vector screenshots of GTK programs.
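
              For the curious, a rough sketch of the dummy-X-server approach mentioned above, assuming Xvfb and ImageMagick are installed (the display number, resolution and application are arbitrary examples):

                # Start a headless X server with a very large framebuffer.
                Xvfb :99 -screen 0 7680x4320x24 &

                # Run whatever you want to capture on it, give it a moment to draw,
                # then grab the whole root window as a bitmap.
                DISPLAY=:99 your-gtk-app &    # placeholder for the program being captured
                sleep 5
                DISPLAY=:99 import -window root high-res-shot.png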

              • #8
                Somewhat related, using QEMU v3.1.0 I get the error "corrupted double-linked list" in the console and the dmesg error "qemu-system-x86[16757]: segfault at 40 ip 00007f1e07edd129 sp 00007ffd5fa36890 error 4 in libvirglrenderer.so.0.2.0" on the host when trying to use Linux 5.0 kernels in VMs. Kernels up to and including 4.20.14 in VMs work just fine. This is with "-vga virtio -display gtk,gl=on". I am guessing the virtio-gpu changes in kernel 5.0 are to blame. Regardless, don't update VMs to kernel 5.0 just yet. If you're using QEMU v4.x you're probably fine, though.

                • #9
                  Linux 5.0 works in the guest if virgl is not used.
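
                  A minimal sketch of that workaround: virgl is only active when OpenGL acceleration is requested, so dropping gl=on (or setting gl=off) keeps libvirglrenderer out of the picture:

                    # Plain virtio VGA without virgl; add your usual disk/memory options.
                    qemu-system-x86_64 -vga virtio -display gtk,gl=off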

                  • #10
                    Originally posted by CuriousTommy View Post
                     Can the OS handle EDIDs with abnormal resolutions (assuming that the EDID does not go over the max resolution limit)?
                    That's a question to ask of the OS, not of me.
