Wayland Protocol Finally Ready For Fractional Scaling

  • darkbasic
    Senior Member
    • Nov 2009
    • 3086

    #71
    Originally posted by holunder View Post

    Huh? You can just use e.g. Plasma under X.org and manually set the global scale factor to something fractional (or adjust the font DPI setting + icon size setting separately). No reason not to upgrade to 4K; I did my upgrade from QHD to 4K seven years ago and even back then Plasma’s X.org scaling tricks worked great. It affects all Qt and GTK programs; Firefox may need manual adjustments and/or special variables (at least in the past).
    You said it: X.org. You can't get per-monitor scaling on X11 and that's the biggest blocker.
    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)

    Comment

    • billyswong
      Senior Member
      • Aug 2020
      • 710

      #72
      Originally posted by darkbasic View Post

      I don't care whether its gamut matches P3 exactly or not; what I care about is having a gamut whose **volume** is big enough to show me additional colors from my DSLR compared to the more traditional sRGB. A volume slightly bigger than Adobe RGB/P3 (without necessarily covering them in their entirety) would be plenty for my use case.
      I want to be able to edit my photos in a linear color space in darktable while tone mapping the huge dynamic range of my reflex to the fairly big dynamic range of an HDR1400 monitor. I don't give a f***k about poorly mastered HDR movies: I want to do my own tone mapping and be able to appreciate my photos without having to compress their dynamic range too much. I don't care if almost nobody else would be able to appreciate them: the time will come when this won't be considered an alien workflow.
      So you tone map your photo according to your colour-*inaccurate* monitor and ignore whether it looks right when displayed on a different monitor in the future, after your current one dies? It is indeed a compromise that is hard to avoid given the current market. But it is still an unfortunate compromise.
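
      For concreteness, here is a minimal numpy sketch of what the "additional colors" in the quote above amount to: a fully saturated Display P3 red has no in-range sRGB representation. The matrices are the standard linear-RGB to XYZ (D65) conversions, rounded to four decimals; nothing here comes from the thread itself.

```python
# Check whether a linear Display P3 colour fits inside the sRGB gamut.
import numpy as np

# Standard linear-RGB -> XYZ (D65) matrices, rounded to four decimals.
M_SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
M_P3_TO_XYZ = np.array([
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
])

def p3_to_srgb_linear(rgb_p3):
    """Map a linear Display P3 triple into linear sRGB coordinates."""
    return np.linalg.inv(M_SRGB_TO_XYZ) @ M_P3_TO_XYZ @ np.asarray(rgb_p3)

srgb = p3_to_srgb_linear([1.0, 0.0, 0.0])  # pure P3 red
print(srgb, "fits in sRGB:", bool(np.all((srgb >= 0.0) & (srgb <= 1.0))))
# The red component lands above 1.0, i.e. outside what sRGB can show.
```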

      Comment

      • darkbasic
        Senior Member
        • Nov 2009
        • 3086

        #73
        Originally posted by billyswong View Post

        So you tone map your photo according to your colour-*inaccurate* monitor and ignore whether it looks right when displayed on a different monitor in the future, after your current one dies? It is indeed a compromise that is hard to avoid given the current market. But it is still an unfortunate compromise.
        Why would it be color inaccurate? Color management doesn't work by hardware calibrating your monitor's gamut to any other existing gamut: you first calibrate it to a certain gamma/color temperature, then you measure the color response into a profile, and then your applications are responsible for mapping the source color space into the destination one. Hardware calibrating your monitor to a specific gamut severely limits its capabilities and basically means working around color management.
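
        As an illustration of that last application-side step, here is a small sketch using Pillow's littlecms bindings (ImageCms); the profile path and file names are placeholders, not anything darkbasic mentioned.

```python
# Map an image from its source colour space into a monitor's measured
# profile -- the application-side step of colour management.
from PIL import Image, ImageCms

MONITOR_ICC = "/usr/share/color/icc/my-monitor.icc"  # hypothetical path

im = Image.open("photo.jpg")                 # assume the photo is sRGB
src = ImageCms.createProfile("sRGB")         # built-in source profile
dst = ImageCms.ImageCmsProfile(MONITOR_ICC)  # written by your calibration tool

# Perceptual intent compresses out-of-gamut colours instead of clipping them.
out = ImageCms.profileToProfile(
    im, src, dst, renderingIntent=ImageCms.Intent.PERCEPTUAL
)
out.save("photo-for-this-monitor.jpg")
```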

        You cannot expect any image to be displayed correctly on any monitor with a gamut smaller than the image itself, period. Even if the gamut of the monitor is big enough to contain it, it will look dull if you edited it on a monitor with higher contrast (e.g. HDR1000) and then show it on one with lower contrast. There is no way around it; you can mitigate the issue by compressing the low/high ends of the range in order to preserve the midtones as linearly as possible, but that's it.

        The accepted workflow has always been to limit the already small dynamic range of your monitor to the even smaller dynamic range of prints, in order to get similar results across different monitors (you can always calibrate a monitor to underperform, but not vice versa). I don't give a *** about prints; if I want to print a photo I will process it accordingly. I want the best possible results when viewed on a high dynamic range (HDR) monitor, and the photo must be tone mapped to SDR if you don't have one.
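
        That compression can be as simple as a single curve. A toy sketch, assuming an extended Reinhard tone mapper (one common choice, not necessarily what darktable applies):

```python
# Compress scene-referred luminance so `white` maps to display 1.0,
# squeezing the extremes while leaving the midtones mostly linear.
import numpy as np

def tonemap_reinhard_ext(lum, white=4.0):
    return lum * (1.0 + lum / white**2) / (1.0 + lum)

scene = np.array([0.01, 0.18, 1.0, 4.0])  # deep shadow .. bright highlight
print(np.clip(tonemap_reinhard_ext(scene), 0.0, 1.0))
# Middle grey (0.18) maps near 0.15, while 4.0 lands exactly on 1.0:
# highlights are compressed roughly 4x, midtones hardly at all.
```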
        ## VGA ##
        AMD: X1950XTX, HD3870, HD5870
        Intel: GMA45, HD3000 (Core i5 2500K)

        Comment

        • holunder
          Senior Member
          • Jan 2013
          • 164

          #74
          Originally posted by darkbasic View Post

          You said it: X.org. You can't get per-monitor scaling on X11 and that's the biggest blocker.
          That’s what this news is about: fractional scaling in Wayland, in the future… If you want to use a notebook and a monitor, QHD on the notebook is a good resolution to drive the same DPI/HiDPI setting as your 4K monitor with X.org.
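
          Whether the notebook and the monitor really land on the same DPI depends on the panel diagonals. A quick back-of-the-envelope check; the sizes below are assumptions, not holunder's actual hardware:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Physical pixels per inch from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'15.6" QHD notebook: {ppi(2560, 1440, 15.6):.0f} PPI')  # ~188
print(f'24" 4K monitor:     {ppi(3840, 2160, 24.0):.0f} PPI')  # ~184
print(f'27" 4K monitor:     {ppi(3840, 2160, 27.0):.0f} PPI')  # ~163
# A 15.6" QHD panel and a 24" 4K panel are close enough to share one
# global X.org DPI setting; a 27" 4K panel would force a compromise.
```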

          Comment

          • bug77
            Senior Member
            • Dec 2009
            • 6521

            #75
            Originally posted by Charlie68 View Post

            Surely not, but of course many users here are a bit spoiled. I don't know anyone who has a 4K monitor, and I assure you I know a lot of people. However, good for them... let's say that it is not really the norm.
            As it happens, I have one of the tougher setups right in front of me: a 32" 4K next to a 24" 1920x1200. The perfect proving ground for Wayland, and yet here I am, just increasing the font size in an X session. I've been trying to log into Wayland, but gave up at some point. Really, really nothing to see over there, besides stuff that works on X but doesn't on Wayland.

            Comment

            • Myownfriend
              Senior Member
              • Mar 2021
              • 1044

              #76
              Originally posted by bug77 View Post
              I've been trying to log into Wayland, but gave up at some point.
              What do you mean by "log into Wayland"?

              Comment

              • MorrisS.
                Senior Member
                • Feb 2022
                • 652

                #77
                Originally posted by Myownfriend View Post

                What do you mean by "log into Wayland"?
                I assume he means that it's impossible to use Wayland in Plasma with an Nvidia card, with both the proprietary and Mesa drivers.

                Comment

                • darkbasic
                  Senior Member
                  • Nov 2009
                  • 3086

                  #78
                  Originally posted by holunder View Post

                  That’s what this news is about: fractional scaling in Wayland, in the future… If you want to use a notebook and a monitor, QHD on the notebook is a good resolution to drive the same DPI/HiDPI setting as your 4K monitor with X.org.
                  I've carefully chosen my laptop to have integer scaling; unfortunately I cannot do the same for external monitors because 5K ones aren't available.
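
                  The integer-scaling arithmetic behind that choice, spelled out (resolutions only; panel sizes left aside):

```python
# At a whole-number factor every logical pixel maps to an exact block
# of physical pixels, so nothing has to be resampled.
for name, (w, h) in {"4K": (3840, 2160), "5K": (5120, 2880)}.items():
    print(f"{name} at 2x -> {w // 2}x{h // 2} logical desktop")
# 4K at 2x leaves a 1920x1080 desktop; 5K at 2x gives 2560x1440,
# i.e. a QHD-sized UI with HiDPI sharpness, hence the wish for 5K.
```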
                  ## VGA ##
                  AMD: X1950XTX, HD3870, HD5870
                  Intel: GMA45, HD3000 (Core i5 2500K)

                  Comment

                  • holunder
                    Senior Member
                    • Jan 2013
                    • 164

                    #79
                    Originally posted by darkbasic View Post
                    I've carefully chosen my laptop to have integer scaling; unfortunately I cannot do the same for external monitors because 5K ones aren't available.
                    So just use X.org and its better scaling method with an external 4K monitor then? You could write a script that changes the global DPI settings whenever you (un)plug your monitor. Certainly not great, but better than having to use a QHD monitor like a caveman.
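
                    A rough sketch of the kind of script holunder means, assuming an X.org session with the usual xrandr/xrdb tools; the output name and DPI values are hypothetical and must be adapted per machine:

```python
# Poll xrandr for an external monitor and push a matching Xft.dpi
# through xrdb whenever the connection state changes.
import subprocess
import time

EXTERNAL_OUTPUT = "DP-1"  # hypothetical output name, check `xrandr`
DPI_DOCKED = 163          # e.g. a 27" 4K monitor
DPI_MOBILE = 188          # e.g. a 15.6" QHD panel

def external_connected() -> bool:
    out = subprocess.run(["xrandr", "--query"],
                         capture_output=True, text=True)
    return any(line.startswith(f"{EXTERNAL_OUTPUT} connected")
               for line in out.stdout.splitlines())

def set_dpi(dpi: int) -> None:
    # Merge the value into the X resource database; most toolkits only
    # read Xft.dpi when an application starts.
    subprocess.run(["xrdb", "-merge"], input=f"Xft.dpi: {dpi}\n", text=True)

previous = None
while True:
    connected = external_connected()
    if connected != previous:
        set_dpi(DPI_DOCKED if connected else DPI_MOBILE)
        previous = connected
    time.sleep(5)
```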

                    Comment

                    • LinAGKar
                      Senior Member
                      • Nov 2014
                      • 267

                      #80
                      Originally posted by darkbasic View Post

                      Yes and no. Freesync 2 is a different beast AFAIK, with proprietary technology.
                      No. Freesync 2 doesn't even exist anymore, having been replaced by Freesync Premium Pro. But when it did, it was just a certification for Freesync (adaptive sync) monitors that also support HDR. There may have been some more requirements too; I'm not sure.

                      Comment
