Wayland Protocol Finally Ready For Fractional Scaling


  • darkbasic
    replied
    Originally posted by holunder View Post

    So just use X.org and its better scaling method with an external 4K monitor then? You could write a script that changes the global DPI settings whenever you (un)plug your monitor. Certainly not great, but better than having to use a QHD monitor like a caveman.
    You don't get it: I want to be able to use BOTH when I need them. It's the same whenever you need to connect a projector: you don't want to turn your laptop's monitor off. You need per-monitor scaling, so you need Wayland (which still has no good fractional scaling). Whenever I need accurate colors I instead have to switch back to X11. It's not a pleasant experience, for sure.



  • LinAGKar
    replied
    Originally posted by darkbasic View Post

    Yes and no. Freesync 2 is a different beast AFAIK, with proprietary technology.
    No. Freesync 2 doesn't even exist anymore, having been replaced by Freesync Premium Pro. But when it did, it was just a certification for freesync (adaptive sync) monitors also supporting HDR. And maybe there were some more requirements, not sure.



  • holunder
    replied
    Originally posted by darkbasic View Post
    I've carefully chosen my laptop to have integer scaling; unfortunately I cannot do the same for external monitors because 5K ones aren't available.
    So just use X.org and its better scaling method with an external 4K monitor then? You could write a script that changes the global DPI settings whenever you (un)plug your monitor. Certainly not great, but better than having to use a QHD monitor like a caveman.

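For what it's worth, a minimal sketch of the kind of script holunder describes, assuming an X.org session with xrandr and xrdb available; the output name, the DPI values and the crude polling loop are made-up placeholders, not a recommendation:

```python
#!/usr/bin/env python3
"""Switch the global Xft.dpi when an external monitor is (un)plugged under X.org."""
import subprocess
import time

EXTERNAL_OUTPUT = "DP-1"   # assumption: xrandr name of the external 4K output
DPI_EXTERNAL = 192         # assumption: DPI to use while it is plugged in
DPI_INTERNAL = 96          # assumption: DPI for the laptop panel alone


def external_connected() -> bool:
    """Return True if the external output reports 'connected' in xrandr."""
    out = subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout
    return any(line.startswith(f"{EXTERNAL_OUTPUT} connected") for line in out.splitlines())


def set_dpi(dpi: int) -> None:
    """Merge a new Xft.dpi value into the X resource database."""
    subprocess.run(["xrdb", "-merge"], input=f"Xft.dpi: {dpi}\n", text=True, check=True)


if __name__ == "__main__":
    last = None
    while True:
        state = external_connected()
        if state != last:
            set_dpi(DPI_EXTERNAL if state else DPI_INTERNAL)
            last = state
        time.sleep(5)  # crude polling; reacting to RandR/udev hotplug events would be cleaner
```

Note that many X11 applications only read Xft.dpi at startup, so already-running programs typically keep their old scaling until restarted, which is part of why this workaround is "certainly not great".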


  • darkbasic
    replied
    Originally posted by holunder View Post

    That’s what this news is about: fractional scaling in Wayland in the future… If you want to use a notebook and an external monitor, QHD on the notebook is a good resolution to drive the same DPI/HiDPI setting as your 4K monitor with X.org.
    I've carefully chosen my laptop to have integer scaling; unfortunately I cannot do the same for external monitors because 5K ones aren't available.



  • MorrisS.
    replied
    Originally posted by Myownfriend View Post

    What do you mean by "log into Wayland"?
    I assume he means that it's impossible to use Wayland in Plasma with an Nvidia card, with either the proprietary or the Mesa drivers.



  • Myownfriend
    replied
    Originally posted by bug77 View Post
    I've been trying to log into Wayland, but gave up at some point.
    What do you mean by "log into Wayland"?



  • bug77
    replied
    Originally posted by Charlie68 View Post

    Surely not, but of course many users here are a bit spoiled. I don't know anyone who has a 4K monitor, and I assure you I know a lot of people. However, good for them... let's say that is not really the norm.
    As it happens, I have one of the tougher setups right in front of me: a 32" 4K next to a 24" 1920x1200. The perfect proving ground for Wayland, and yet here I am, just increasing the font size in an X session. I've been trying to log into Wayland, but gave up at some point. Really, really nothing to see over there, besides stuff that works on X but doesn't on Wayland.



  • holunder
    replied
    Originally posted by darkbasic View Post

    You said it: X.org. You can't get per-monitor scaling on X11 and that's the biggest blocker.
    That’s what this news is about: fractional scaling in Wayland in the future… If you want to use a notebook and an external monitor, QHD on the notebook is a good resolution to drive the same DPI/HiDPI setting as your 4K monitor with X.org.

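For readers wondering what the fractional scaling protocol in the headline concretely adds, here is a small arithmetic sketch. It assumes the wp-fractional-scale-v1 design, in which the compositor advertises its preferred scale as a multiple of 1/120 and the client renders a correspondingly larger buffer (mapped back to the logical size with wp_viewport); the surface sizes below are arbitrary examples:

```python
def buffer_size(logical_px: int, preferred_scale: int) -> int:
    """Physical pixels to render for a logical size, given a scale in 120ths (nearest rounding)."""
    return (logical_px * preferred_scale + 60) // 120


# Example: a 1.5x scale would be advertised as 180, so an 800x600 logical surface
# is backed by a 1200x900 pixel buffer instead of a blurry 2x-then-downscaled one.
assert buffer_size(800, 180) == 1200
assert buffer_size(600, 180) == 900
```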


  • darkbasic
    replied
    Originally posted by billyswong View Post

    So you tone map your photo according to your colour-*inaccurate* monitor and ignore whether it looks right when displayed on a different monitor in the future, after your current one dies? It is indeed a compromise that is hard to avoid given the current market. But it is still an unfortunate compromise.
    Why would it be color inaccurate? Color management doesn't work by hardware calibrating your monitor's gamut to any other existing gamut: you first calibrate it to a certain gamma/color temperature, then you measure the color response into a profile, and then your applications are responsible for mapping the source color space into the destination one. Hardware calibrating your monitor to a specific gamut severely limits its capabilities and basically means working around color management.

    You cannot expect any image to be displayed correctly on any monitor with a gamut smaller than the image itself, period. Even if the monitor's gamut is big enough to contain it, the image will look dull if you edited it on a monitor with higher contrast (e.g. HDR1000) and display it on one with lower contrast. There is no way around it; you can mitigate the issue by compressing the low/high parts of the range so that the middle is preserved as linearly as possible, but that's it. The accepted workflow has always been to limit the already small dynamic range of your monitor to the even smaller dynamic range of prints in order to get similar results across different monitors (you can always calibrate a monitor to underperform, but not vice versa). I don't give a *** about prints; if I want to print a photo I will process it accordingly. I want the best possible results when viewed on a high dynamic range (HDR) monitor, and the photo must be tone mapped to SDR if you don't have one.
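As a rough illustration of the workflow darkbasic describes, where the application (not the monitor hardware) maps the source color space into the monitor's measured profile, here is a sketch using Pillow's ImageCms bindings to LittleCMS; the photo file and monitor profile path are hypothetical:

```python
import io
from PIL import Image, ImageCms

# Hypothetical inputs: a wide-gamut photo and the ICC profile measured for your monitor.
photo = Image.open("photo_wide_gamut.tif").convert("RGB")

embedded = photo.info.get("icc_profile")
source_profile = (ImageCms.ImageCmsProfile(io.BytesIO(embedded)) if embedded
                  else ImageCms.createProfile("sRGB"))        # untagged files: assume sRGB
monitor_profile = ImageCms.getOpenProfile("my-monitor.icc")   # profile measured from the display

# The application maps source gamut -> display gamut; the monitor is never "locked" to a gamut.
transform = ImageCms.buildTransform(
    source_profile, monitor_profile, "RGB", "RGB",
    renderingIntent=ImageCms.INTENT_PERCEPTUAL,  # compress out-of-gamut colors smoothly
)
ImageCms.applyTransform(photo, transform).show()
```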

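And a tiny worked example of the compression darkbasic mentions, keeping midtones roughly linear while rolling highlights off smoothly instead of clipping; the knee position and the curve are purely illustrative, not darktable's actual tone mapper:

```python
import math

def compress_highlights(x: float, knee: float = 0.75) -> float:
    """Map scene-referred luminance (>= 0) into the display range [0, 1).

    Values below `knee` pass through unchanged (midtones stay linear); values above
    are rolled off asymptotically toward 1.0. A real tone mapper would treat the
    shadow end similarly and usually work on log luminance.
    """
    if x <= knee:
        return x
    headroom = 1.0 - knee
    return knee + headroom * (1.0 - math.exp(-(x - knee) / headroom))


# Midtones are preserved, highlights are squeezed rather than clipped:
for value in (0.18, 0.5, 1.0, 2.0, 4.0):
    print(f"{value:4.2f} -> {compress_highlights(value):.3f}")
```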


  • billyswong
    replied
    Originally posted by darkbasic View Post

    I don't care if its gamut superimposes P3 exactly or not; what I care about is having a gamut whose **volume** is big enough to show me additional colors from my DSLR compared to the more traditional sRGB. A volume slightly bigger than Adobe RGB/P3 (without necessarily covering them in their entirety) would be plenty for my use case.
    I want to be able to edit my photos in a linear color space in darktable while tone mapping the huge dynamic range of my reflex camera to the fairly big dynamic range of an HDR1400 monitor. I don't give a f***k about poorly mastered HDR movies: I want to do my own tone mapping and be able to appreciate my photos without having to compress their dynamic range too much. I don't care if almost nobody else would be able to appreciate them: the time will come when this won't be considered an alien workflow.
    So you tone map your photo according to your colour-*inaccurate* monitor and ignore whether it looks right when displayed on a different monitor in the future, after your current one dies? It is indeed a compromise that is hard to avoid given the current market. But it is still an unfortunate compromise.

