KDE KWin Preparing Preliminary Support For Running HDR Games


  • #21
    Originally posted by vextium View Post

    HDR isn't possible in Xorg, I'm pretty sure.
    I am not sure about HDR, but 10-bit color is already supported by X, though not by default.

    You need to add this snippet to an xorg.conf file:

    Code:
    Section "Screen"
    Identifier "asdf"
    DefaultDepth 30
    EndSection​
    But be aware that Steam games currently don't work with 10-bit anyway (which is yet another prerequisite for HDR).

    Comment


    • #22
      Originally posted by cl333r View Post
      Am I the only one who finds HDR to be way too hard to implement on Linux?
      VRR, HDR, and 10-bit have all been on the HARD ROAD under Linux. It's happening, but very slowly!

      Comment


      • #23
        Originally posted by cl333r View Post
        Am I the only one who finds HDR to be way too hard to implement on Linux?


        It seems to work even in 8-bit on some streaming platforms.

        Comment


        • #24
          Originally posted by dev_null View Post
          As someone who doesn't know the topic well: why is it so hard? In my imagination you just use floats in shaders, which you mostly do anyway, then change the type of the screen surface to something like r16g16b16 instead of r8g8b8, make RGB 1,1,1 the maximum white in SDR so as not to break old code and software, and anything beyond that is actually HDR. That's it. Why invent 5 or so standards and make them super complicated?
          I'm not sure what specifically you are referring to, so here is a crash course; TL;DR at the bottom.

          1a. Why is HDR hard?
          There are a lot of reasons why straight HDR is still hard. HDR has made a lot of progress on a lot of fronts, the most important being the transfer curve. Firstly, it's important to know that a transfer curve manipulates an image (videos are series of images) so that we can accurately emulate how humans physically perceive light. Old transfer curves use what's called relative luminance. This means brightness runs from 0.0 to 1.0, or more accurately from 0% brightness to 100% brightness. This worked fine back when all displays were roughly 80 nits, or 200 nits, or 160 nits, etc.

          However, now we have HDR. HDR comes along and now we have displays that can be 350 nits, 1000 nits, and anywhere in between. Relative luminance obviously won't work any more (it was already struggling), as 90% brightness on a 400-nit display and 90% brightness on a 1000-nit display is the difference between a decent lamp and actually blindingly bright at night. So HDR transfer curves like PQ and HLG use an absolute luminance system: if you have, say, RGB(0.5, 0.5, 0.5), no matter what, we always know what the rendering intent of that is, that it will be roughly around 90 nits, and that RGB(0.8, 0.8, 0.8) will be about 1500 nits.
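
          As a rough sketch of what absolute luminance means in practice, this is the PQ (SMPTE ST 2084) EOTF; the constants are the published ones, and the two checks reproduce the ballpark numbers above:

          Code:
          # PQ (SMPTE ST 2084) EOTF: non-linear code value in [0, 1] -> absolute luminance in nits.
          m1 = 2610 / 16384        # 0.1593017578125
          m2 = 2523 / 4096 * 128   # 78.84375
          c1 = 3424 / 4096         # 0.8359375
          c2 = 2413 / 4096 * 32    # 18.8515625
          c3 = 2392 / 4096 * 32    # 18.6875

          def pq_to_nits(code_value: float) -> float:
              e = code_value ** (1 / m2)
              return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

          print(round(pq_to_nits(0.5)))  # ~92 nits   ("roughly around 90 nits")
          print(round(pq_to_nits(0.8)))  # ~1555 nits ("about 1500 nits")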

          This poses additional issues, however. Even if it's a large improvement, we still need to know things like "How bright is the display?" If you only have a 400-nit monitor, you simply can't put out 1000 nits of brightness, so you need to tonemap 1000 nits down to 400 nits. Should the monitor handle this? In many cases it can, but maybe it shouldn't always, especially since we may need to display two different images with different peak brightness. How should you display a properly tonemapped image designed for 400 nits and a 1000-nit image at the same time? There are hundreds of questions like these we need to answer.
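
          To make the tonemapping question concrete, here is one possible roll-off (an extended-Reinhard-style curve, purely an illustrative choice, not what any particular compositor or monitor actually does) squeezing content mastered for 1000 nits onto a 400-nit panel:

          Code:
          def tonemap_nits(nits: float, content_peak: float = 1000.0, display_peak: float = 400.0) -> float:
              # Extended-Reinhard-style curve: content_peak maps exactly to display_peak,
              # and everything below gets compressed smoothly instead of hard-clipped.
              x = nits / display_peak
              w = content_peak / display_peak
              return display_peak * x * (1.0 + x / (w * w)) / (1.0 + x)

          print(tonemap_nits(1000.0))  # 400.0 - the content peak lands on the display peak
          print(tonemap_nits(400.0))   # ~232  - already-bright content gets pulled down
          print(tonemap_nits(100.0))   # ~83   - even mid-tones shift, which is why tonemapping is a judgement call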

          1b. Why is SDR on HDR hard?

          I already went over tonemapping and relative vs. absolute luminance, but how do you display relative-luminance images, where we don't know the intended luminance, on an absolute-luminance display? The general "media" approach to this is to map white to 203 nits. This is fine for consuming media, but what about producing media? sRGB's spec is set for 80 nits, Adobe RGB for 160 nits, BT.2100 recommends mapping to 140 nits, etc. As you can guess from this alone, it's incredibly hard to decide how to do the mapping.
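
          To illustrate how much the choice of reference white matters, here is the same mid-grey sRGB pixel landing at different luminances depending on which white level you assume (only a sketch; the 80/160/203 figures are the ones mentioned above):

          Code:
          def srgb_to_linear(c: float) -> float:
              # sRGB inverse EOTF: non-linear 0-1 code value -> linear-light 0-1 value.
              return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

          pixel = 0.5  # roughly code value 128 in an 8-bit image
          for white_nits in (80.0, 160.0, 203.0):  # sRGB spec, Adobe RGB, the "media" convention
              print(f"reference white {white_nits:>5} nits -> this pixel displays at ~{srgb_to_linear(pixel) * white_nits:.0f} nits")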

          Also, what tonemapping do we use? There are many different tonemappers. What's the rendering intent of the image? And so on. So basically we are adding onto the issues we already have with straight HDR, but now we really need to know the image profile, so we can try to guess the intended nits of the SDR image to map it properly (this is a losing battle anyway, since some people just don't care and create sRGB images with an intended peak of around 160 nits).

          Mixing transfers is just a hard thing to do, because you also need to figure out the intended luminance. Sometimes this can be fairly easy, but often it can be a real pain.

          1c. Gamut

          The above doesn't even bother to account for gamut mapping. Gamut is the range of colors an image can produce, and it turns out that when we muck about with color, this is also a problem: in HDR we need to compress, expand, and map the range of colors too. Without getting too in-depth and risking talking about things I don't know, if you see that funky triangle diagram, that's showing gamut. Being able to convert gamut while not making the image look like trash is hard.
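
          For a feel of what gamut conversion involves, here is the usual linear-light BT.709 to BT.2020 primary conversion (the matrix is the commonly published BT.2087 one; going the other direction is where values fall outside [0, 1] and real gamut mapping has to make choices):

          Code:
          # Linear-light BT.709 RGB -> linear-light BT.2020 RGB (coefficients per ITU-R BT.2087).
          M = [
              [0.6274, 0.3293, 0.0433],
              [0.0691, 0.9195, 0.0114],
              [0.0164, 0.0880, 0.8956],
          ]

          def bt709_to_bt2020(rgb):
              return [sum(M[row][col] * rgb[col] for col in range(3)) for row in range(3)]

          # A fully saturated BT.709 red is a much "tamer" coordinate inside the wider BT.2020 gamut.
          print(bt709_to_bt2020([1.0, 0.0, 0.0]))  # ~[0.627, 0.069, 0.016]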

          1d. Whitepoint

          The last issue here is whitepoint. Not all transfers have the same whitepoint, which means "white" can look more blue, or more yellow/red, depending on what colorspace it is in. Now we have to somehow map "white" when "white" looks different from source to destination.
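
          As a sketch of the whitepoint problem, this is a plain Bradford-style (von Kries) adaptation from a D50 white to a D65 white; the matrix and white points are the textbook values, and real color management pipelines wrap a lot more policy around this:

          Code:
          import numpy as np

          # Bradford cone-response matrix and the XYZ of two common white points (textbook values).
          BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                               [-0.7502,  1.7135,  0.0367],
                               [ 0.0389, -0.0685,  1.0296]])
          D50 = np.array([0.96422, 1.00000, 0.82521])
          D65 = np.array([0.95047, 1.00000, 1.08883])

          def adapt_d50_to_d65(xyz):
              # von Kries scaling in Bradford "cone" space: per-channel gain from source white to destination white.
              gain = np.diag((BRADFORD @ D65) / (BRADFORD @ D50))
              return np.linalg.inv(BRADFORD) @ gain @ BRADFORD @ np.asarray(xyz)

          # D50 "white" lands on D65 "white"; without the adaptation it would look yellowish on a D65 display.
          print(adapt_d50_to_d65(D50))  # ~[0.9505, 1.0000, 1.0888]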

          2. Bitdepth

          Bitdepth actually isn't hard at all; we solved this a long time ago. It's the other things that are holding bitdepth back. Wayland is not some kind of rolling release: you want as few color protocols as possible, which means things like color depth are being held back by the rest of the color management work.

          3. Why not just use shaders?

          GPUs have LUTs and all sorts of color management hardware baked in. Yes, some shaders will likely get used, but when you set a LUT on the GPU (fixed function) it uses a LOT less energy than using something like a shader to transform the image. This may not sound like a lot, but when you need to run shaders on the entire screen, every refresh, and then also run shaders per layer per update, GPU usage will absolutely skyrocket.
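
          To make the LUT point concrete: a 1D LUT pre-computes the transfer once and turns the per-pixel work into a table lookup that fixed-function display hardware can do cheaply, instead of re-running the math in a shader every frame; a rough sketch:

          Code:
          LUT_SIZE = 1024

          def build_lut(transfer):
              # Evaluate the (possibly expensive) transfer function once per table entry.
              return [transfer(i / (LUT_SIZE - 1)) for i in range(LUT_SIZE)]

          def apply_lut(lut, value):
              # Nearest-entry lookup for simplicity; hardware LUTs typically interpolate between entries.
              return lut[round(value * (LUT_SIZE - 1))]

          gamma22 = build_lut(lambda x: x ** 2.2)   # an example transfer, not any compositor's real pipeline
          print(round(apply_lut(gamma22, 0.5), 3))  # ~0.218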

          TL;DR: color is hard. Emulating and representing it accurately is very complicated when you need to mix multiple formats, and doing this in shaders will simply be too expensive to do everywhere.

          Comment


          • #25
            I've just run a Wayland session on a GT 710 card with Wayland-based Chrome. The streaming image quality is superlative.

            Comment


            • #26
              Originally posted by Quackdoc View Post

              I'm not sure what specifically you are referring to, so here is a crash course; TL;DR at the bottom.

              [...]

              TL;DR: color is hard. Emulating and representing it accurately is very complicated when you need to mix multiple formats, and doing this in shaders will simply be too expensive to do everywhere.
              Wow! thank you.

              Comment


              • #27
                Originally posted by ms178 View Post
                But be aware that Steam games currently don't work with 10-bit anyway (which is yet another prerequisite for HDR).
                Someone made the incorrect presumption that there are no 10-bit Steam games.


                For Gamescope HDR, 10-bit is a mandatory feature.

                Up until now, Gamescope HDR has required running directly on DRM as the compositor, so it can get the HDR metadata it needs to give to the Steam-provided games it is running.

                Steam also requires the means to output 10-bit and 8-bit at the same time, because the Steam app/launcher will be 8-bit and the HDR game will be 10-bit colour.

                Also, to make it more fun, at the moment Gamescope HDR does not work with anything other than AMD GPUs.

                Gamescope also does some underhanded things, like adding a Vulkan layer so that when an application creates a window in Vulkan while running under Xwayland, it can end up allocating a window directly via the Wayland protocol, going straight to the Gamescope compositor instead. So Xwayland stays in 8-bit color, yet the Proton-run program has allocated 10-bit HDR output, and that actually works because it bypasses the X11 protocol.

                ms178, what is the point of running 10-bit color if you don't have the monitor's color space metadata to correct the output so it looks anywhere near right?

                This also means that those who want to stay with the bare-metal X11 server cannot depend on the Xwayland developers to extend the X11 protocol to support HDR, because from what Valve has already implemented, HDR under Xwayland can be done by bypassing Xwayland completely; of course, that solution does not work with a bare-metal X11 server.

                Comment


                • #28
                  Originally posted by oiaohm View Post

                  ms178, what is the point of running 10-bit color if you don't have the monitor's color space metadata to correct the output so it looks anywhere near right?

                  This also means that those who want to stay with the bare-metal X11 server cannot depend on the Xwayland developers to extend the X11 protocol to support HDR, because from what Valve has already implemented, HDR under Xwayland can be done by bypassing Xwayland completely; of course, that solution does not work with a bare-metal X11 server.
                  I know that this work is not for the bare-metal X11 server; I was wondering about native X11 support for HDR in KDE's KWin and only that. Not Xwayland. Not Wayland. Not Gamescope.

                  I've got a 10-bit display (only with fake HDR support), and would like to use 10-bit colors with KWin on X11 in Steam games. It is a longstanding and well-known shortcoming and probably won't be fixed this year, if at all, in this combination.

                  Comment


                  • #29
                    Originally posted by ms178 View Post
                    I know that this work is not for the bare-metal X11 server; I was wondering about native X11 support for HDR in KDE's KWin and only that. Not Xwayland. Not Wayland. Not Gamescope.

                    I've got a 10-bit display (only with fake HDR support), and would like to use 10-bit colors with KWin on X11 in Steam games. It is a longstanding and well-known shortcoming and probably won't be fixed this year, if at all, in this combination.
                    https://wiki.winehq.org/256_Color_Mode - that issue is a long-standing X11 protocol issue, and it is the core problem. With Steam games you will run into the situation where the program wants to display SDR (24-bit/8-bit color) next to HDR (30-bit/10-bit color) at the same time, and the old X11 server design completely fails at this.


                    There was a proposed change to the X11 protocol in 2017 to make HDR work correctly on bare metal; the one problem is that this X11 protocol change was never completed or merged.

                    So unless someone like Nvidia takes up maintainership of the bare-metal X11 server and gets the changes for HDR support into the X11 protocol, Steam HDR games are not going to work with the bare-metal X11 server, because the X11 protocol does not in fact support displaying mixed color modes. The issue in this case is not KDE; it is the X11 protocol. Wayland support for HDR games has also required altering the Wayland protocol.

                    Windows the OS, and games designed for it, expect to be able to display SDR and HDR at the same time, even if the SDR ends up a little washed out.

                    And as you read the XDC2017 proposal, you notice that you need full color conversion handling, because of how much of a mess HDR is in fact.

                    Comment


                    • #30
                      Originally posted by oiaohm View Post

                      So unless someone like Nvidia takes up maintainership of the bare-metal X11 server and gets the changes for HDR support into the X11 protocol, Steam HDR games are not going to work with the bare-metal X11 server, because the X11 protocol does not in fact support displaying mixed color modes. [...]
                      Thanks a lot for the provided context. I don't know if Nvidia wants to tackle that problem, but considering they are working on bringing explicit sync support to X, there is a slight chance that they finish up the needed work, or that someone else steps up.

                      Comment
