KDE KWin Preparing Preliminary Support For Running HDR Games


  • oiaohm
    replied
    Originally posted by Quackdoc View Post
    I wonder how much of this can be shimmed by using nested compositors for mixed color mode.
    This is where you really notice you are in a cursed position due to Nvidia.
    Xnest and Xephyr with Nvidia don't support hardware graphics acceleration. Those working on Wine have come to know that with Nvidia you drop back to software rendering when you use any form of nesting.

    The only hope that nested compositors will work with Nvidia hardware is that Wayland development forces them into it. EGLStreams is a clear demonstration of how much Nvidia does not want to do this, given how they attempted to sell the idea that process separation and stability were not important features.

    Yes, a nested solution is an option with AMD and Intel.

    The broken Xephyr support also meant not being able to support newer X11 protocol features on an older bare metal X11 server when you had an Nvidia GPU.

    Yes, so much for the idea that the Nvidia drivers have been fine.

    This is the problem: hoping for a nested compositor that supports all GPUs equals a Wayland solution, because Nvidia needs to be dragged kicking and screaming into implementing support for this.

    Like it or not, the ball for HDR support under bare metal X11 is in Nvidia's court, and Nvidia is not interested in returning it. It takes either an X11 protocol extension from Nvidia that works and gets merged, or Nvidia making X11 nesting actually work with their hardware and getting those changes merged.

    It's likely that Xwayland on gamescope on a bare metal X11 server will work before Xephyr with hardware accelerated graphics works with Nvidia, and it's possible Xephyr with Nvidia never gets hardware acceleration.

    The hard reality, like it or not, is that if nothing changes, HDR on Linux is only going to become functionally useful any time soon for those who are using Wayland based solutions.



  • Quackdoc
    replied
    Originally posted by oiaohm View Post
    So unless someone like Nvidia takes up maintainership of the bare metal X11 server and gets the changes for HDR support into the X11 protocol, Steam HDR games are not going to work with a bare metal X11 server, because the X11 protocol does not in fact support displaying mixed color modes.
    I wonder how much of this can be shimmed by using nested compositors for mixed color mode.



  • oiaohm
    replied
    Originally posted by ms178 View Post
    Thanks a lot for the provided context. I don't know if Nvidia wants to tackle that problem though, but considering they work on bringing explicit sync support to X, there is a slight chance that they finish up the needed work, or that someone else steps up.
    Let's be real here. Nvidia's work on explicit sync now is happening because Xwayland will not allow them to load their own driver code into the process to emulate implicit sync. The Nvidia module inside the X11 server emulates implicit sync. Implementing implicit sync emulation is required to implement CUDA and OpenGL to specification.



    There is no sign at the moment that Nvidia is going to work on bare metal X11 for new features. Also, the history of Nvidia working on bare metal X11 is them making proposals, then getting some other developer to code the stuff by having their driver malfunction without it. That stunt has not been working with Xwayland.

    Functional implicit sync in default OpenGL means that an OpenGL/OpenCL combination could work like the CUDA/OpenGL combination. Nvidia's demand for explicit sync is really nothing more than smoke and mirrors. Nvidia wants implicit sync to only work when it suits them, thus preventing competition.



  • ms178
    replied
    Originally posted by oiaohm View Post

    The https://wiki.winehq.org/256_Color_Mode issue is a long-term X11 protocol issue, and it is the core problem. With Steam games you will run into the issue that the program wants to display SDR (24-bit/8-bit color) next to HDR (30-bit/10-bit color) at the same time, and the old X11 server design completely fails at this.


    Yes, there was a proposed change to the X11 protocol in 2017 to make HDR work correctly on bare metal; the one problem is that this X11 protocol change was never completed or merged.

    So unless someone like Nvidia takes up maintainership of the bare metal X11 server and gets the changes for HDR support into the X11 protocol, Steam HDR games are not going to work with a bare metal X11 server, because the X11 protocol does not in fact support displaying mixed color modes. The issue in this case is not KDE; it is an X11 protocol issue. Yes, Wayland support for HDR games has also required altering the Wayland protocol.

    Windows the OS, and games designed for it, expect to be able to display SDR and HDR at the same time, even if the SDR is a little washed out.

    Yes, as you read the XDC2017 proposal you notice you need full color conversion handling, due to how much of a mess HDR in fact is.
    Thanks a lot for the provided context. I don't know if Nvidia wants to tackle that problem though, but considering they work on bringing explicit sync support to X, there is a slight chance that they finish up the needed work, or that someone else steps up.



  • oiaohm
    replied
    Originally posted by ms178 View Post
    I know that this work is not for bare metal X11 server, I was wondering about native X11 support for HDR on KDE's Kwin and only that. Not Xwayland. Not Wayland. Not Gamescope.

    I've got a 10-bit display (only with fake HDR support), and would like to use 10-bit colors with Kwin on X11 in Steam games. It is a longstanding and well-known shortcoming and probably won't be fixed this year if at all in this combination.
    The https://wiki.winehq.org/256_Color_Mode issue is a long-term X11 protocol issue, and it is the core problem. With Steam games you will run into the issue that the program wants to display SDR (24-bit/8-bit color) next to HDR (30-bit/10-bit color) at the same time, and the old X11 server design completely fails at this.


    Yes, there was a proposed change to the X11 protocol in 2017 to make HDR work correctly on bare metal; the one problem is that this X11 protocol change was never completed or merged.

    So unless someone like Nvidia takes up maintainership of the bare metal X11 server and gets the changes for HDR support into the X11 protocol, Steam HDR games are not going to work with a bare metal X11 server, because the X11 protocol does not in fact support displaying mixed color modes. The issue in this case is not KDE; it is an X11 protocol issue. Yes, Wayland support for HDR games has also required altering the Wayland protocol.

    Windows the OS, and games designed for it, expect to be able to display SDR and HDR at the same time, even if the SDR is a little washed out.

    Yes, as you read the XDC2017 proposal you notice you need full color conversion handling, due to how much of a mess HDR in fact is.



  • ms178
    replied
    Originally posted by oiaohm View Post

    ms178, what is the point of running 10-bit color if you don't have the monitor's color space metadata to be able to correct the output to look anywhere near right?

    Yes, this also means those who want to stay with a bare metal X11 server cannot depend on the Xwayland developers working on the X11 protocol to support HDR, because from what Valve has already implemented, HDR under Xwayland can be done by bypassing Xwayland completely; of course this solution does not work with a bare metal X11 server.
    I know that this work is not for bare metal X11 server, I was wondering about native X11 support for HDR on KDE's Kwin and only that. Not Xwayland. Not Wayland. Not Gamescope.

    I've got a 10-bit display (only with fake HDR support), and would like to use 10-bit colors with Kwin on X11 in Steam games. It is a longstanding and well-known shortcoming and probably won't be fixed this year if at all in this combination.



  • oiaohm
    replied
    Originally posted by ms178 View Post
    I am not sure about HDR, but at least 10-bit colors are already supported by X, just not by default.

    You need to put this snippet into an Xorg.conf file:

    Code:
    Section "Screen"
    Identifier "asdf"
    DefaultDepth 30
    EndSection
    But be aware that Steam games currently don't work with 10-bit anyway (which is yet another prerequisite for HDR).
    Someone made the incorrect presumption that there are no 10-bit Steam games.


    For Gamescope HDR, 10-bit is a mandatory feature.

    Yes, up until now Gamescope HDR has required running directly on DRM as the compositor, so it can get the HDR metadata it needs to give to the Steam-provided games it is running.

    Steam also requires the means to output 10-bit and 8-bit at the same time, because the Steam app/launcher will be 8-bit and the HDR game will be 10-bit colour.

    Also, to make things fun, at the moment Gamescope HDR does not work with anything other than AMD GPUs.

    Gamescope also does some underhanded things, like adding a Vulkan layer so that when an application creates a window in Vulkan while running under Xwayland, it can end up directly allocating a window via the Wayland protocol, going straight to the gamescope compositor instead. So Xwayland is in 8-bit color, yet the Proton-run program has allocated 10-bit HDR output that really works, because it is bypassing the X11 protocol.

    ms178, what is the point of running 10-bit color if you don't have the monitor's color space metadata to be able to correct the output to look anywhere near right?

    Yes, this also means those who want to stay with a bare metal X11 server cannot depend on the Xwayland developers working on the X11 protocol to support HDR, because from what Valve has already implemented, HDR under Xwayland can be done by bypassing Xwayland completely; of course this solution does not work with a bare metal X11 server.



  • dev_null
    replied
    Originally posted by Quackdoc View Post

    TLDR: color is hard. Emulating and representing it accurately is very complicated when you need to mix multiple formats, and doing this in shaders will simply be too expensive to do everywhere.
    Wow! thank you.



  • MorrisS.
    replied
    I've just run a Wayland session on a GT 710 card with a Wayland-based Chrome. The streaming image quality is superlative.



  • Quackdoc
    replied
    Originally posted by dev_null View Post
    As someone who doesn't know the topic well: why is it so hard? In my imagination you should just use floats in shaders, which you mostly do anyway, then change the type of the screen surface to something like r16g16b16 instead of r8g8b8, make RGB 1,1,1 the maximum white in SDR so as not to break old code and software, and anything above that is actually HDR. That's it. Why invent 5 or so standards and make them super complicated?
    I'm not sure what specifically you are referring to, so here is a crash course; TLDR at the bottom.

    1a. Why is HDR hard?
    There are a lot of reasons why even straight HDR is still hard. HDR has made a lot of progress on a lot of fronts, the most important being the transfer curve. First, it's important to know that a transfer curve manipulates an image (videos are a series of images) so that we can accurately emulate how humans physically perceive light. Old transfer curves use what's called relative luminance. This means brightness goes from 0.0 to 1.0, or more accurately from 0% brightness to 100% brightness. This worked fine back when all displays were roughly 80 nits, or 200 nits, or 160 nits, etc.

    However, now we have HDR. HDR comes along and now we have displays that can be 350 nits, 1000 nits, and anywhere in between. Relative luminance obviously won't work any more (it was already struggling), as 90% brightness on a 400 nit display versus 90% brightness on a 1000 nit display is the difference between a decent lamp and actually blindingly bright at night. So HDR transfer curves like PQ and HLG use an absolute luminance system: if you have, say, RGB(0.5, 0.5, 0.5), no matter what, we always know what the rendering intent of that is, that it will be roughly around 90 nits, and that RGB(0.8, 0.8, 0.8) will be about 1500 nits.
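
    For a rough idea of what an absolute-luminance transfer looks like in practice, here is a minimal Python sketch of the PQ (SMPTE ST 2084) EOTF using the constants from the spec; it is only an illustration, not production color code:

    Code:
    # PQ (SMPTE ST 2084) EOTF sketch: maps a non-linear code value in 0..1
    # to an absolute luminance in nits (cd/m^2), independent of the display.
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    def pq_to_nits(e):
        """PQ code value (0..1) -> absolute luminance in nits."""
        p = e ** (1 / m2)
        return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    print(round(pq_to_nits(0.5)))   # ~92 nits, on every PQ display
    print(round(pq_to_nits(0.8)))   # ~1555 nits

    An sRGB value has no such absolute anchor: 1.0 just means "as bright as this particular display happens to go", which is exactly the relative luminance problem described above.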

    This poses additional issues, however. Even if it's a large improvement, we still need to know things like "how bright is the display?" If you only have a 400 nit monitor, you simply can't put out 1000 nits of brightness, so you need to tonemap 1000 nits down to 400 nits. Should the monitor handle this? In many cases it can, but maybe it shouldn't always, especially since we may need to display two different images with different peak brightness: how should you display a properly tonemapped image designed for 400 nits and a 1000 nit image at the same time? There are hundreds of questions like these we need to answer.
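
    Just to illustrate what "tonemap 1000 nits down to 400 nits" can mean, here is one deliberately simple roll-off curve as a Python sketch (an extended-Reinhard style curve; real tonemappers, and whatever a given compositor or monitor actually does, are considerably more involved):

    Code:
    # Squeeze content mastered for a 1000 nit peak onto a 400 nit display:
    # dark and mid tones pass through nearly unchanged, highlights roll off
    # smoothly so the content peak lands exactly on the display peak.
    def tonemap_nits(nits, content_peak=1000.0, display_peak=400.0):
        l = nits / display_peak
        w = content_peak / display_peak
        return display_peak * l * (1 + l / (w * w)) / (1 + l)
    print(round(tonemap_nits(10)))     # ~10  (shadows untouched)
    print(round(tonemap_nits(100)))    # ~83  (mid tones mildly compressed)
    print(round(tonemap_nits(1000)))   # 400  (content peak -> display peak)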

    1b. Why is SDR on HDR hard?

    I already went over tonemapping and relative vs absolute luminance, but how do you display relative luminance images, where we don't know the intended luminance, on an absolute luminance display? The general "media" approach to this is to map white to 203 nits. That is fine for consuming media, but what about producing media? sRGB's spec is set for 80 nits, AdobeRGB for 160 nits, BT.2100 recommends mapping to 140 nits, etc. As you can guess from this alone, it's incredibly hard to decide how to do the mapping.
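
    As a small sketch of that "media" approach in Python: linearize the sRGB pixel, then pick a number of nits that "SDR white" should mean. The 203 nit figure is the one mentioned above; swapping in 80 or 160 is exactly the ambiguity being described.

    Code:
    # Place a relative (sRGB) pixel into an absolute-luminance composition.
    def srgb_to_linear(c):
        """sRGB non-linear 0..1 -> linear light 0..1 (still relative)."""
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    SDR_WHITE_NITS = 203.0   # the "media" choice; 80 or 160 are equally defensible
    def sdr_to_nits(c):
        return srgb_to_linear(c) * SDR_WHITE_NITS
    print(round(sdr_to_nits(1.0)))   # 203 nits for full SDR white
    print(round(sdr_to_nits(0.5)))   # ~43 nits for sRGB code value 0.5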

    Also, what tonemapping do we use? There are many different tonemappers. What is the rendering intent of the image? And so on and so forth. So basically we are adding onto the issues we already have with straight HDR, but now we really need to know the image profile, so we can try and guess the intended nits of the SDR image to map it properly (this is a losing battle anyway, since some people just don't care and create sRGB images with an intended peak of around 160 nits).

    Mixing transfers is just a hard thing to do, because you also need to figure out the intended luminance; sometimes this can be fairly easy, but often it can be a real pain.

    1c. Gamut

    The above doesn't even bother to account for gamut mapping; gamut is the range of colors an image can produce. It turns out that when we muck about with color, this is also a problem: in HDR we need to compress, expand, and map the range of colors too. Without getting too in-depth and risking talking about things I don't know, if you see that funky triangle diagram, that's showing gamut. Being able to convert gamut while not making the image look like trash is hard.
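
    For the easy direction at least, going from a smaller gamut into a larger container is just a 3x3 matrix on linear RGB; the sketch below uses the commonly published BT.709-to-BT.2020 coefficients (rounded, as given in ITU-R BT.2087). The hard part described above is the reverse trip: squeezing wide-gamut colors back into a small gamut without visible damage.

    Code:
    # Linear BT.709 RGB -> linear BT.2020 RGB (coefficients rounded to 4 places).
    M_709_TO_2020 = [
        [0.6274, 0.3293, 0.0433],
        [0.0691, 0.9195, 0.0114],
        [0.0164, 0.0880, 0.8956],
    ]
    def bt709_to_bt2020(rgb):
        return [sum(M_709_TO_2020[i][j] * rgb[j] for j in range(3)) for i in range(3)]
    # Pure 709 red only reaches part of the way into the 2020 triangle:
    print(bt709_to_bt2020([1.0, 0.0, 0.0]))   # [0.6274, 0.0691, 0.0164]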

    1d. Whitepoint

    The last issue here is whitepoint. Not all transfers have the same whitepoint; this means "white" can look more blue, or more yellow/red, depending on what colorspace it is. Now we have to somehow map "white" when "white" looks different from source to destination.

    2. Bitdepth

    Bitdepth actually isn't hard at all; we solved this a long time ago. It's the other things that are holding bitdepth back. Wayland is not some kind of rolling release: you want as few color protocols as possible, which means things like color depth are being held back by the rest of the color management stuff.
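
    To show how simple the bitdepth part really is compared to everything else in this thread, this is essentially the whole difference:

    Code:
    # Bit depth is just how finely the 0..1 signal is quantized. The extra
    # 10-bit steps matter most once a PQ-style curve stretches those codes
    # across a 0..10000 nit range.
    for bits in (8, 10):
        levels = 2 ** bits
        print(f"{bits}-bit: {levels} steps, smallest step = {1 / (levels - 1):.6f}")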

    3. Why not just use shaders?

    GPUs have LUTs and all sorts of color management stuff baked in. Yes, some shaders will likely get used, but when you set a LUT in the GPU (fixed function) it uses a LOT less energy than using something like a shader to transform the image. This may not sound like a lot, but when you need to run shaders on the entire screen, every refresh, and then also run shaders per layer per update, GPU usage will absolutely skyrocket.
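
    The fixed-function idea in miniature, as a Python sketch: bake an arbitrary per-channel transfer into a small 1D table once, then every pixel becomes a cheap lookup (which the display hardware does essentially for free) instead of re-running the math per pixel in a shader.

    Code:
    # Bake a transfer curve (here a simple 1/2.2 encoding gamma) into a 1D LUT,
    # then apply it by table lookup instead of evaluating pow() per pixel.
    LUT_SIZE = 1024
    lut = [(i / (LUT_SIZE - 1)) ** (1 / 2.2) for i in range(LUT_SIZE)]
    def apply_lut(value):
        """Nearest-entry lookup; real hardware interpolates between entries."""
        return lut[round(value * (LUT_SIZE - 1))]
    print(round(apply_lut(0.5), 3))      # ~0.73
    print(round(0.5 ** (1 / 2.2), 3))    # ~0.73, same curve evaluated directly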

    TLDR: color is hard. Emulating and representing it accurately is very complicated when you need to mix multiple formats, and doing this in shaders will simply be too expensive to do everywhere.

