AMD GPU Driver Developers Pursuing New HDR Display Work For Linux


  • billyswong
    replied
    Originally posted by dimko View Post
    About the first: it's not apples to apples, but what can be said about bad HDR can be said about good SDR screens:
    https://www.youtube.com/watch?v=0srgPxK-WhQ

    If you don't need games and videos on screen or pro video editing - a 1080p 30 Hz LCD is enough for you, no?
    Most people who actively use computers do at least some of this on a daily basis. Heck, even YouTube supposedly supports HDR.
    The YouTube video shows how current "HDR" often does things wrong. When a monitor is running in "SDR" mode, a game can squeeze the brightness of the scene into the SDR range without losing detail in the bright or dark regions. In HDR mode, it ought to be capable of making the bright regions truly brighter (and/or the dark regions truly darker) without any loss of detail either. How much brighter the bright regions can go depends on the monitor's capability, but in no case should enabling "HDR" cause blunt white-clamping.

    SDR monitors / printed photos need strong local tone mapping to keep everything clear in a high-contrast scene. With an HDR monitor, the same clarity can be achieved with weaker tone mapping. How many nits a monitor shines is no excuse for loss of detail in the bright or dark regions of a scene. Loss of detail is either an artistic choice, or lazy technology because doing things properly is deemed too computationally intensive.
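    A rough sketch of the difference in Python (illustrative only; the knee fraction and the scene/display values are assumptions, not anything from the thread): a soft shoulder keeps highlight gradations distinguishable on both a 300-nit and a 1000-nit display, whereas blunt white-clamping collapses everything above the peak into the same flat white.

        # Sketch: "blunt white-clamping" vs. a soft highlight roll-off.
        # Knee fraction and scene/display values are illustrative assumptions.

        def hard_clip(scene_nits, display_peak_nits):
            """Everything above the display peak collapses to the same flat white."""
            return min(scene_nits, display_peak_nits)

        def soft_rolloff(scene_nits, display_peak_nits, knee_fraction=0.75):
            """Pass values below a knee through unchanged, then compress the rest with
            a rational shoulder so brighter scene values stay distinguishable and the
            output approaches (but never hard-clips at) the display peak."""
            knee = knee_fraction * display_peak_nits
            if scene_nits <= knee:
                return scene_nits
            headroom = display_peak_nits - knee
            excess = scene_nits - knee
            return knee + headroom * excess / (excess + headroom)

        for peak in (300.0, 1000.0):                        # SDR-ish vs. HDR-ish display
            for scene in (100.0, 400.0, 1600.0, 4000.0):    # scene luminance in nits
                print(peak, scene, hard_clip(scene, peak), round(soft_rolloff(scene, peak), 1))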

    Originally posted by dimko View Post
    Nope, you can't really do that. Someone said something stupid and someone else repeated it.

    You do have light bulbs in your room, or have seen the sun, right?
    And even then, HDR monitors are NOWHERE near as bright as those two. Otherwise everyone would buy screens to light up giant spaces, like football stadiums. It's LCD, so it would save tonnes of money on power, right?
    I don't stare at a light bulb or the sun directly, especially not for a long period. But for media content creators, I agree they may want a screen with the "HDR white" maximized when authoring / editing.
    Last edited by billyswong; 01 May 2021, 10:52 PM.



  • Spacefish
    replied
    Originally posted by sandy8925 View Post

    I'm more interested in a wide colour gamut, not insane levels of brightness... I'm good with 300 nits, thanks.
    Color management is a mess on Linux, just as it is on Windows.

    Popping colors sell in the store; that's why a lot of monitors have significantly larger color gamuts than sRGB. They traditionally have, as cathode-ray monitors had a very large color gamut compared to the early TFTs.

    Correct behaviour would be to interpret all "legacy" 8-bit content as sRGB if not marked otherwise (as some pictures or movies are). So basically have 10/12-bit output from the compositor to the display and hammer down any 8-bit surface that does not define a colorspace and/or brightness to sRGB at 200 nits or something like that.

    If you do this, the whole GUI tends to look really sad / colorless.
    Windows implemented it that way in its first "HDR" implementation. People widely complained that their screen looked "washed out" when enabling HDR, which is why Microsoft changed it back, such that only the peak brightness is limited for surfaces not advertising any color support, while the gamut is display-native / wrong.
    In later versions they added a setting to change the peak brightness for "SDR" content.

    IMHO the best solution would be to always interpret legacy content as sRGB, or at least make it user-settable in the compositor (roughly as in the sketch below).
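    A minimal sketch of that mapping in Python, assuming untagged 8-bit content is treated as sRGB pinned to a chosen SDR reference white (200 nits here, just an assumption) and the compositor drives a 10-bit PQ (SMPTE ST 2084) output; gamut conversion is left out and the function names are made up for illustration:

        # Sketch: decode an untagged 8-bit sRGB code value, pin sRGB white to a chosen
        # reference level, and re-encode it for a 10-bit PQ (ST 2084) signal.
        # Gamut mapping (BT.709 -> BT.2020 primaries) is omitted for brevity.

        SDR_REFERENCE_NITS = 200.0          # the user-settable "SDR brightness"

        def srgb_to_linear(code8):
            """8-bit sRGB code value -> linear light in 0.0..1.0 (sRGB EOTF)."""
            c = code8 / 255.0
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

        def linear_to_pq(nits):
            """Absolute luminance in nits -> PQ non-linear signal in 0.0..1.0 (ST 2084)."""
            m1, m2 = 2610 / 16384, 2523 / 4096 * 128
            c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
            y = max(nits, 0.0) / 10000.0
            return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

        def compose_untagged_srgb(code8):
            """What the compositor could do with a surface that declares no colorspace."""
            nits = srgb_to_linear(code8) * SDR_REFERENCE_NITS   # sRGB white -> 200 nits
            return round(linear_to_pq(nits) * 1023)             # 10-bit PQ code value

        print(compose_untagged_srgb(255))   # legacy "paper white"
        print(compose_untagged_srgb(128))   # legacy mid grey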

    Furthermore, current "color managed" applications basically do the color-space conversions inside the app. They just read the monitor profile and map their output to it. This works, but it is the wrong way around for the future IMHO. The app should mark the surface it is rendering with a specific colorspace / gamma curve, and the compositor / graphics stack of the OS should do all the mapping to the output devices, as in the sketch below.
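    A tiny sketch of that model (names are made up, not any real Wayland/X11 protocol): the app only tags its surface with a colorspace, and the compositor picks and applies the conversion for whatever output is connected.

        # Sketch of the "tag, don't convert" model: the app declares the color space of
        # its surface; the compositor owns the conversion to the output. The conversion
        # functions here are placeholders (real ones would do EOTF decode, gamut mapping
        # and PQ encode, as in the sketch above).
        from dataclasses import dataclass

        @dataclass
        class Surface:
            buffer: list                 # pixel data exactly as the app rendered it
            color_space: str = "srgb"    # untagged legacy content defaults to sRGB

        def composite(surface, output_space):
            """Compositor side: one conversion per (source space, output space) pair."""
            if surface.color_space == output_space:
                return surface.buffer                    # nothing to do, direct scanout
            conversions = {
                ("srgb", "pq-bt2020"):  lambda px: [v / 255.0 for v in px],    # placeholder
                ("scrgb", "pq-bt2020"): lambda px: [max(0.0, v) for v in px],  # placeholder
            }
            return conversions[(surface.color_space, output_space)](surface.buffer)

        # The app never reads the monitor profile; it only tags what it produced:
        print(composite(Surface(buffer=[0, 128, 255]), "pq-bt2020"))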



  • sandy8925
    replied
    Originally posted by [email protected] View Post
    True HDR displays cost a lot of money. Most of what you find out there is mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm among the Linux (developer) crowd for these things.

    I imagine that, having to choose, most developers will invest in a better CPU/memory rather than in a fancy display that features something they don't need in their normal workflow.
    I'm more interested in a wide colour gamut, not insane levels of brightness... I'm good with 300 nits, thanks.



  • MadeUpName
    replied
    Originally posted by Zan Lynx View Post

    These days the monitor probably will be HDR. But back in 2012 there were professional 10-bit color monitors like the HP ZR30w I am using right now. They were used professionally. Worked very well too.

    One interesting thing is that with the correct software and color profile this monitor can mostly reproduce HDR 400. Back before HDR was implemented on computer hardware, that is how cinema professionals at Pixar, Disney and other places did it.
    Sure, and if they were selling it today it would have a big fat HDR sticker on it. The problem is you can't feed it an HDR signal from Linux, so it would work as a crazy expensive SDR monitor. Beyond that, no one was delivering in Rec.2020 because it didn't exist then, never mind the ACES color space, which is now the industry standard for intermediates.



  • Zan Lynx
    replied
    Originally posted by MadeUpName View Post
    Professionals doing color grading these days will grade on an HDR monitor, then output different versions depending on where it will be used: broadcast, movie theaters, etc. You can't color grade on an SDR monitor, then output HDR, and expect to get good results. Color grading in HDR, even if you're not planning on displaying it in HDR now, leaves it future-proofed. The fact that you can't use an HDR monitor with Linux means it is not useful for these professionals.
    These days the monitor probably will be HDR. But back in 2012 there were professional 10-bit color monitors like the HP ZR30w I am using right now. They were used professionally. Worked very well too.

    One interesting thing is that with the correct software and color profile this monitor can mostly reproduce HDR 400. Back before HDR was implemented on computer hardware, that is how cinema professionals at Pixar, Disney and other places did it.



  • dimko
    replied
    Originally posted by [email protected] View Post
    True HDR displays cost a lot of money. Most of what you find out there is mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm among the Linux (developer) crowd for these things.
    Chicken-and-egg situation.



  • dimko
    replied
    Originally posted by billyswong View Post
    Thanks for the explanation, guys. No, I have no interest in seeing an explosion scene harm my eyes realistically. If anyone wants HDR screens to replace all SDR screens and become the new standard, HDR screens need a knob / setting for clamping the maximum brightness.
    Nope, you can't really do that. Someone said something stupid and someone else repeated it.

    You do have light bulbs in your room, or have seen the sun, right?
    And even then, HDR monitors are NOWHERE near as bright as those two. Otherwise everyone would buy screens to light up giant spaces, like football stadiums. It's LCD, so it would save tonnes of money on power, right?



  • dimko
    replied
    Originally posted by billyswong View Post

    An HDR display, to my knowledge, is about showing colors brighter than paper-white

    _____________________________

    Maybe if we can think up HDR usage outside gaming / movies, there will be more enthusiasm to implement HDR among the open-source crowd? How about emulating fluorescence and metallic ink on paper? We would get syntax highlighting that is literally "highlighted"~
    About the first: it's not apples to apples, but what can be said about bad HDR can be said about good SDR screens:
    https://www.youtube.com/watch?v=0srgPxK-WhQ

    If you don't need games and videos on screen or pro video editing - a 1080p 30 Hz LCD is enough for you, no?
    Most people who actively use computers do at least some of this on a daily basis. Heck, even YouTube supposedly supports HDR.



  • dimko
    replied
    I am shining on the inside now



  • MadeUpName
    replied
    Photographers/cinematographers measure light in stops, with one stop being either a doubling or halving of brightness. When you are looking at a sunset with objects casting shadows, the difference in brightness from the deepest black to the brightness of the sun is about 20 stops. That is the dynamic range. An Alexa camera can capture about 18 stops. A normal cinema camera about 14-16. A fairly high-end DSLR/mirrorless camera ~14. A Rec.709 monitor like the vast majority of people are using has 8 stops. A movie theatre has about 10 stops. Display technology totally sucks compared to camera technology.
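    For concreteness, one stop is a factor of two, so n stops span a contrast ratio of 2^n : 1. A quick Python sanity check using the figures above (rounding "about 14-16" to 15 is mine):

        # One stop = a doubling of light, so n stops correspond to a 2**n : 1 ratio.
        devices = {
            "sunset scene":              20,
            "Alexa camera":              18,
            "typical cinema camera":     15,   # "about 14-16"
            "high-end DSLR/mirrorless":  14,
            "movie theatre":             10,
            "Rec.709 monitor":            8,
        }

        for name, stops in devices.items():
            print(f"{name:26s} {stops:2d} stops  ~ {2 ** stops:>9,}:1 contrast")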

    When the scene you are capturing exceeds the dynamic range of the technology, the bright colors are clipped and become white and the darkest colors are clipped and become black. Both lead to a loss of detail. A prime example you see everywhere is an interior shot where the windows are blown out and go white due to too much dynamic range in the shot. That washed-out look you get on a monitor in a brightly lit room isn't a loss of dynamic range, it is a loss of contrast. A monitor with a higher dynamic range can help counteract that by overpowering the room light. But yes, that will be hard on the eyes if you stare at it for 8 hours straight. However, all monitors, HDR or SDR, allow you to control the brightness. That is needed for calibration. And no, they are not going to burn out your eyes. HDR isn't just pixel values; it also includes metadata that the display will use to turn the backlights up or down to help make brights brighter and darks darker. I spend pretty well my entire day looking at HDR monitors, and on days when I am color grading I do it in a blacked-out room. If you are working on a text document it will just look like an SDR display.
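    To put the blown-out-windows example in the same units (the 20-stop scene and 8-stop monitor are the figures above; the 13-stop HDR display is an assumption): whatever falls outside the display's window of stops is crushed to black or blown to white.

        # Sketch: how many of a scene's stops survive on a display, given where the
        # exposure window is placed. Stops outside the window clip to black or white.
        def surviving_stops(scene_stops, display_stops, window_start=0):
            window_end = window_start + display_stops
            return max(0, min(scene_stops, window_end) - max(0, window_start))

        scene = 20                                   # sunset scene from the post
        print(surviving_stops(scene, 8))             # Rec.709 monitor -> 8 of 20 stops
        print(surviving_stops(scene, 13))            # assumed HDR display -> 13 of 20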

    Professionals doing color grading these days will grade on an HDR monitor, then output different versions depending on where it will be used: broadcast, movie theaters, etc. You can't color grade on an SDR monitor, then output HDR, and expect to get good results. Color grading in HDR, even if you're not planning on displaying it in HDR now, leaves it future-proofed. The fact that you can't use an HDR monitor with Linux means it is not useful for these professionals.
    Last edited by MadeUpName; 30 April 2021, 02:19 PM.

