
AMD GPU Driver Developers Pursuing New HDR Display Work For Linux


  • #21
    Originally posted by [email protected] View Post
    True HDR displays cost a lot of money. Most of what you find out there is mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm among the Linux (developer) crowd for these things.

    I imagine that, having to choose, most developers will invest in a better CPU/memory rather than in a fancy display that features something they don't need in their normal workflow.
    I'm more interested in a wide colour gamut than in insane levels of brightness... I'm good with 300 nits, thanks.

    Comment


    • #22
      Originally posted by sandy8925 View Post

      I'm more interested in a wide colour gamut than in insane levels of brightness... I'm good with 300 nits, thanks.
      Color management is a mess on Linux, just as it is on Windows.

      Popping colors sell in the store; that's why a lot of monitors have significantly larger color gamuts than sRGB. They traditionally have, since cathode ray monitors had a very large color gamut compared to the early TFTs.

      The correct behaviour would be to interpret all "legacy" 8-bit content as sRGB unless it is marked otherwise (as some pictures and movies are). So basically: have 10/12-bit output from the compositor to the display, and clamp any 8-bit surface that does not declare a colorspace and/or brightness to sRGB at around 200 nits.

      If you do this, the whole GUI tends to look really sad / colorless.
      Windows implemented it that way in its first "HDR" implementation. People complained widely that their screens looked "washed out" when enabling HDR, which is why it was changed: now only the peak brightness is limited for surfaces that don't advertise any color support, while the gamut stays display-native / wrong.
      In later versions they added a setting to change the peak brightness for "SDR" content.

      IMHO the best solution would be to always interpret legacy content as sRGB, or at least make it user-settable in the compositor.
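
      As a rough, purely illustrative sketch of that idea (hypothetical helper names; assuming a PQ/HDR10-style output signal and a user-settable SDR white level, and ignoring gamut conversion for brevity), the decode/re-encode could look roughly like this in Python:

          # Hypothetical sketch: present an untagged 8-bit surface on a PQ (HDR10-style)
          # output by treating it as sRGB pinned to a user-settable SDR white level.
          # Formulas are the standard sRGB EOTF and the SMPTE ST 2084 (PQ) inverse EOTF.

          def srgb_to_linear(v8: int) -> float:
              """Decode one 8-bit sRGB channel to linear light (0.0..1.0)."""
              v = v8 / 255.0
              return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

          def linear_to_pq(nits: float) -> float:
              """Encode absolute luminance in nits as a PQ signal value (0.0..1.0)."""
              m1, m2 = 2610 / 16384, 2523 / 4096 * 128
              c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
              y = max(nits, 0.0) / 10000.0          # PQ is defined up to 10,000 nits
              return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

          def map_untagged_channel(v8: int, sdr_white_nits: float = 200.0) -> float:
              """Untagged 8-bit content: assume sRGB and pin its white to sdr_white_nits."""
              return linear_to_pq(srgb_to_linear(v8) * sdr_white_nits)

          print(map_untagged_channel(255))  # sRGB white lands at ~200 nits (~0.58 PQ)
          print(map_untagged_channel(128))  # mid-grey stays proportionally dimmer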

      Furthermore, current "color managed" applications basically do the colorspace conversions inside the app: they read the monitor profile and map their output to it. This works, but it is the wrong way around for the future, IMHO. The app should tag the surface it is rendering with a specific colorspace / gamma curve, and the compositor / graphics stack of the OS should do all the mapping to the output devices.
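
      Conceptually (made-up names for illustration; only the widely published BT.709 -> BT.2020 primaries matrix is standard here), that split looks something like this:

          # Hypothetical sketch of the "app tags, compositor converts" model: the app only
          # labels its surface with a colorspace; the compositor owns the mapping to the display.
          from dataclasses import dataclass

          # Standard BT.709/sRGB -> BT.2020 primaries matrix for linear RGB (rounded).
          BT709_TO_BT2020 = (
              (0.6274, 0.3293, 0.0433),
              (0.0691, 0.9195, 0.0114),
              (0.0164, 0.0880, 0.8956),
          )

          @dataclass
          class Surface:
              pixels: list               # linear-light (r, g, b) tuples
              colorspace: str = "sRGB"   # the only thing the app has to declare

          def to_display(surface: Surface, display_gamut: str) -> list:
              """Compositor-side conversion; the app never touches the monitor profile."""
              if surface.colorspace == "sRGB" and display_gamut == "BT.2020":
                  m = BT709_TO_BT2020
                  return [tuple(sum(m[r][i] * px[i] for i in range(3)) for r in range(3))
                          for px in surface.pixels]
              return surface.pixels      # same gamut: pass through untouched

          # Pure sRGB red keeps its intent instead of being stretched to the display's full red:
          print(to_display(Surface([(1.0, 0.0, 0.0)]), "BT.2020"))
          # -> [(0.6274, 0.0691, 0.0164)]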

      Comment


      • #23
        Originally posted by dimko View Post
        About the first point: it's not apples to apples, but what can be said about bad HDR can also be said about good SDR screens:
        https://www.youtube.com/watch?v=0srgPxK-WhQ

        If you don't need games, videos on screen, or pro video editing, a 1080p 30 Hz LCD is enough for you, no?
        Most people who actively use computers do at least some of this on a daily basis. Heck, even YouTube supposedly supports HDR.
        The YouTube video shows how current "HDR" often does things wrong. When a monitor is running in "SDR" mode, a game can squeeze the brightness of the scene into SDR range without losing detail in the bright or dark regions. In HDR mode, it ought to be capable of making the bright regions truly brighter (and/or the dark regions truly darker), again without any loss of detail. How much brighter the bright regions can go depends on the monitor's capability, but in no case should enabling "HDR" cause blunt white clamping.

        SDR monitors / printed photos need strong local tone mapping to keep everything clear in a high-contrast scene. With an HDR monitor, the same clarity can be achieved with weaker tone mapping. How many nits a monitor can shine is no excuse for losing detail in the bright or dark regions of a scene. Loss of detail is either an artistic choice, or lazy technology because doing things properly is deemed too computationally intensive.
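
        As a toy illustration (my own numbers; a simple global Reinhard-style rolloff rather than the local tone mapping mentioned above), a higher display peak leaves visibly more room between highlights:

            # Toy global tone-mapping curve, one of many possible choices, used only to show
            # why a higher display peak preserves more highlight separation.

            def tone_map(scene_nits: float, display_peak_nits: float) -> float:
                """Near-linear well below the peak, asymptotically approaching the peak above it."""
                return scene_nits / (1.0 + scene_nits / display_peak_nits)

            for peak in (300.0, 1000.0):
                lo, hi = tone_map(1000.0, peak), tone_map(2000.0, peak)
                print(f"peak {peak:4.0f} nits: 1000-nit highlight -> {lo:5.1f} nits, "
                      f"2000-nit highlight -> {hi:5.1f} nits (separation {hi - lo:5.1f})")

            # On a 300-nit panel the two highlights end up ~30 nits apart (detail crushed);
            # on a 1000-nit panel they stay ~167 nits apart, with no hard white clipping.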

        Originally posted by dimko View Post
        Nope, you can't really do that. Someone said something stupid and someone else repeated it.

        You do have light bulbs in your room, or have seen the sun, right?
        And even then, HDR monitors are NOWHERE near as bright as those two. Otherwise everyone would buy screens to light up giant spaces, like football stadiums. It's LCD, so it will save tonnes of money on power, right?
        I don't stare at a light bulb or the sun directly, especially not for a long period. But for media content creators, I agree they may want a screen with "HDR white" maximized when authoring / editing.
        Last edited by billyswong; 01 May 2021, 10:52 PM.

        Comment
