I really hope they can work together and pull this off. I love Linux, but I feel like it's stuff like this that will forever keep it niche. HDR has been around for at least like 5 years. By the time this works on Linux everything will have moved to 8K and some new UltraHDR standard.
AMD Continues Working Toward HDR Display Support For The Linux Desktop
-
Originally posted by Anux View Post
Sure, there is no need for standard office/desktop stuff to be HDR; you don't want to stress your eyes with bright lights while working 8 hours.
Although HDR is a big improvement in picture quality, you don't need high-end quality everywhere. The jump from standard-definition television to 1080p was much more valuable: I don't know many people who like watching old 480i content, while there is no problem watching something without HDR.
But well... this is just my ideal. In real life, the movie studios are using HDR to put dark scenes dark and bright scenes bright, all in absolute luminosity, and the result only looks okay in a cinema / darkroom environment. For the common TV / computer / smartphone environment, the tone mapping should have been done the way I described in the previous paragraph. Let's see if game studios get it right one day, as game developers are less likely to keep pretending everyone games in darkrooms.
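The tone-mapping idea above can be sketched numerically. This is a minimal illustration using the simple Reinhard operator, chosen here only for brevity and not as any studio's actual pipeline; the `white_point_nits` parameter is a hypothetical knob standing in for the environment-dependent mapping the commenter is asking for.

```python
# Toy sketch (assumption: plain Reinhard tone mapping) of compressing
# absolute HDR scene luminance into a display's relative [0, 1] range.
def reinhard_tonemap(luminance_nits, white_point_nits=200.0):
    """Map absolute scene luminance (cd/m^2) to relative display output.

    Raising white_point_nits for a bright viewing environment keeps
    mid-tones visible instead of preserving absolute darkness.
    """
    l = luminance_nits / white_point_nits  # normalise to display white
    return l / (1.0 + l)                   # compress highlights, keep detail

# A 1000-nit highlight is compressed rather than clipped to pure white:
for nits in (10, 100, 1000):
    print(nits, round(reinhard_tonemap(nits), 3))
# prints: 10 0.048 / 100 0.333 / 1000 0.833
```

The point of the sketch is that the curve never reaches 1.0, so bright spots keep gradation instead of blowing out, which is the behaviour the comment argues HDR should deliver.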
-
Originally posted by mppix View Post
This is well in progress, but there is a lot to do...
-
Originally posted by billyswong View Post
This shows how misled people are in their expectations of HDR. HDR shouldn't have been about making everything brighter. It should have been about giving bright spots in a scene a chance to contain detail and colour instead of being a patch of white. In office/desktop stuff, this translates to a way to present metallic ink.
But well... this is just my ideal. In real life, the movie studios are using HDR to put dark scenes dark and bright scenes bright, all in absolute luminosity, and the result only looks okay in a cinema / darkroom environment. For the common TV / computer / smartphone environment, the tone mapping should have been done the way I described in the previous paragraph. Let's see if game studios get it right one day, as game developers are less likely to keep pretending everyone games in darkrooms.
HDR displaying sRGB content can be a nightmare if done wrong. An example is Darth Vader in a particular Star Wars scene: when it was shot, he was standing in the background in front of a black screen, so he was not meant to be seen.
Part of people's expectations about HDR come from the fact that most HDR monitors are, technically, quite bad.
We are now starting to see, as above, what is called true 10-bit. These panels have true black and a genuine 10 bits per pixel. The so-called HDR monitors with zone dimming are effectively 8-bit per pixel with an averaged brightness adjustment. It is still going to be a while before true 10-bit HDR makes it into consumer monitors.
Monitor panel bit colour depth may seem confusing, but this article helps simplify the pros and cons of 10-bit vs 8-bit + Frame Rate Control (FRC).
Yes, monitor makers claim 8-bit + FRC has been good enough. Note that this led to a horrible problem: a person mastering video on a true 10-bit screen sees something that does not match, in any way, shape or form, what the end viewer will see on an 8-bit + FRC screen. Many studios are not bothering with HDR production because of this problem. With sRGB, what you see in production and what the person sees on cheaper screens are much closer.
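The 8-bit + FRC trick being discussed can be illustrated in a few lines. This is a toy sketch that assumes FRC is plain temporal dithering between two adjacent 8-bit codes; real panels use more elaborate spatio-temporal patterns.

```python
# Toy model of Frame Rate Control (FRC): approximating a 10-bit code
# (0..1023) on an 8-bit panel by alternating adjacent 8-bit codes so the
# time-average of the emitted light lands near the 10-bit level.
def frc_frames(value10, n_frames=4):
    base, frac = divmod(value10, 4)  # 10-bit code = 8-bit code * 4 + remainder
    # Emit the higher 8-bit code in `frac` of the 4 frames.
    return [min(base + (1 if i < frac else 0), 255) for i in range(n_frames)]

print(frc_frames(512))  # exact multiple of 4: [128, 128, 128, 128]
print(frc_frames(513))  # averages to 128.25:  [129, 128, 128, 128]
```

The sketch also shows why mastering on true 10-bit and viewing on 8-bit + FRC diverge: the FRC panel only ever shows the two neighbouring 8-bit codes, relying on the eye to average them over time, rather than displaying the intended level in any single frame.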
billyswong I would not say people have been misled as such; rather, the quality of the HDR experience has not been great. Something else to be aware of: the sRGB we see on our current monitors is in most cases not ideal. People like old CRT monitors because they have a bigger contrast range.
A bigger contrast range also means being able to show more colours. One of the horrible realities is that we will never be able to display metallic ink correctly on a monitor, and HDR does not change this. The issue is that metallic ink is not one colour; the funny thing is that the left and right eye in fact see slightly different colours from metallic ink. An HDR VR headset might be able to do metallic ink. (Indeed, a person who has lost an eye has a hard time telling metallic ink apart from non-metallic ink of the same colour.) The bigger colour space gives HDR a better chance of reproducing one of the colours of a metallic ink. A CRT monitor, or an OLED monitor with individual pixel brightness control, also has a better chance of reproducing the colour of metallic ink, even in sRGB, due to its higher contrast.
The reality is that for the first five years of HDR, the monitors have really not been ideal. The 8-bit + FRC approach to emulating 10-bit HDR is not much of a step up from 8-bit sRGB; it seriously gives the appearance that the HDR difference is just brightening up sections of the image. The step from 8-bit + FRC or 8-bit sRGB to true 10-bit HDR is quite a step up. Remember, we also took quite a step down when we went from sRGB on CRT to LCD; 8-bit + FRC mostly just ends up reducing the gap between CRT sRGB quality and LCD quality.
CRT sRGB was very close to the gold standard of sRGB quality, just as individual-pixel 10-bit OLED is the gold standard for HDR. 8-bit LCD screens have never been the gold standard in colour display.
-
Originally posted by billyswong View Post
This shows how misled people are in their expectations of HDR. HDR shouldn't have been about making everything brighter.
and the result only looks okay in a cinema / darkroom environment.
Originally posted by oiaohm View Post
The reality is that for the first five years of HDR, the monitors have really not been ideal. The 8-bit + FRC approach to emulating 10-bit HDR is not much of a step up from 8-bit sRGB; it seriously gives the appearance that the HDR difference is just brightening up sections of the image. The step from 8-bit + FRC or 8-bit sRGB to true 10-bit HDR is quite a step up. Remember, we also took quite a step down when we went from sRGB on CRT to LCD; 8-bit + FRC mostly just ends up reducing the gap between CRT sRGB quality and LCD quality.
-
Originally posted by Anux View Post
Cheap SDR displays are still 6-bit + FRC, and I highly doubt any CRT can hold up to the contrast of a VA display, even one with 8-bit + FRC.
The reality is that an 8-bit CRT in fact has higher contrast than an 8-bit + FRC LCD. The issue is true black. A CRT also has a wider viewing angle over which it remains colour correct.
Remember, even the cheapest SVGA CRTs were still 8-bit SDR. True 10-bit colour displays with a true LED per pixel have the same properties as the phosphors of CRT screens: wide viewing angles with correct colour display, and really close to true black.
VA displays are the best of the LCD technologies when it comes to how close they get to true black, but they are still a long way off compared to CRT.
Anux it is a surprise to a lot of people. The massive thing is that the old CRTs had good colour, but they were huge on the desk, bad on power usage and bad on radiation exposure. LCD provided a set of advantages that we traded away some colour correctness for. Per-pixel LED displays now allow us to get that colour correctness back.
Please note that the first ever 10-bit HDR screen was not an LCD but in fact a CRT, so yes, there are CRT screens that can absolutely beat 8-bit + FRC VA displays.
-
There is good progress already happening in Wayland around color management as well as HDR. There are a lot of edge cases and decisions that have to be made, but they are trying to do it "right" this time, like having surfaces announce their color space, depth, HDR curve and such to the display manager, instead of letting the application do the color management / transformation.
But this comes with other challenges, like performance: you have to do the color space transformation and gamma curve calculation on the GPU, for example, as otherwise the buffer would need to travel between GPU and CPU so that the color transformation could happen there, which would have a serious performance impact.
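To make the GPU-side math concrete, here is a CPU sketch in Python of the per-pixel work such a shader would perform: decode the sRGB transfer curve to linear light, then rotate the primaries with a 3x3 matrix. The matrix used is the standard BT.709-to-BT.2020 primaries conversion; it stands in for whatever source/target pair the compositor actually negotiates, so treat the specific numbers as illustrative.

```python
# Sketch of compositor-style color management: sRGB EOTF decode followed
# by a 3x3 primaries conversion, all in linear light. On real hardware
# this runs per-pixel in a GPU shader; Python is used here for clarity.
def srgb_eotf(c):
    """sRGB electro-optical transfer: encoded [0, 1] -> linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Standard BT.709 -> BT.2020 RGB conversion matrix (linear light).
BT709_TO_BT2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

def to_bt2020_linear(rgb):
    """Decode an sRGB pixel and express it in linear BT.2020 primaries."""
    r, g, b = (srgb_eotf(c) for c in rgb)
    return tuple(m[0] * r + m[1] * g + m[2] * b for m in BT709_TO_BT2020)

# Pure sRGB red maps inside the wider gamut rather than to its edge:
print(to_bt2020_linear((1.0, 0.0, 0.0)))
```

Doing this on the CPU would mean reading the client buffer back from the GPU, transforming every pixel, and uploading it again, which is exactly the round trip the comment says would have a serious performance impact.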
There are a lot of merged changes already: https://gitlab.freedesktop.org/wayla...sts?label_name[]=Colour%20management
Last edited by Spacefish; 07 October 2022, 07:49 PM.
-
Originally posted by oiaohm View Post
The reality is that an 8-bit CRT in fact has higher contrast than an 8-bit + FRC LCD.
The issue is true black.
VA displays are the best of the LCD technologies when it comes to how close they get to true black, but they are still a long way off compared to CRT.
Edit: Actually you're right, CRT has darker blacks than VA. But VA is already dark enough that you have more problems with light reflecting off the walls than with display glow. So unless your walls are black/unreflective and no one sits in front of the CRT, you probably gain nothing from the darker black.
CRT had good colour
Please note the first ever 10bit HDR screen was not a LCD screen but was in fact a CRT screen
At the time of the change, LCD was really a step back in image quality (mostly TN panels), and we had another step back in resolution with the 4:3 to 16:9 transition. But today even modestly priced LCDs are better than CRTs in nearly all categories.
-
Originally posted by Spacefish View Post
But this comes with other challenges, like performance: you have to do the color space transformation and gamma curve calculation on the GPU, for example, as otherwise the buffer would need to travel between GPU and CPU so that the color transformation could happen there, which would have a serious performance impact.