Linux Developers To Meet Again To Work On HDR, Color Management & VRR


  • #51
    Originally posted by Quackdoc View Post
    Gamut isn't really that hard, rec2020/2100 colorspace already is that.
    It would be very unwise to settle for Rec. 2020 as the internal color space, like it was with sRGB (or linear RGB) in the past. What you need is a color space that covers all visible colors, like CIE XYZ, CIELAB or Oklab.
    The majority of content is mastered for DCI-p3. But we also have scRGB which is a modification of sRGB to allow a much greater gamut, and a linear transfer. Either scRGB or bt.2020 would be fine.
    It might be fine in the short term, but as soon as we get a new color space, say a future Rec. 2030, we would have to rework the whole HDR stack. Why not do it right from the beginning?
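
    As a rough sketch of why a device-independent connection space future-proofs the stack: with CIE XYZ in the middle, supporting a new source or output space only means adding one matrix, and nothing downstream needs rework. The matrices below are the standard published linear-RGB-to-XYZ matrices for sRGB and BT.2020 (D65); note how a pure BT.2020 red lands outside [0,1] in linear-sRGB/scRGB coordinates instead of being lost.

    import numpy as np

    # Standard published linear-RGB -> CIE XYZ matrices (D65 white point).
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    BT2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                              [0.2627, 0.6780, 0.0593],
                              [0.0000, 0.0281, 1.0610]])

    def convert(rgb, src_to_xyz, dst_to_xyz):
        """Route any linear-RGB triple through XYZ as the connection space."""
        return np.linalg.inv(dst_to_xyz) @ (src_to_xyz @ np.asarray(rgb))

    # A pure BT.2020 red expressed in linear sRGB / scRGB coordinates:
    # components fall outside [0,1], which scRGB's extended range can represent.
    print(convert([1.0, 0.0, 0.0], BT2020_TO_XYZ, SRGB_TO_XYZ))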

    This is two things, gamut mapping and tone mapping. Gamut mapping isn't that hard
    It's as hard as tone mapping, maybe a little less visible because most of the time we don't use much of the available gamut.
    We have been doing gamut mapping since the 90s or even before. It's mostly a solved problem.
    As we have with tone mapping, but it's not a solved problem, since there is no mathematically correct way to do it. You are mapping bigger gamuts to smaller ones. It's a lossy process: you can clip or map/compress (with a plethora of different transfer curves), but there is no right way to do it, you can only do what looks subjectively good to you.
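
    To make the clip-versus-compress point concrete, here is a toy comparison on a single out-of-range channel value; the knee position and roll-off curve are arbitrary illustration choices, not any standard's mapping.

    def hard_clip(x, lo=0.0, hi=1.0):
        """Discard everything outside the target range."""
        return min(max(x, lo), hi)

    def soft_compress(x, knee=0.8, hi=1.0):
        """Leave values below the knee untouched, then roll off smoothly toward hi.
        Every vendor picks its own knee and curve, which is exactly the
        'no single right answer' problem."""
        if x <= knee:
            return x
        over = x - knee
        return knee + (hi - knee) * over / (over + (hi - knee))

    for v in (0.5, 0.9, 1.2, 2.0):
        print(v, hard_clip(v), round(soft_compress(v), 3))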

    Tone mapping is much harder
    It's as hard as gamut mapping, it's just more noticeable since we have much better brightness perception than color perception.

    Tone mapping actually doesn't look that bad, even when mapping down to 600 or even 400 nits, it still looks way better than sRGB, it's just a matter of properly mapping it.
    Looking good/better is very subjective. Imagine two viewing conditions, both with a 400-nit display, but one in a totally dark room and one in a lit room. For the dark room you could just make everything darker and still have a nice HDR experience, but in the lit room the same result would be much too dark, and if you instead raise the average brightness, most highlights will be blown out. Both will look bad.
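
    A minimal sketch of the viewing-condition problem: the same Reinhard-style roll-off from 1000-nit content onto a 400-nit panel, with a surround-dependent exposure term in front of it. The exposure numbers are made up for illustration; a real system would derive them from a sensor or a user setting.

    def tonemap_nits(nits_in, peak_in=1000.0, peak_out=400.0, exposure=1.0):
        """Extended-Reinhard roll-off in nits: roughly linear near black,
        compresses highlights so peak_in always lands on peak_out."""
        l = (nits_in * exposure) / peak_out        # luminance in output-peak units
        l_white = (peak_in * exposure) / peak_out  # brightest input, same units
        l_out = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
        return l_out * peak_out

    # Hypothetical surround adjustment: a dark room tolerates a darker image,
    # a lit room needs brighter midtones at the cost of flatter highlights.
    for surround, exposure in (("dark room", 0.8), ("lit room", 1.5)):
        print(surround, [round(tonemap_nits(n, exposure=exposure), 1)
                         for n in (10, 100, 500, 1000)])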

    Fixing the peak nits of the display fixes the majority of the issues.
    Yes, as is always the case when we don't know the display's properties.
    IMO this is something that should be signaled in the EDID, but it's usually fine to ask the user or default to 1k nits.
    Even more, in my opinion: the EDID should contain a complete color profile as well as peak and average brightness, plus there should be a room sensor to know the viewing conditions. This would allow for the best possible experience each device can offer.
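
    For reference, peak brightness can already be signaled: the CTA-861 HDR Static Metadata block that rides along with the EDID carries coded max, max frame-average, and min luminance values. A small sketch of how those coded bytes decode, as I read the spec (the actual EDID parsing and byte offsets are omitted):

    def max_luminance_nits(coded: int) -> float:
        """Desired Content Max Luminance, CTA-861 coding: 50 * 2^(cv/32) cd/m^2."""
        return 50.0 * 2.0 ** (coded / 32.0)

    def min_luminance_nits(coded: int, max_nits: float) -> float:
        """Desired Content Min Luminance: max * (cv/255)^2 / 100 cd/m^2."""
        return max_nits * (coded / 255.0) ** 2 / 100.0

    # Example: a coded value of 115 decodes to roughly 600 nits peak.
    peak = max_luminance_nits(115)
    print(round(peak, 1), round(min_luminance_nits(90, peak), 4))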



    • #52
      Originally posted by Myownfriend View Post

      Even though I've only been properly using Linux since 2020, my earliest experience with Linux was in maybe 2005 when I messed with a live ISO of Ubuntu. I wanna say it was Dapper Drake or Intrepid Ibex? Anyway, I'm just saying I've been somewhat aware of what's been going on in the Linux world for a while, and I remember seeing videos of Linux desktops with transparency, wobble effects, virtual desktops on a cube, windows burning away, and stuff, and I definitely thought it was cool, but even when I was 18, some of it seemed kind of excessive. These days I think people would find those effects kind of cute but also way too campy and impractical.

      My earliest experience was around 1999 with Debian; I had a dual-boot era between 2002 and 2006, was Linux full-time from 2006 to 2019, and have been dual-booting again since 2018... some stuff requires Windows, dammit...

      While it is easy to go very overboard with effects, slight effects make the difference between a smooth and a clunky-feeling system. Those effects, however, aren't what made 2006-2015 such a great time for the Linux desktop. They helped, but they weren't everything. It was how we were able to mix and match things from different environments to create really unique systems. Like swapping around panels, window managers, file managers, etc. from one project to the next. Being able to do all that and have spiffy effects was what was so great. Like, the KDE cube nowadays is a piece of shit joke when compared to the old cube. Sorry not sorry. It is.

      GTK3 and the GNOME insistence on CSD started fracturing how well that worked and then Wayland really put a hold on interoperability. Getting hit with the 1-2 combination of CSD/SSD and X11/Wayland probably gave Wayland a much worse reputation than it deserved.

      In another 5 years, Wayland might be where X11 systems were a decade ago. That comes out bad, but you gotta put that into perspective. X11 was nearly 20 years old 10 years back and Wayland is about 15 years old now. I figure that more and more Wayland stuff will be interchangeable once more and more of it supports the same parts of the protocols.

      That's more along the lines of stuff that I was thinking. I know a lot of people that use multiple displays and all the tearing, the lack of multi-DPI scaling, the inability to have mismatched frame rates, etc, would immediately turn them off. I can only imagine what they would think if they tried running a game that changes their desktop gamma settings and changes their resolution. Those things just feel hacky and archaic now.
      I never had tearing the few times I used multiple monitors. Granted, they all ran at 60 Hz, I used vsync, and DPI scaling is rather moot at 1080p and lower for the majority of people with good enough eyesight. But when you limit yourself, like back in the day, to a single monitor at a resolution where DPI scaling isn't necessary, the UIs from back then can be just as good as or better than modern-day systems.

      When you consider that CSD/SSD, Wayland/X11, ALSA/Pulse, 1080p/4K, fixed Hz/VRR, 60/120/144/240, VGA/DP-HDMI, and more all hit us around the same time about 12 years ago, the Linux desktop is doing alright with all the changes that entails. Shit, ALSA/Pulse is now Pulse/Pipe; 1080p/4K is irrelevant because scaling is considered these days; frame rate and VRR are mostly done; CSD and SSD apps mostly blend together outside of stuff that tries to go its own way... the proverbial glass might not be all the way full, but it's well past half full.



      • #53
        Originally posted by Anux View Post
        It would be very unwise to settle for Rec. 2020 as the internal color space, like it was with sRGB (or linear RGB) in the past. What you need is a color space that covers all visible colors, like CIE XYZ, CIELAB or Oklab.

        It might be fine in the short term, but as soon as we get a new color space, say a future Rec. 2030, we would have to rework the whole HDR stack. Why not do it right from the beginning?
        It wouldn't be a lot of work, but even if it were, scRGB covers the vast majority of the visible spectrum and then some. It would be a great option, since going with non-RGB-based color spaces can cause computational overhead that we would rather avoid.


        It's as hard as tone mapping, maybe a little less visible because most of the time we don't use much of the available gamut.

        As we have with tone mapping, but it's not a solved problem, since there is no mathematically correct way to do it. You are mapping bigger gamuts to smaller ones. It's a lossy process: you can clip or map/compress (with a plethora of different transfer curves), but there is no right way to do it, you can only do what looks subjectively good to you.

        It's as hard as gamut mapping, it's just more noticeable since we have much better brightness perception than color perception.
        Tone mapping is significantly harder to do than gamut mapping. We have perceptually good gamut mapping: you can map DCI-P3 or Rec. 2020 to sRGB primaries and it will look perceptually fine to the vast majority of people. This is not the case with tone mapping. There are many, many tone mappers and not a single one is agreed upon to be "perceptually good"; gamut mapping's use case can be covered by a handful of methods, tone mapping's can't.
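
        For concreteness, one of that small handful of gamut-mapping methods, sketched as a toy: keep the color's own luminance and pull it toward the gray axis just far enough to land inside [0,1]. This is an illustration of the idea, not any particular standard's mapping (CSS Color 4, for instance, does something in this spirit but works in Oklch).

        # Rec. 709 / sRGB luma weights for the neutral-axis target.
        LUMA = (0.2126, 0.7152, 0.0722)

        def desaturate_into_gamut(rgb):
            """Blend an out-of-gamut linear-RGB triple toward a gray of equal
            luminance, just enough to bring every channel into [0, 1].
            Assumes the luminance itself already fits (tone mapping's job)."""
            y = sum(w * c for w, c in zip(LUMA, rgb))
            t = 0.0  # smallest blend factor that puts all channels in range
            for c in rgb:
                if c > 1.0 and c != y:
                    t = max(t, (c - 1.0) / (c - y))
                elif c < 0.0 and c != y:
                    t = max(t, -c / (y - c))
            return tuple(c + t * (y - c) for c in rgb)

        # A saturated P3-ish red that fell outside sRGB after the matrix conversion:
        print(desaturate_into_gamut((1.22, -0.04, -0.02)))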


        Looking good/better is very subjective. Imagine two viewing conditions, both with a 400-nit display, but one in a totally dark room and one in a lit room. For the dark room you could just make everything darker and still have a nice HDR experience, but in the lit room the same result would be much too dark, and if you instead raise the average brightness, most highlights will be blown out. Both will look bad.
        I don't see how this is relevant to what I said about the transfer and display characteristics. On any single given display that can do both, HDR400, assuming the display isn't trash, will look better than SDR, assuming you can actually see it.
        Yes, as is always the case when we don't know the display's properties.

        Even more, in my opinion: the EDID should contain a complete color profile as well as peak and average brightness, plus there should be a room sensor to know the viewing conditions. This would allow for the best possible experience each device can offer.
        Agreed, though I'm not sure about a room sensor; that may be a bit overkill. I don't trust Samsung with that power lol



        • #54
          Originally posted by skeevy420 View Post
          GTK3 and the GNOME insistence on CSD started fracturing how well that worked...

          I like CSD. Even before I started using Linux, I felt like the title bar was a huge waste of space. Unrelated to that, I also felt like menu bars are an extremely lazy design decision for a lot of applications, so Gnome's design language appeals to me.

          Chrome started the trend of browsers putting tabs in their title bars. Then Adobe started styling Photoshop and Illustrator to use CSDs as well. Spotify on most platforms uses CSDs. All of macOS's applications use CSDs. All of these things happened outside of Linux. CSD was kind of inevitable because there are always going to be applications that have their own look to them, whether it be because of their choice of toolkit or an intentional widget theme. In those scenarios, it's more important to me that everything feels cohesive within a window/application than between applications. For example, I don't have an issue with Blender and Spotify having their own looks, but I wish their headers matched lol

          Originally posted by skeevy420 View Post
          and then Wayland really put a hold on interoperability. Getting hit with the 1-2 combination of CSD/SSD and X11/Wayland probably gave Wayland a much worse reputation than it deserved.
          Personally, I feel like I would have made a lot of the same decisions as Wayland if I had to develop a windowing environment. I get that some people saw Linux as being more like a bunch of puzzle pieces that people can use to make their own OS but that was always going to hurt it more than anything. You can't expect a bunch of parts to be similar enough to each other that they can work with each other in any combination while also being unique enough for someone to pick one over the other.

          You can still use whatever file managers you want, that doesn't really have anything to do with X11 or Wayland. You can also still run different shells over Wayland compositors, and some of the existing shells even have ways to modify how the shell looks.

          Originally posted by skeevy420 View Post
          I never had tearing the few times I used multiple monitors. Granted, they all ran at 60 Hz, I used vsync, and DPI scaling is rather moot at 1080p and lower for the majority of people with good enough eyesight. But when you limit yourself, like back in the day, to a single monitor at a resolution where DPI scaling isn't necessary, the UIs from back then can be just as good as or better than modern-day systems.
          I have to disagree. I had wanted to try out Linux for years before I actually did. I would check back and see what was new, but I always distinctly remember thinking that Linux UIs felt dated. The UIs that did feel modern at the time, like Gnome 3, looked like a lot of Android UIs at the time, which I thought was really ugly. Now I feel like Gnome, KDE, and Cosmic look pretty modern but the rest are lagging behind a bit. I've always been a bit of a UI snob.

          I used to screenshot applications that I thought looked bad and Photoshop my own concepts for how they could improve and stuff lol

          Originally posted by skeevy420 View Post
          When you consider that CSD/SSD, Wayland/X11, ALSA/Pulse, 1080p/4K, fixed Hz/VRR, 60/120/144/240, VGA/DP-HDMI, and more all hit us around the same time about 12 years ago, the Linux desktop is doing alright with all the changes that entails. Shit, ALSA/Pulse is now Pulse/Pipe; 1080p/4K is irrelevant because scaling is considered these days; frame rate and VRR are mostly done; CSD and SSD apps mostly blend together outside of stuff that tries to go its own way... the proverbial glass might not be all the way full, but it's well past half full.
          That's a nice outlook. It's good to see some positivity here.



          • #55
            Originally posted by Quackdoc View Post
            It wouldn't be a lot of work, but even if it were, scRGB covers the vast majority of the visible spectrum and then some. It would be a great option, since going with non-RGB-based color spaces can cause computational overhead that we would rather avoid.
            There is no more computational overhead than converting scRGB to any of the common color spaces, at least with Oklab. It's just a few 3x3 matrices, which are fast on the CPU and nearly cost-free on GPUs. scRGB, on the other hand, doesn't cover the whole visible spectrum, covers a huge invisible part in turn, is not perceptually uniform, and is not hue correct.
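
            For concreteness, the linear-sRGB-to-Oklab direction is two 3x3 matrices with a cube root in between, so marginally more than matrices alone but still trivial on a GPU. The coefficients below are taken from Björn Ottosson's public reference implementation; double-check them against the original before relying on them.

            import math

            def linear_srgb_to_oklab(r, g, b):
                """Linear sRGB -> Oklab: matrix, cube root, matrix."""
                l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
                m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
                s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
                # Sign-preserving cube root so out-of-gamut (negative) inputs survive.
                l_, m_, s_ = (math.copysign(abs(v) ** (1.0 / 3.0), v) for v in (l, m, s))
                return (0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_,
                        1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_,
                        0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_)

            # Sanity check: white maps to L = 1, a = b = 0 (up to rounding).
            print(linear_srgb_to_oklab(1.0, 1.0, 1.0))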

