NVIDIA Continues Prepping The Linux Desktop Stack For HDR Display Support


  • #11
    Originally posted by bug77 View Post

    Can you share which one? And what 10-bit content do you have available?
    This one; the only downside is how glossy it is (but it kind of has to be for color accuracy).

    I have a few 10-bit TV show and movie rips to try out whenever it gets supported, but I haven't gone out of my way seeking 10-bit content until it's actually working. Mostly just got it because it's an IPS 4K 27" that was on sale at the time I bought it.
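As an aside, you can usually tell whether a rip is actually 10-bit from the pixel format that `ffprobe` reports for its video stream (e.g. `yuv420p10le` vs plain `yuv420p`). Here's a rough sketch of parsing that name into a bit depth; the parsing rule is my own assumption based on common FFmpeg pixel-format naming, not an official API:

```python
# Sketch: guess a video stream's per-channel bit depth from an FFmpeg
# pix_fmt name such as "yuv420p10le" (10-bit) or "yuv420p" (8-bit).
# The suffix-parsing heuristic below is an assumption, not an FFmpeg API.
import re

def bit_depth_from_pix_fmt(pix_fmt: str) -> int:
    """Return the per-channel bit depth implied by a pix_fmt name."""
    # Depth, when present, appears as trailing digits optionally followed
    # by an endianness tag ("le"/"be"), e.g. "p10le", "p16be".
    m = re.search(r"(\d+)(?:le|be)?$", pix_fmt)
    depth = int(m.group(1)) if m else 8
    # Names like "yuv420p" have no depth suffix, so anything that isn't a
    # plausible depth falls back to 8.
    return depth if depth in (9, 10, 12, 14, 16) else 8

print(bit_depth_from_pix_fmt("yuv420p"))      # 8  -> ordinary 8-bit rip
print(bit_depth_from_pix_fmt("yuv420p10le"))  # 10 -> 10-bit rip
```

You'd feed this the output of something like `ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt` on the file in question.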



    • #12
      Originally posted by efikkan View Post
      OLED surely can, but not "quantum dot"(QLED); which is just a fancy name for the latest filters Samsung puts on their LCD panels.

      It certainly meets both gamut and brightness requirements.

      Originally posted by zanny View Post
      This one; the only downside is how glossy it is (but it kind of has to be for color accuracy).

      I have a few 10-bit TV show and movie rips to try out whenever it gets supported, but I haven't gone out of my way seeking 10-bit content until it's actually working. Mostly just got it because it's an IPS 4K 27" that was on sale at the time I bought it.
      Ah, standard gamut. 10-bit isn't going to do much for you.
      Last edited by bug77; 21 September 2017, 03:27 PM.



      • #13
        Originally posted by bug77 View Post
        Ah, standard gamut. 10-bit isn't going to do much for you.
        A 10-bit standard-gamut display can be hacked into doing a pretty good HDR impression. It's what a lot of video producers were using to produce HDR before HDR displays were widely available.

        No, it isn't perfect.



        • #14
          Originally posted by bug77 View Post
          As mentioned, it uses local dimming to represent darker regions, which degrades local color accuracy. "QLED" is in no way comparable to OLED.



          • #15
            Originally posted by bug77 View Post
            FreeSync/G-Sync is all the rage these days and you can't use ULMB with G-Sync. So...
            Actually you can, although the result is not pleasant to look at. But anyway, that's beside the point. Watching sports converted to 120 FPS with SVP on Linux would be even more awesome with 120 Hz ULMB.

            But we can't have nice things, I guess :P



            • #16
              Originally posted by efikkan View Post
              As mentioned, it uses local dimming to represent darker regions, which degrades local color accuracy. "QLED" is in no way comparable to OLED.
              There's no mention of local dimming on the wiki page I linked. Also, you keep saying "QLED", which is Samsung's implementation. So I'm not really sure what we're talking about here.
              I haven't seen Quantum Dot screens yet, so I don't know how they perform, but at least on paper they're very comparable to OLED.



              • #17
                Originally posted by efikkan View Post
                As mentioned, it uses local dimming to represent darker regions, which degrades local color accuracy. "QLED" is in no way comparable to OLED.
                What I consider a true quantum dot display is one that uses QLEDs similarly to an OLED display:


                To me this is what bug77 was referencing, not the current round of quantum marketing crap.



                • #18
                  Originally posted by efikkan View Post
                  As mentioned, it uses local dimming to represent darker regions, which degrades local color accuracy. "QLED" is in no way comparable to OLED.
                  A good implementation of local dimming still allows for dark scenes where you have enough range to make out details without everything seeming unrealistically bright. So worth it over standard 8-bit, IMO, until large OLED screens become affordable.
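The trade-off both sides are describing can be captured in a toy model (my own simplification, not any vendor's actual algorithm): drive each backlight zone to its brightest pixel, and note that an LCD cell can only attenuate that backlight by its native contrast ratio, so dark pixels sharing a zone with a highlight leak light (haloing):

```python
# Toy model of zone-based local dimming on a 1-D "image" of luminances
# (nits). Zone size and the 1000:1 native contrast are assumptions for
# illustration, not measured panel data.
def leaked_black_level(pixels, zone_size, lcd_contrast=1000):
    """Min displayed luminance of a nominally black pixel, per zone."""
    leaks = []
    for i in range(0, len(pixels), zone_size):
        zone = pixels[i:i + zone_size]
        backlight = max(zone)  # zone backlight follows its brightest pixel
        # An LCD cell can only attenuate by its native contrast ratio,
        # so a "black" pixel still emits backlight / contrast.
        leaks.append(backlight / lcd_contrast)
    return leaks

# A 1000-nit highlight sharing a 4-pixel zone with black pixels:
print(leaked_black_level([1000, 0, 0, 0], zone_size=4))  # [1.0]
# Per-pixel zones (the OLED case): the black pixels stay truly black.
print(leaked_black_level([1000, 0, 0, 0], zone_size=1))  # [1.0, 0.0, 0.0, 0.0]
```

Whether that 1-nit halo next to a highlight is "worth it" over flat 8-bit SDR is exactly the judgment call being argued here.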



                  • #19
                    Originally posted by bug77 View Post
                    There's no mention of local dimming on the wiki page I linked. Also, you keep saying "QLED" which is Samsung's implementation. So I'm not really sure what we're talking about here.
                    I haven't seen Quantum Dot screens yet, so I don't know how they perform, but at least on paper they're very comparable to OLED.
                    This should clear things up:
                    "LCD TVs with quantum dots, branded as "QLED", use a BLU (Back-Light Unit) and produce enhanced colors by attaching quantum dot films. On the other hand, genuine QLED TVs would not use a BLU; inorganic nano-particles would produce the light. However, consumers could be misled by TVs branded as QLED TVs today, as they still use LCD panels for picture generation. So far, authentic quantum-dot LED TVs exist in laboratories only."
                    QLED, i.e. just an LCD with a quantum dot filter, is the technology we see in QLED TVs and "HDR" PC screens. The technology you want doesn't really exist yet.

                    Originally posted by Otus View Post
                    A good implementation of local dimming still allows for dark scenes where you have enough range to make out details without everything seeming unrealistically bright. So worth it over standard 8-bit, IMO, until large OLED screens become affordable.
                    Local dimming is awful in images with large variations of lightness. Individually lit pixels are the only way to go.



                    • #20
                      This was 2017. Now it's 2022. Does HDR work now on Linux?
