Red Hat Is Hiring So Linux Can Finally Have Good HDR Display Support


  • #11
    Originally posted by Britoid View Post

    Any specific film studio requesting this? like Pixar?

    I can't see why HDR would be useful in a car?
    Not going to mention names, but as for the car: expectations for in-car entertainment systems keep going up, and being able to stream HDR content is part of that.


    • #12
      Originally posted by ChristianSchaller View Post

      Actually almost all studios use Linux for the desktop.
      Source? I'm not disputing some probably do. I'm disputing the "almost all", specifically because of the problems with HDR, and with HiDPI to a lesser extent - HiDPI does at least partially work.


      • #13
        Originally posted by Britoid View Post

        Any specific film studio requesting this? like Pixar?

        I can't see why HDR would be useful in a car?
        Probably Tesla. They love sticking it to the man whenever they can. HDR media playback for their infotainment systems is one of the pieces Tesla would need Linux to have in order to stick it to Microsoft and Windows with their in-car OS.

        Then there are the Germans. They seem to love Linux. It wouldn't surprise me to see BMW or VW using Linux. Especially BMW. They partner with AMD from time to time...AMD embraces Linux. Just connecting dots.

        And Valve. They need HDR and ghetto-HDR, excuse me, AutoHDR to make SteamOS 3.0 more competitive with Windows 10 and 11.


        • #14
          Originally posted by stormcrow View Post

          It's obvious. Because even $200 - $300 monitors have HDR-10 now, and Linux's support is abysmal. It's not even all that great for HiDPI. Studios are more likely to use Macs or Windows front ends with Linux compute cluster back ends.
          I'm a bit confused. For instance Samsung Smart Monitor M5 S27AM500NR claims to be HDR10 capable, but only has 8-bit colors. My display has 8+2bit colors with dithering. I kind of assumed HDR means 8+2bit colors or native 10bit. For instance LG 27UL500-W is HDR10 & 8+2b capable. ASUS PB27UQ is one of the cheapest native 10bit displays.
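          (For anyone wondering what "8+2 bit with dithering" means in practice, here is a rough Python sketch of the temporal dithering / FRC idea - a toy alternate-over-four-frames scheme, not any vendor's actual algorithm: the panel flips between the two nearest 8-bit levels so the time-averaged output approximates the 10-bit value.)

          # Toy FRC sketch: approximate a 10-bit level on an 8-bit panel by
          # alternating between the two nearest 8-bit levels across frames.
          def frc_frame_value(level_10bit: int, frame_index: int) -> int:
              base, fraction = divmod(level_10bit, 4)  # 8-bit base level + 2-bit remainder
              # Show the brighter neighbour on `fraction` out of every 4 frames.
              return min(base + (1 if frame_index % 4 < fraction else 0), 255)

          # Example: 10-bit level 514 averages out to 128.5 over 4 frames.
          frames = [frc_frame_value(514, i) for i in range(4)]
          print(frames, sum(frames) / 4)  # [129, 129, 128, 128] 128.5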


          • #15
            Originally posted by You- View Post
            I don't know why HLG was even invented the way it was.

            The idea is great - you can progressively improve quality in an organic manner without resorting to different streams.

            However, it can only cover the spectrum "above" SDR and not below, so you can't get greater contrast in darkness.

            They should have started the progressive enhancement at the dark end, especially as no broadcaster was using anything that would automatically upscale into HLG anyway.

            (I would love to see the usual Red Hat haters post in this topic. Find flaws and blame for why Red Hat is doing this development.)
            75% of HLG's value range is dedicated to brightness below reference white. Out of the 1,024 possible values in a 10-bit signal, that's 768 - three times the 256 steps an 8-bit SDR signal allocates to the same range.
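            A quick back-of-the-envelope check of those numbers (a sketch only, assuming full-range code values and reference white at 75% of the HLG signal, as stated above):

            # Code values below reference white: 10-bit HLG vs 8-bit SDR (full-range assumed).
            HLG_CODES = 2 ** 10                      # 1,024 values in a 10-bit HLG signal
            SDR_CODES = 2 ** 8                       # 256 values in an 8-bit SDR signal

            hlg_below_white = int(HLG_CODES * 0.75)  # reference white at 75% -> 768 codes below it
            sdr_below_white = SDR_CODES              # all of SDR sits at or below reference white

            print(hlg_below_white)                    # 768
            print(hlg_below_white / sdr_below_white)  # 3.0 -> three times as many steps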


            • #16
              Originally posted by Britoid View Post
              I can't see why HDR would be useful in a car?
              Perhaps because it is very obvious at night if 'black' on a display isn't truly black. Better contrast for vital things like the driver's screen is always going to be desirable as well.


              • #17
                Originally posted by stormcrow View Post

                Source? I'm not disputing some probably do. I'm disputing the "almost all", specifically because of the problems with HDR, and with HiDPI to a lesser extent - HiDPI does at least partially work.
                Linux use on VFX and animation workstations really kicked off with ILM in the early 2000s, working on "Star Wars", and DreamWorks migrating to it for "Shrek". In the following years more and more studios moved to Linux workstations, such as Weta, Digital Domain, Double Negative, The Mill, Pixar, Disney Animation, Animal Logic, and on and on. By around 2010 it was safe to call Linux the de facto workstation OS, typically running Red Hat (and lately also Ubuntu). Today many "mid level" VFX and animation studios use Linux as well, so saying "almost all" is absolutely correct. Render farms running Linux are a given and have been for just as long - actually since before ILM started using it, with VFX shots in "Titanic" rendered on Linux - and of course Linux is used for other infrastructure such as storage.


                • #18
                  Originally posted by numasan View Post

                  Linux use on VFX and animation workstations really kicked off with ILM in the early 2000s, working on "Star Wars", and DreamWorks migrating to it for "Shrek". In the following years more and more studios moved to Linux workstations, such as Weta, Digital Domain, Double Negative, The Mill, Pixar, Disney Animation, Animal Logic, and on and on. By around 2010 it was safe to call Linux the de facto workstation OS, typically running Red Hat (and lately also Ubuntu). Today many "mid level" VFX and animation studios use Linux as well, so saying "almost all" is absolutely correct. Render farms running Linux are a given and have been for just as long - actually since before ILM started using it, with VFX shots in "Titanic" rendered on Linux - and of course Linux is used for other infrastructure such as storage.
                  To add some fun facts off the top of my head:

                  - The "Titanic" shots were rendered on an ALPHA CPU cluster.

                  - "Star Wars Episode II - Attack of the Clones" was the first all-digitally filmed & produced feature-film by ILM on Linux workstations powered by MIPS64.

                  - Microsoft tried to get into the VFX industry by buying SoftImage; however, since the industry was running on SGI's Unix OS, IRIX, moving over to Linux was a lot easier & cheaper for the studios, so Microsoft admitted defeat by selling SoftImage a few years later.

                  - The only reason nVidia even bothered to port their industry-leading driver to Linux was precisely Linux's dominance in this field.

                  - RHEL was used in a public Pixar demonstration showing off the real-time rendering capabilities of their in-house animation software, "Marionette", in conjunction with nVidia's Quadro GPUs.


                  • #19
                    Originally posted by caligula View Post

                    I'm a bit confused. For instance Samsung Smart Monitor M5 S27AM500NR claims to be HDR10 capable, but only has 8-bit colors. My display has 8+2bit colors with dithering. I kind of assumed HDR means 8+2bit colors or native 10bit. For instance LG 27UL500-W is HDR10 & 8+2b capable. ASUS PB27UQ is one of the cheapest native 10bit displays.
                    That is because HDR is classified by both contrast and colour depth. The 8-bit HDR displays have the extra contrast but not the extra colour depth, and since they support HDR contrast they count as "HDR" - they just don't have to specify which of the classifications they met.


                    • #20
                      Originally posted by Teggs View Post

                      Perhaps because it is very obvious at night if 'black' on a display isn't truly black. Better contrast for vital things like the driver's screen is always going to be desirable as well.
                      That's a limitation of the displays, not the data driving them. In short, LCDs suck at contrast.
