Red Hat Is Hiring So Linux Can Finally Have Good HDR Display Support

  • #51
    Originally posted by mdedetrich View Post

    Furthermore, DaVinci Resolve only started supporting Linux recently, so claiming that in the industry people mainly run it on Linux just doesn't add up and is disingenuous.
    The Wiz started on Apple but only lasted a brief period of time. DaVinci ran on custom hardware. Resolve started on Linux and was only later ported to Mac, and then later to Windows. Give this a look; the whole thing is well worth watching.

    https://youtu.be/7WvP5_HFQSk?t=1017

    Comment


    • #52
      Originally posted by sophisticles View Post

      Have you ever used Resolve? I have been using it for years; getting it to work reliably on Linux is a nightmare. There are all sorts of text and GUI scaling issues, and that's assuming you can even get it to launch and run after you have it installed. The most success I have had is with Ubuntu and Fedora; with the rest of the distros I don't even bother.

      If you want the best Resolve experience you use a Mac, specifically the new M1, with a Windows-based PC being a close second.

      Hate to break this to all those living in Linux fantasy land, but if you want reliable, quality video editing of any kind, you nearly always want to first go with a Mac and then a Windows-based system as a close second.
      Yeah, except for the fact that BMD has a RHEL 7 image that is their blessed standard. Not a Windows image or a Mac image. The only issue with installing Resolve on Linux is if you have an AMD card, because Resolve needs either CUDA or OpenCL to run, and AMD's OpenCL support sucks.

      Comment


      • #53
        Originally posted by mroche View Post
        To add clarification, "studios" tends to refer to feature animation studios (e.g. Pixar, WDAS, DreamWorks) and contracted VFX studios (e.g. DNeg, DD, ILM, MPC), usually on the medium to larger side. In this situation, the use of RHEL/CentOS is pretty much the de facto standard, and not just for servers. A good portion of that stems from the transition from IRIX (MIPS) to x86, where Red Hat Linux was ready to scoop up that customer base. I personally worked at Pixar and Blue Sky (at Red Hat now, thanks Disney), along with knowing what a bunch of the other studios are doing, because we all talk with each other and our vendors.
        So, the dishonesty and misrepresentation continue. When people say things like "studios" and "Hollywood uses", that tends to strongly suggest all "studios" involved in the making of movies and TV, not just 3D animation.

        But since you arbitrarily defined "studios" in the manner you did and since you mentioned Pixar, let's see what a Pixar scientist has to say:

        https://sciencebehindpixar.org/ask-a-pixar-scientist

        Pixar has a huge "render farm," which is basically a supercomputer composed of 2,000 machines and 24,000 cores. This makes it one of the 25 largest supercomputers in the world. That said, with all that computing power, it still took two years to render Monsters University.

        β€” Peter Collingridge; Khan Academy
        So Pixar does not use Linux based desktops, they use a supercomputer that is one of the 25 largest in the world. And I am fairly certain that they probably do use either a custom Linux based or BSD based OS, primarily because the licensing costs of Windows HPC is through the roof, as are the licensing costs for UNIX.

        As for Dreamworks:

        https://www.techradar.com/news/world...ndered-1127122

        One of our first realizations, after meeting an animator, is that just about every frame of a movie such as Rise of the Guardians is created using an HP Workstation Z820. This behemoth is outfitted with two Intel Xeon E5-2687W processors, running at 3.1GHz each. Because each processor has eight cores, animators can use 16 cores for processing animation files.
        https://www8.hp.com/h20195/v2/getpdf.aspx/c04111526.pdf

        Now this article is a bit old, and the HP workstation referenced is available with either Windows or Linux, so maybe they do use Linux in some rendering capacity, but according to this they use Premo, which seems to be custom-built software:

        https://www.theverge.com/2014/6/12/5...-your-dragon-2

        https://www.awn.com/animationworld/d...ci-tech-honors

        http://www.olivier-ladeuix.com/blog/...tion-software/

        Honestly, the more I read about the custom software that Pixar and DreamWorks use, the more I think that they may not even run on an OS per se; they may be using custom setups where the rendering software runs on bare metal, maybe even coded with extensive assembler, based on the claims of speed.

        Comment


        • #54
          Originally posted by partcyborg View Post
          FINALLY!

          SO happy someone is finally stepping up to make this happen. Lack of HDR is the one thing my Kodi-based media player is missing. Every few months I check to see how much progress has been made on adding support, even on the bleeding edge, and the answer has always been: none whatsoever.
          Kodi has HDR support; Wayland is the big bottleneck right now. As Red Hat was the one really pushing Wayland, it makes sense that they step up. But I also strongly suspect that BMD and many others are leaning on them to get it in gear.

          https://www.collabora.com/news-and-b...dynamic-range/

          Comment


          • #55
            Originally posted by F.Ultra View Post

            That is because HDR is classified by both contrast and colour depth. The 8-bit HDR displays have the extra contrast but not the extra colour depth, and since they support HDR contrast they are "HDR"; they just don't have to specify which of the classifications they met.
            I can see how 10-bit color channels introduce new challenges vs. 8-bit. But how is the contrast such an issue? I suppose all sorts of different gamut settings have similar kinds of problems outside 100% sRGB.

            Comment


            • #56
              Originally posted by stormcrow View Post
              Source?
              Probably the client list of his employer, Red Hat.

              Originally posted by mdedetrich View Post
              Not for color management they don't, which is what's relevant here. If you are talking about studios here, they tend to use Linux for servers (no real surprise there) and render farms, or if they do use it on the desktop it's for graphics and not color work.

              Studios don't use Linux when they need to do color, because Linux is absolutely terrible at this.
              Oh, glorious: you're actually trying to tell the man from Red Hat that he does not have any customers that run applications like DaVinci Resolve.

              🍿🍿🍿🍿

              Comment


              • #57
                Originally posted by blacknova View Post
                I think 8 bits per component is way beyond what most people can really distinguish on common hardware,
                I used to think "8-bpc should be enough for everyone", until I actually started using it full-time. It was back in the days of CRT monitors, when I first noticed banding artifacts, to my shock and amazement! I immediately created some test patterns, to confirm what I thought I saw, and I was absolutely right! I could clearly distinguish the boundaries between single-intensity differences, even in fairly normal viewing conditions. I know the graphics card was outputting full 8-bit resolution, because I could distinctly see each intensity level, in at least some parts of the scale.

                Since then, I've noticed plenty of banding with 8 bits per channel. Right now, I can see it in part of the yellow sky of my desktop wallpaper -- a massive panorama, taken from the Mars rover, displayed on a true 8-bit (SDR) LCD. I know those artifacts aren't baked into the image, since I took the full-resolution image from NASA and cropped/scaled it myself. Moreover, I even used a color picker to confirm that they're actually just single intensity value differences.

                Comment


                • #58
                  Originally posted by sophisticles View Post

                  So Pixar does not use Linux based desktops, they use a supercomputer that is one of the 25 largest in the world. And I am fairly certain that they probably do use either a custom Linux based or BSD based OS, primarily because the licensing costs of Windows HPC is through the roof, as are the licensing costs for UNIX.

                  Now this article is a bit old, and the HP workstation referenced is available with either Windows or Linux, so maybe they do use Linux in some rendering capacity, but according to this they use Premo, which seems to be custom-built software:

                  Honestly, the more I read about the custom software that Pixar and DreamWorks use, the more I think that they may not even run on an OS per se; they may be using custom setups where the rendering software runs on bare metal, maybe even coded with extensive assembler, based on the claims of speed.
                  Now this, THIS, is a magnificent piece of... I don't know what the heck. I appreciate you trying to disprove my own real-life work experience as a film industry Linux sysadmin with delusion. I was literally explaining to you how the industry extends far beyond NLE editors which you seemed to be basing your entire initial rebuttal on.

                  Examples for you to ponder (as these are studios I had direct relations with):
                  * Pixar:
                  - Render farm nodes run RHEL
                  - Workstations are vSphere VDI over Teradici PCoIP running RHEL (using MATE; some of us in systems used GNOME).
                  * WDAS: same as above (not sure if they use vSphere or something else, or which DE they settled on).
                  * Blue Sky:
                  - Render farm on CentOS
                  - Desk-side workstations running CentOS (using Cinnamon, formerly XFCE).

                  Every studio I've mentioned so far in my prior comment, that others have mentioned, and more also use Linux as their primary pipeline desktop OS.

                  Thank you for making my night with this. Have a great weekend.

                  Cheers,
                  Mike
                  Last edited by mroche; 18 September 2021, 09:05 PM.

                  Comment


                  • #59
                    Originally posted by birdie View Post
                    IOW nothing for normal people, OK
                    Well, wswartzendruber did mention gaming, but that's still not something the majority of users do on their Linux desktop.

                    This comment is rather puzzling, as if you're expecting that each advancement in technology should be immediately relevant to the majority of users. Traditionally, the boundaries of tech are pushed by the most demanding users. Then, as the tech matures, it trickles down to the mainstream, where other opportunistic applications take advantage of it and widen the pool of beneficiaries.

                    In this case, the early adopters should be:
                    • Video production
                    • Video consumption
                    • Photo editing
                    • Games
                    Once the tech has matured, perhaps browsers will embrace it. The arrival of newer image formats, and the fact that most phone cameras have supported HDR for a while, should mean there's no lack of HDR content to populate the web. This should provide benefits even to the masses of unwashed, bearded Linux admins.
                    Last edited by coder; 18 September 2021, 09:37 PM.

                    Comment


                    • #60
                      Originally posted by sophisticles View Post
                      So Pixar does not use Linux based desktops,
                      Where in that link does it say they don't use Linux desktops? I don't see any mention of: Linux, desktops, Mac OS, or even Windows!

                      Originally posted by sophisticles View Post
                      And I am fairly certain that they probably do use either a custom Linux based or BSD based OS, primarily because the licensing costs of Windows HPC is through the roof, as are the licensing costs for UNIX.
                      Why are you speculating? If you don't know, then you don't know. Anyone can speculate.

                      Also, talk of render farms is missing the point, since the matter in contention seems to be that of desktop Linux.

                      Originally posted by sophisticles View Post
                      Honestly, the more I read about the custom software that Pixar and Dreamworks use, the more I think that they may not even run on an OS per se, they may be using custom setups where the rendering software runs on bare metal,
                      That's some insane shit, right there. They want good hardware support, which means they depend on the OS and vendor-supplied drivers. And not just for GPUs, but also storage and networking. Today's hardware and the software needed to use it are far too complex for one to imagine not using an OS. Not to mention the whole subject of development, debugging, deployment, and admin tools! The effort and costs of developing without an OS would be monumental, yet the benefits would probably be negligible at best (more likely, their custom solution would perform worse, given the amount of work that has gone into refining modern operating systems).

                      At most, they'd likely just use customized kernel parameters & build options.

                      Of course, I'm just speculating, but then so are you. At least I've seen the inside of some professional 3D and video post production software. I'm sure the developers have enough to worry about, without trying to do the OS' job too!

                      Originally posted by sophisticles View Post
                      maybe even coded with extensive assembler, based on the claims of speed.
                      You can write plenty of assembler, even while running on a normal OS. A lot of media player/encoder software uses assembly language for their hot loops. I do wonder how much it buys them vs. simply using intrinsics in C/C++. I've done both, and intrinsics are good enough for me.

                      Comment
