Red Hat Is Hiring So Linux Can Finally Have Good HDR Display Support


  • Sonadow
    replied
    Originally posted by coder View Post
    Openness, maturity, and good hardware support seem like important characteristics. I can't say what will be the differentiating factor, because that day will be years in the future and my statement was as much a prediction about the evolution of Linux (or lack thereof) as about the emergence of a more compelling alternative.

    BTW, I didn't mind Windows 7 for running office apps and stuff, but when my employer finally upgraded us to Windows 10, I was thoroughly unimpressed. Since I don't run it at home, I can't say whether most of the issues were the core OS or just how they deployed it, but it was the slowest and most unstable Windows I ever experienced since probably Windows 3.1. Granted, I skipped Win 98, ME, Vista, and 8. And I never upgraded until there were a couple Service Packs released, either. AFAICT, Microsoft hardly gives a shit about Windows, anymore. I think they mainly just see it as a way to rope in customers for their cloud services.


    And I think that's an underestimate. Perhaps Michael could shed some light on the matter, with a little survey.
    I like Windows 10. It hardly felt slow on an Apollo Lake Atom x7-E3950 processor. If anything, Windows 10 was more responsive on that system than GNOME Wayland or Plasma Wayland on Debian 10 and 11.

    And it's a fact that most organizations have crappy Windows deployments. During the days of Windows 7, a business outsourced their IT support to my employer and we had to assist them with a Windows 7 rollout over SCCM. I don't know whether the image they used was broken, but it was the worst-performing Windows 7 installation I ever had the misfortune of using. Windows would start with broken display drivers and then fall back to the Windows 7 Basic theme, launching programs like Internet Explorer and Chrome was dead slow, startup and shutdown times were plain bad, etc.

    Wherever there was an opportunity, I bypassed their SCCM infrastructure, used retail Windows DVDs to perform the install, loaded their volume license product key to activate it, and hunted down the individual drivers. Needless to say, the computers installed from the retail image were far more performant and reliable than the ones imaged through SCCM.
    Last edited by Sonadow; 18 September 2021, 11:28 PM.

  • coder
    replied
    Originally posted by Sonadow View Post
    And what will that be? Windows? macOS? Fuchsia? FreeBSD (I sure hope not)?
    Openness, maturity, and good hardware support seem like important characteristics. I can't say what will be the differentiating factor, because that day will be years in the future and my statement was as much a prediction about the evolution of Linux (or lack thereof) as about the emergence of a more compelling alternative.

    BTW, I didn't mind Windows 7 for running office apps and stuff, but when my employer finally upgraded us to Windows 10, I was thoroughly unimpressed. Since I don't run it at home, I can't say whether most of the issues were the core OS or just how they deployed it, but it was the slowest and most unstable Windows I ever experienced since probably Windows 3.1. Granted, I skipped Win 98, ME, Vista, and 8. And I never upgraded until there were a couple Service Packs released, either. AFAICT, Microsoft hardly gives a shit about Windows, anymore. I think they mainly just see it as a way to rope in customers for their cloud services.

    Originally posted by Sonadow View Post
    I said With the exception of a few.
    And I think that's an underestimate. Perhaps Michael could shed some light on the matter, with a little survey.

  • Sonadow
    replied
    Originally posted by coder View Post
    I use Linux for pragmatic reasons. When a better alternative exists (and it's only a matter of time), I'll switch without much reluctance.
    And what will that be? Windows? macOS? Fuchsia? FreeBSD (I sure hope not)?

    Originally posted by coder View Post
    You probably underestimate the number of us who are realistic about Linux's shortcomings.
    I said With the exception of a few.

  • coder
    replied
    Originally posted by Sonadow View Post
    With the exception of a few, every other person outright lies through their teeth to make Linux look and sound more important and usable than it really is.
    You're misjudging the size of a vocal minority.

    And I say that without really having a dog in this fight. I use Linux for pragmatic reasons. When a better alternative exists (and it's only a matter of time), I'll switch without much reluctance.

    You probably underestimate the number of us who are realistic about Linux's shortcomings.

    Originally posted by Sonadow View Post
    Look at all the posts about Linux's shitass percentage on Steam.
    And what percentage of this forum's users do you really think even read such threads?
    Last edited by coder; 18 September 2021, 10:23 PM.

  • Sonadow
    replied
    Originally posted by sophisticles View Post

    So, the dishonesty and misrepresentation continues.
    sophisticles

    Dude, you've been here since 2015. Don't you know that lying and misrepresenting things to artificially inflate Linux's numbers and importance is a mandatory requirement on Phoronix? With the exception of a few, every other person outright lies through their teeth to make Linux look and sound more important and usable than it really is.

    Look at all the posts about Linux's shitass percentage on Steam. Every time it falls to < 1%:
    "ZOMG THE POLL IS BROKEN LINUX HAS MORE THAN THAT LINUX IS DA BEST GAMING PLATFORM EVER THIS IS ANTI-LINUX PROPAGANDA"

    And when it occasionally spikes:
    "OMG LINUX NUMBA ONE WINDOWS IS FINISHED WE WILL TAKE OVER AND DESTROY WINDOWS!"

  • sophisticles
    replied
    Originally posted by mroche View Post

    Now this, THIS, is a magnificent piece of... I don't know what the heck. I appreciate you trying to disprove my own real-life work experience as a film industry Linux sysadmin with delusion. I was literally explaining to you how the industry extends far beyond NLE editors which you seemed to be basing your entire initial rebuttal on.

    Examples for you to ponder (as these are studios I had direct relations with):
    * Pixar:
    - Render farm nodes run RHEL
    - Workstations are vSphere VDI over Teradici PCoIP running RHEL (using MATE, some of us in systems used GNOME).
    * WDAS (same as above, not sure if using vSphere or something else and which DE they settled on).
    * Blue Sky
    - Render farm on CentOS
    - Desk-side workstations running CentOS (using Cinnamon, formerly XFCE).

    Every studio I mentioned in my prior comment, the ones others have mentioned, and more besides use Linux as their primary pipeline desktop OS.

    Thank you for making my night with this. Have a great weekend.

    Cheers,
    Mike
    I wonder whether a simple Google search turns up anything that proves you are either a liar or work in something other than the film industry.

    Let's give it a shot:

    https://www.pixarpost.com/2013/06/pi...pple-wwdc.html

    https://mashable.com/article/apple-r...gs-pixar-films

    https://zworkstations.com/apps/pixar...-workstations/

    https://zworkstations.com/products/h...onfiguration=7

    https://www.businessinsider.com/pixa...e-films-2014-1 <-- You need to register to read this

    From Pixar's website:

    https://renderman.pixar.com/tech-specs

    It looks like Pixar's RenderMan runs on all three major OSes.

    Now is it possible that you actually do work in some aspect of the industry and that portion uses RHEL? Sure.

    But I can find significant evidence that vast portions of the film industry use OS X and Windows, especially the NLE aspect, which is larger than the VFX aspect.

    Thanks for playing.

  • coder
    replied
    Originally posted by sophisticles View Post
    So Pixar does not use Linux based desktops,
    Where in that link does it say they don't use Linux desktops? I don't see any mention of: Linux, desktops, Mac OS, or even Windows!

    Originally posted by sophisticles View Post
    And I am fairly certain that they probably do use either a custom Linux based or BSD based OS, primarily because the licensing costs of Windows HPC is through the roof, as are the licensing costs for UNIX.
    Why are you speculating? If you don't know, then you don't know. Anyone can speculate.

    Also, talk of render farms is missing the point, since the matter in contention seems to be that of desktop Linux.

    Originally posted by sophisticles View Post
    Honestly, the more I read about the custom software that Pixar and Dreamworks use, the more I think that they may not even run on an OS per se, they may be using custom setups where the rendering software runs on bare metal,
    That's some insane shit, right there. They want good hardware support, which means they depend on the OS and vendor-supplied drivers. And not just for GPUs, but also storage and networking. Today's hardware and the software needed to use it are far too complex for anyone to seriously consider not using an OS. Not to mention the whole subject of development, debugging, deployment, and admin tools! The effort & costs of developing without an OS would be monumental, yet the benefits would probably be negligible at best (more likely, their custom solution would perform worse, given the amount of work that has gone into refining modern operating systems).

    At most, they'd likely just use customized kernel parameters & build options.
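
    Just to make "customized kernel parameters" concrete (the values here are purely illustrative on my part), a render node might boot with something like:

        isolcpus=4-63 nohz_full=4-63 rcu_nocbs=4-63

    to keep kernel housekeeping off the cores doing the actual rendering, while still running a bone-stock OS.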

    Of course, I'm just speculating, but then so are you. At least I've seen the inside of some professional 3D and video post-production software. I'm sure the developers have enough to worry about without trying to do the OS's job too!

    Originally posted by sophisticles View Post
    maybe even coded with extensive assembler, based on the claims of speed.
    You can write plenty of assembler, even while running on a normal OS. A lot of media player/encoder software uses assembly language for its hot loops. I do wonder how much it buys them vs. simply using intrinsics in C/C++. I've done both, and intrinsics are good enough for me.
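
    For instance, a hot loop written with intrinsics might look something like this (an illustrative sketch on my part, SSE assumed, names made up):

        /* Element-wise multiply of two float buffers using SSE intrinsics,
           instead of hand-written assembly. */
        #include <stddef.h>
        #include <xmmintrin.h>   /* SSE */

        void mul_buffers(float *dst, const float *a, const float *b, size_t n)
        {
            size_t i = 0;
            for (; i + 4 <= n; i += 4) {                    /* 4 floats per iteration */
                __m128 va = _mm_loadu_ps(a + i);
                __m128 vb = _mm_loadu_ps(b + i);
                _mm_storeu_ps(dst + i, _mm_mul_ps(va, vb));
            }
            for (; i < n; i++)                              /* scalar tail */
                dst[i] = a[i] * b[i];
        }

    The compiler still handles register allocation and instruction scheduling, which is a big part of what hand-written asm would otherwise buy you in a simple loop like this.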

  • coder
    replied
    Originally posted by birdie View Post
    IOW nothing for normal people, OK
    Well, wswartzendruber did mention gaming, but that's still not something the majority of users do on their Linux desktop.

    This comment is rather puzzling, as if you're expecting that each advancement in technology should be immediately relevant to the majority of users. Traditionally, the boundaries of tech are pushed by the most demanding users. Then, as the tech matures, it trickles down to the mainstream, where other opportunistic applications take advantage of it and widen the pool of beneficiaries.

    In this case, the early adopters should be:
    • Video production
    • Video consumption
    • Photo editing
    • Games
    Once the tech has matured, perhaps browsers will embrace it. The arrival of newer image formats, and the fact that most phone cameras have supported HDR for a while, should mean there's no lack of HDR content to populate the web. This should provide benefits even to the masses of unwashed, bearded Linux admins.
    Last edited by coder; 18 September 2021, 09:37 PM.

  • mroche
    replied
    Originally posted by sophisticles View Post

    So Pixar does not use Linux based desktops, they use a supercomputer that is one of the 25 largest in the world. And I am fairly certain that they probably do use either a custom Linux based or BSD based OS, primarily because the licensing costs of Windows HPC is through the roof, as are the licensing costs for UNIX.

    Now this article is a bit old and the HP workstation referenced is available with either Windows or Linux, so maybe they do use Linux in some rendering capacity, but according to this they use Premo, which seems to be a custom built software:

    Honestly, the more I read about the custom software that Pixar and Dreamworks use, the more I think that they may not even run on an OS per se, they may be using custom setups where the rendering software runs on bare metal, maybe even coded with extensive assembler, based on the claims of speed.
    Now this, THIS, is a magnificent piece of... I don't know what the heck. I appreciate you trying to disprove my own real-life work experience as a film industry Linux sysadmin with delusion. I was literally explaining to you how the industry extends far beyond NLE editors which you seemed to be basing your entire initial rebuttal on.

    Examples for you to ponder (as these are studios I had direct relations with):
    * Pixar:
    - Render farm nodes run RHEL
    - Workstations are vSphere VDI over Teradici PCoIP running RHEL (using MATE, some of us in systems used GNOME).
    * WDAS (same as above, not sure if using vSphere or something else and which DE they settled on).
    * Blue Sky
    - Render farm on CentOS
    - Desk-side workstations running CentOS (using Cinnamon, formerly XFCE).

    Every studio I mentioned in my prior comment, the ones others have mentioned, and more besides use Linux as their primary pipeline desktop OS.

    Thank you for making my night with this. Have a great weekend.

    Cheers,
    Mike
    Last edited by mroche; 18 September 2021, 09:05 PM.

  • coder
    replied
    Originally posted by blacknova View Post
    I think 8bit per component is a way beyond most people can really distinguish on common hardware,
    I used to think "8-bpc should be enough for everyone", until I actually started using it full-time. It was back in the days of CRT monitors when I first noticed banding artifacts, to my shock and amazement! I immediately created some test patterns to confirm what I thought I saw, and I was absolutely right: I could clearly see the boundaries between levels that differ by a single intensity step, even in fairly normal viewing conditions. I know the graphics card was outputting full 8-bit resolution, because I could distinctly see each intensity level in at least some parts of the scale.
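
    A test pattern like that is trivial to generate yourself. Here's a minimal sketch (dimensions and filename are arbitrary choices of mine) that writes a single-step grayscale ramp as a binary PPM:

        /* Write a 1024x256 horizontal grayscale ramp as a binary PPM.
           Each 8-bit level spans roughly 4 pixels; if you can see where
           one band ends and the next begins, you're resolving
           single-step differences. */
        #include <stdio.h>

        int main(void)
        {
            const int w = 1024, h = 256;
            FILE *f = fopen("ramp.ppm", "wb");
            if (!f)
                return 1;
            fprintf(f, "P6\n%d %d\n255\n", w, h);
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    unsigned char v = (unsigned char)(x * 255 / (w - 1)); /* 0..255, left to right */
                    fputc(v, f); fputc(v, f); fputc(v, f);                /* R = G = B */
                }
            }
            fclose(f);
            return 0;
        }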

    Since then, I've noticed plenty of banding with 8 bits per channel. Right now, I can see it in the yellow part of the sky in my desktop wallpaper -- a massive panorama taken by the Mars rover, displayed on a true 8-bit (SDR) LCD. I know those artifacts aren't baked into the image, since I took the full-resolution image from NASA and cropped/scaled it myself. Moreover, I even used a color picker to confirm that they really are just single-step intensity differences.
