AMD's Open-Source Mesa Driver Continues To Be Ruthlessly Optimized For Workstation Performance


  • GI_Jack
    replied
    Originally posted by Serafean View Post
    So it took 10+ years from playing complete catchup, to getting to the point "Binary blob might be completely forgotten about". Good job everyone involved.
    A pleasure to be an AMD linux customer.
    Is it really? The good news is that AMD on Linux works, and since most of it is FOSS, recompiling against arbitrary versions of the kernel or userspace is never an issue, which means greater compatibility. In terms of support, though, even if the number of quirks and issues has decreased, it's still a giant pain in the fucking balls: insane defaults, new drivers shipped upstream that lag behind and miss new kernel releases, and even basic stuff like fan control is wonky. You need a third-party Python daemon to handle it in userspace (see the sketch at the end of this post).

    Now, once you get it running, it runs like a champ, and does so with Free software, which is itself pretty impressive: it's reasonably performant and will run all modern software, as long as you don't expect maximum performance from the absolute most demanding workloads. Even that is changing, I guess, with the latest cards. And as well as the Zen CPUs work, you still need out-of-tree modules like zenpower for accurate monitoring.

    Now, which distros are supported with which cards? Ooh, another hard one. How many distros, on which versions, let you pull a working AMD OpenGL, Vulkan, and OpenCL stack straight from the supported repos?

    Compare with Nvidia: the binary drivers ship the full package of everything needed, downloadable from their website, and they do maintenance releases of old versions to keep old cards working with the latest X/kernel, etc. An Nvidia card is almost certain to work on any Linux distro.
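
    For illustration, here is a minimal sketch of the kind of userspace monitoring loop such a third-party daemon ends up doing, assuming the standard amdgpu hwmon sysfs interface (temp1_input in millidegrees Celsius, fan1_input in RPM, pwm1 as a 0-255 duty cycle). Exact paths and available files vary by kernel version and card, so treat it as illustrative rather than definitive:

    #!/usr/bin/env python3
    """Minimal sketch: read GPU temperature and fan state from the amdgpu
    hwmon sysfs interface. Assumes temp1_input (millidegrees C), fan1_input
    (RPM) and pwm1 (0-255) exist; exact files vary by kernel and card."""

    import glob
    import time

    def find_amdgpu_hwmon():
        # Each hwmon directory identifies its driver in a "name" file.
        for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
            try:
                with open(f"{hwmon}/name") as f:
                    if f.read().strip() == "amdgpu":
                        return hwmon
            except OSError:
                continue
        return None

    def read_int(path):
        with open(path) as f:
            return int(f.read().strip())

    if __name__ == "__main__":
        hwmon = find_amdgpu_hwmon()
        if hwmon is None:
            raise SystemExit("no amdgpu hwmon interface found")
        while True:
            temp_c = read_int(f"{hwmon}/temp1_input") / 1000  # millidegrees -> degrees C
            rpm = read_int(f"{hwmon}/fan1_input")             # current fan speed
            pwm = read_int(f"{hwmon}/pwm1")                   # fan duty cycle, 0-255
            print(f"GPU {temp_c:.1f} C  fan {rpm} RPM  pwm {pwm}/255")
            time.sleep(5)

    Actually changing the fan speed means writing pwm1 (after switching pwm1_enable to manual), which needs root and is exactly the kind of thing a proper daemon has to get right, which is why I'd rather the driver defaults just worked.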


  • smitty3268
    replied
    Originally posted by aufkrawall View Post
    I wonder if Intel Mesa and Windows use the same shader compiler.
    They do not.

    I believe the new GPUs have a new LLVM-based compiler, and there was some talk about sharing it and integrating it into Mesa, but that hasn't happened yet, and I think there's some pushback from the Intel Mesa devs, who don't want to do that.


  • smitty3268
    replied
    Originally posted by Qaridarium
    is it my Autism or what does "To Be Ruthlessly Optimized" mean ??

    does it mean they kill children for it ?

    what is Ruthlessly in writing software code ?
    Others have explained this, but it's a common English expression, not something you'd get from a dictionary definition.


  • boxie
    replied
    Originally posted by Qaridarium
    is it my Autism or what does "To Be Ruthlessly Optimized" mean ??

    does it mean they kill children for it ?

    what is Ruthlessly in writing software code ?
    My interpretation is chasing down every little optimization available, regardless of how much time is spent doing it.


  • boxie
    replied
    Originally posted by ResponseWriter View Post
    As for Intel's DG2, I've always found Intel had much more reliable Linux GPU drivers than AMD but up to now they've never had competitive hardware. It'll be interesting to see what they come up with.
    Might I suggest that it is because it is easier to write drivers for simpler hardware?

    I am looking forward to seeing what Intel can do on the gaming GPU front, and to seeing how well it runs under Linux versus Windows.


  • ResponseWriter
    replied
    Originally posted by krzyzowiec View Post

    That sucks. Not sure what is going on there, but I can say that this is certainly not the case with their 6800 GPUs. I am a very happy owner of one and it has been flawless since I acquired it, along with my Ryzen 5900x. I am a very happy AMD customer and I will stay that way if they keep up the good work and the good Linux support.
    My 6800 XT has been great too, apart from a couple of issues that were fixed some time ago, and it has been far more stable than my 290 ever was, despite the latter having been around for years. How it behaves over the next few years, when new development has unintended consequences for older hardware, will be the real test.

    As for Intel's DG2, I've always found Intel had much more reliable Linux GPU drivers than AMD but up to now they've never had competitive hardware. It'll be interesting to see what they come up with.


  • arQon
    replied
    Originally posted by Qaridarium
    is it my Autism or what does "To Be Ruthlessly Optimized" mean ??
    Not autism, just lack of vocabulary.

    Hitting e.g. 90% of the performance of PRO is "optimized". Getting that to 92%, then 93%, then 95%, then 96%, is optimizing ruthlessly.


  • krzyzowiec
    replied
    Originally posted by khnazile
    There's no sign of AMD doing anything about that. This is very disappointing, and that is one of the reasons why I can't recommend buying their products. Opensource drivers have a very little value when they don't work.
    That sucks. Not sure what is going on there, but I can say that this is certainly not the case with their 6800 GPUs. I am a very happy owner of one and it has been flawless since I acquired it, along with my Ryzen 5900x. I am a very happy AMD customer and I will stay that way if they keep up the good work and the good Linux support.


  • Viki Ai
    replied
    This is why I use Radeon GPUs at home, and at work where I can, to the point of having IT just buy me a Radeon card instead of upgrading my whole machine, since Dell is presently only shipping Nvidia-based systems, at least through what our IT department offers. I am less fussy about machines that will only ever run Windows, but this particular box is going to be running Linux for GPU-heavy applications, and life is too short and my time too valuable to be messing about with closed Nvidia drivers!


  • Teggs
    replied
    Originally posted by ferry View Post
    So, why is opengl performance measured with Snx?
    Because the poster Marek was replying to referenced Snx explicitly.
