NVIDIA 545.29.02 Linux Driver Released With Much Better Wayland Support


  • M.Bahr
    replied
    Sonadow, you are trolling.



  • Sonadow
    replied
    Originally posted by M.Bahr View Post

    You seem to have some misconceptions about GNU/Linux. We don't hate stable ABIs. Linus set one main principle for Linux: "Don't break userspace." This gives the developers and the users a reliable base. For an open source driver stack like Mesa this principle can be upheld while still extending features.
    But with an out-of-tree kernel module like Nvidia's, without a stable ABI, we have no guarantee of anything and cannot debug and fix issues as easily. We need more documentation of Nvidia's hardware functions. But a big portion of the control seems to have been moved into Nvidia's oversized firmware blob. This could put nvk and nouveau at an even bigger disadvantage than they already are.

    As for your examples of GTK3-4 and Qt5-6, you are comparing apples to oranges. If a developer doesn't like GTK for whatever reason, they can switch to another GUI toolkit, or simply fork GTK or Qt and customize it for their special needs. But do you have such options and freedom with Nvidia's drivers? No, you don't. You depend entirely on their sad record of mediocre support for GNU/Linux, and that is a big problem. Maybe Nvidia's policies will change in a decade or so, but until then I can't recommend their products to any of my friends or customers.
    No, this is just Linux wanting to have its cake and eat it too.

    You whine that without a stable ABI from Nvidia you have no guarantee of anything. Have you ever thought about how countless other hardware vendors must have felt when they wanted to target the Linux kernel for driver development?

    You made your bed, now lie in it.



  • M.Bahr
    replied
    Originally posted by Sonadow View Post

    Why are you and other Linux users complaining about the lack of a stable ABI for Nvidia hardware? Linux users hate stable ABIs. So much so that you all constantly point to the overused "stable-api-nonsense" file that Linus left in the kernel. Unstable ABIs mean developers have the "freedom" to break things in the name of improvement.

    So why are you complaining? Nvidia is giving you the unstable ABI you all wanted and insisted on.

    And why are you all complaining that Qt6 breaks away from Qt5, or GTK4 breaks away from GTK3? Or when Python 3 breaks from Python 2 and every Perl release breaks an older one? After all, stable ABIs are all nonsense.
    You seem to have some misconceptions about GNU/Linux. We don't hate stable ABIs. Linus set one main principle for Linux: "Don't break userspace." This gives the developers and the users a reliable base. For an open source driver stack like Mesa this principle can be upheld while still extending features.
    But with an out-of-tree kernel module like Nvidia's, without a stable ABI, we have no guarantee of anything and cannot debug and fix issues as easily. We need more documentation of Nvidia's hardware functions. But a big portion of the control seems to have been moved into Nvidia's oversized firmware blob. This could put nvk and nouveau at an even bigger disadvantage than they already are.

    As for your examples of GTK3-4 and Qt5-6, you are comparing apples to oranges. If a developer doesn't like GTK for whatever reason, they can switch to another GUI toolkit, or simply fork GTK or Qt and customize it for their special needs. But do you have such options and freedom with Nvidia's drivers? No, you don't. You depend entirely on their sad record of mediocre support for GNU/Linux, and that is a big problem. Maybe Nvidia's policies will change in a decade or so, but until then I can't recommend their products to any of my friends or customers.
    Last edited by M.Bahr; 22 November 2023, 05:05 PM.



  • Sonadow
    replied
    Originally posted by M.Bahr View Post

    Nvidia's proprietary Linux driver is still a mixed bag for their customers. Performance is often below that of Nvidia's Windows driver, and I doubt Nvidia will ever improve the quality of their Linux drivers unless more people move from Windows to Linux. Sadly there is not much hope for an open source alternative like nouveau and nvk either, as Nvidia's proprietary firmware blob offers no stable ABI.
    Why are you and other Linux users complaining about the lack of a stable ABI for Nvidia hardware? Linux users hate stable ABIs. So much so that you all constantly point to the overused "stable-api-nonsense" file that Linus left in the kernel. Unstable ABIs mean developers have the "freedom" to break things in the name of improvement.

    So why are you complaining? Nvidia is giving you the unstable ABI you all wanted and insisted on.

    And why are you all complaining that Qt6 breaks away from Qt5, or GTK4 breaks away from GTK3? Or when Python 3 breaks from Python 2 and every Perl release breaks an older one? After all, stable ABIs are all nonsense.



  • M.Bahr
    replied
    Originally posted by Rovano View Post
    AMD cards on Linux no longer lose to Windows, do they? Maybe in performance?
    Nvidia's proprietary Linux driver is still a mixed bag for their customers. Performance is often below that of Nvidia's Windows driver, and I doubt Nvidia will ever improve the quality of their Linux drivers unless more people move from Windows to Linux. Sadly there is not much hope for an open source alternative like nouveau and nvk either, as Nvidia's proprietary firmware blob offers no stable ABI.


    The Mesa RADV driver has been very performant for a long time now; it can often overtake AMD's Windows driver. On the compute side, I hope we can someday use Rusticl as a viable cross-vendor alternative to end the ongoing CUDA vendor lock-in.



  • Panix
    replied
    Originally posted by M.Bahr View Post

    With those prejudices you have just exposed that you haven't used modern AMD GPUs in a long time. Most of the points are not true, except for compute at the moment and HDMI 2.1. But guess what, that is not even a disadvantage, since it turns out HDMI cables mostly have worse signal quality than DisplayPort cables, which more experienced users than you obviously prefer. Software support for compute is also improving. But of what use is a high-end Nvidia GPU with Nvidia's drivers for Linux when they often perform worse than on winblows? Thank you, Nvidia, for contributing to Linux's bad reputation with your garbage drivers: https://youtu.be/iJsUcVOmZAY
    But don't worry, I am not going to convince you. Just throw more of your money down Nvidia's throat, because ...

    Ngreedia, the way it's meant to be paid!
    Fair enough - I haven't used an AMD GPU in Windows or Linux for a long time. But I'm going by what AMD GPU users/owners themselves report about performance and issues in compute/Blender, ML, video editing and gaming - and I concede it's fine for gaming. I think it's a coin flip for games in Windows, and the FOSS driver on Linux probably makes it a good fit - but gaming ranks... I dunno, third, maybe, on my priority list.
    Do you use an AMD GPU for anything other than gaming? I have read of some HDMI audio problems - I dunno if that's mostly AMD or Nvidia or both. So I suppose there are some HDMI issues, but my critique is that HDMI 2.1 doesn't work at all - which rules out modern monitors and especially 4K TVs, and I care about the latter. I use a large screen - 50" - and I'm not changing anytime soon... if anyone wants to buy a 50" monitor, great, but I don't have 3 grand for a monitor that size. So I use a 4K TV - in the future I'll probably want one with HDMI 2.1, maybe 120 Hz - but AMD GPUs don't work with HDMI 2.1 because of the HDMI Forum. That's not my fault or problem. I have to go with the features and options I want and prefer. Every time I look at the features AMD supports, they are lacking this and that. I find it difficult to recommend or support AMD because of this.
    Edit: Don't tell me to use DisplayPort either. That cop-out is unacceptable and nonsense.



  • Rovano
    replied
    Originally posted by M.Bahr View Post

    With those prejudices you have just exposed that you haven't used modern AMD GPUs in a long time. Most of the points are not true, except for compute at the moment and HDMI 2.1. But guess what, that is not even a disadvantage, since it turns out HDMI cables mostly have worse signal quality than DisplayPort cables, which more experienced users than you obviously prefer. Software support for compute is also improving. But of what use is a high-end Nvidia GPU with Nvidia's drivers for Linux when they often perform worse than on winblows? Thank you, Nvidia, for contributing to Linux's bad reputation with your garbage drivers: https://youtu.be/iJsUcVOmZAY
    But don't worry, I am not going to convince you. Just throw more of your money down Nvidia's throat, because ...

    Ngreedia, the way it's meant to be paid!
    What does not work with Nvidia?
    What hasn't worked for years? VA-API broke; that was the only thing bothering me.
    At the moment hardware acceleration is not enabled by default in browsers under Wayland, but I believe it can be turned on manually. What else doesn't work? I'm on the lowlatency 6.5 kernel with an older Nvidia driver.
    Tip: I noticed that under Wayland Nvidia currently has better power consumption (7 W on my desktop PC) than under X11 (24 W) on KWin (see the measurement sketch after this post). The 545 driver series is supposed to bring new power management; I'm curious about it.
    In GNOME, I think consumption is low in both.

    AMD cards on Linux no longer lose to Windows, do they? Maybe in performance?
    Last edited by Rovano; 03 November 2023, 09:53 AM.
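
    A minimal sketch of how one might compare that idle power draw between a Wayland and an X11 session, assuming the proprietary driver's nvidia-smi tool is installed; the script itself, the sample count and the interval are illustrative choices, not something from the post:

    ```python
    #!/usr/bin/env python3
    """Sample the NVIDIA GPU's reported power draw and print the average.

    Run once in a Wayland session and once in an X11 session to compare idle
    consumption, as described in the post above. Assumes the proprietary
    NVIDIA driver is installed, which provides the nvidia-smi query interface.
    """
    import subprocess
    import time

    SAMPLES = 10          # number of readings (illustrative choice)
    INTERVAL_SECONDS = 2  # pause between readings (illustrative choice)

    def read_power_watts() -> float:
        # --query-gpu=power.draw reports the current board power draw in watts.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        # One line per GPU; sum them in case of a multi-GPU system.
        return sum(float(line) for line in out.strip().splitlines())

    readings = []
    for _ in range(SAMPLES):
        readings.append(read_power_watts())
        time.sleep(INTERVAL_SECONDS)

    print(f"average power draw over {SAMPLES} samples: "
          f"{sum(readings) / len(readings):.1f} W")
    ```

    Averaging a handful of readings in each session type should make a gap like the reported 7 W (Wayland) versus 24 W (X11) easy to spot.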



  • M.Bahr
    replied
    Originally posted by Panix View Post

    Isn't AMD worth over a billion dollars? Possibly in the neighborhood of 180 billion?!? They're worth that much, but they can't do anything about their lack of features for fan control, voltage regulation, HDMI 2.1, compute support, using more than one monitor or display, power draw during video playback, etc.? Seriously, AMD?!? LOL!
    Too many AMD fanboys on here?
    With those prejudices you have just exposed that you haven't used modern AMD GPUs in a long time. Most of the points are not true, except for compute at the moment and HDMI 2.1. But guess what, that is not even a disadvantage, since it turns out HDMI cables mostly have worse signal quality than DisplayPort cables, which more experienced users than you obviously prefer. Software support for compute is also improving. But of what use is a high-end Nvidia GPU with Nvidia's drivers for Linux when they often perform worse than on winblows? Thank you, Nvidia, for contributing to Linux's bad reputation with your garbage drivers: https://youtu.be/iJsUcVOmZAY
    But don't worry, I am not going to convince you. Just throw more of your money down Nvidia's throat, because ...

    Ngreedia, the way it's meant to be paid!
    Last edited by M.Bahr; 03 November 2023, 10:59 AM.



  • Panix
    replied
    Originally posted by M.Bahr View Post

    Exactly this! But when things still don't work as expected for die-hard Nvidia users, we typically hear them blaming Wayland again. Because how could a trillion-dollar corporation not be able to achieve Wayland compliance, right? To them it is always the fault of Wine, vkd3d or Wayland. This is really ridiculous, just like the wait for Nvidia is. Even iGPUs from the competition that are over ten years old run better on Wayland than Nvidia GPUs with their closed drivers. It is about time we saw Nvidia seriously committing to a Linux standard that is more than ten years old. They will need another couple of years, I guess, if they are ever serious about the transition in the first place.

    But even when they do, you always have to wait weeks or even months until their mediocrely debugged drivers are released for new kernel versions. This is a major drawback for people who need to run brand-new software. Sorry, I cannot recommend that garbage to anyone around me. The only hope is nouveau and nvk, but expect a lot of water to flow down the river before they can be called a viable alternative. Mesa RADV and ANV took years to get where they are now.
    Isn't AMD worth over a billion dollars? Possibly in the neighborhood of 180 billion?!? They're worth that much, but they can't do anything about their lack of features for fan control, voltage regulation, HDMI 2.1, compute support, using more than one monitor or display, power draw during video playback, etc.? Seriously, AMD?!? LOL!
    Too many AMD fanboys on here?



  • shinger
    replied
    Originally posted by pong View Post
    Well, NV has some pretty fine developer tools, e.g. CUDA; though proprietary, at least they run on almost any GPU they've made in the past 15 or so years with programmable shader support. Yes, I would rather have seen an open standard (OpenCL, SYCL, OpenACC, something with good multi-vendor support). But given when they started and the relative lack of competition, at least they did something free-as-in-beer to get and use, and which works well. We probably wouldn't have a lot of the HPC/AIML stuff we have today, to the extent we have it, if it weren't for NV/CUDA.

    Also, there seem to be unexpected limitations showing up with other compute languages and GPU vendors, e.g. Intel's 4 GB OpenCL memory allocation limit vs. NV's "unified/heterogeneous memory", AMD's persistent lack of good launch-day/month/year ROCm support for consumer GPUs, and Intel's hit-and-miss OSS support.

    Intel, on the one hand, has a decent starter dGPU line and decent basic OSS Linux driver support, but then incredibly does stupid stuff on top of that, like an MS Windows control panel that couldn't even control the fans (at least in the first many months' versions), and OSS documentation for many of the graphics aspects of their dGPUs but not (AFAICT) a shred of documentation on some of the most basic utilitarian things needed for Linux use -- controlling the fans, monitoring the temperatures / voltages / clocks, controlling the LEDs, setting the power limit, supporting the ASPM/PM functions. So the GPU sits "almost idle" 98% of the time drawing 41 W when, if the ASPM / power management worked (i.e. there were a CLI tool, or at least the documentation to write an OSS one), it would draw well under 20 W sitting at the desktop in many cases.

    So it's no party having NV, because it's too closed in driver / documentation / code / OSS support to have highly functional OSS at this time. But given their new-ish open kernel modules and the published documentation of some tools / APIs, I wonder whether one could theoretically end up with a better free-beer plus free-OSS "support" system for NV GPUs that is actually equal to or better than what Intel / AMD have now. Never mind that nouveau is years behind on the amount of refactoring / support needed to take advantage of all the information / code / tools they do have access to for Ampere, Lovelace, et al.

    AMD could maybe be almost perfect for Linux if they'd supply ROCm support plus whatever is still missing for control / monitoring (clocks / power / fans / video codecs / whatever may not yet be published / OSS).

    And they are ALL horrible, AFAICT, with regard to stuff that should be essential for a dGPU -- SR-IOV and guest VM compute / graphics support: zero in both MS Windows and Linux.

    In a way I think Apple / Mac has a better "solution" architecturally -- M1/M2/M3 with "unified memory" and very wide RAM buses with high-bandwidth options, so in the high-end models the DRAM bus actually ends up with GPU-like memory bandwidth, and CPU memory is unified with GPU memory, so one doesn't have to use ONLY GPU resources for HPC compute but can also use CPU resources competitively with what the GPU adds. Of course NOTHING there is open / OSS, tragically.

    In reality we've got to stop relying on TOYS (gaming GPUs) as the foundation of modern desktop PC compute / graphics. AMD / Intel / ARM / RISC-V or whoever needs to put a high-bandwidth, wide V/RAM bus on standard desktop processors, with an open ISA and many cores, so we can have a truly open and architecturally unified / distributed approach to GPU / compute / graphics / HPC.

    Back in 2008, when I decided to make Ubuntu Linux my daily driver, I had to choose between letting go of gaming (good gaming support on Linux was lacking back then) or keeping Windows, which I hated. I decided to go all Linux.

    So I wasn't even asking a lot from a GPU on Linux anymore: simply day-to-day tasks, windows rendering normally, a display server that works fine, and at most wobbly windows and the cube with Compiz, plus hardware acceleration for playing some x264 videos. Yet over about 15 years I still kept running into issues with Nvidia drivers. Don't get me wrong, I don't blame the Nvidia developers for it, but the useless management, because they decide what to support and what not. We can see it in the open source world: very often the quality of what gets developed is just awesome, while most of the time closed source applications are a joke compared to the OSS alternative.

    I am glad that at least we have the option of choosing AMD, so as not to experience those issues.

