NVIDIA Doesn't Expect To Have Linux 5.9 Driver Support For Another Month


  • Originally posted by birdie View Post

    I don't need your sympathy or your nod. Even when surrounded by Open Source fans, even while I've been using Linux for more than two decades, I still have my wits about me and I don't blindly hate companies because they have their own PoV. The only thing I cannot forgive NVIDIA for is their reluctance (which is yet to be explained) to release firmware for their GPUs in order to let nouveau work properly. Hating is stupid and counterproductive anyways.
You are telling us that hating a company is counterproductive, while writing hateful words directed at actual people doesn't seem to worry you in the least. Check that with your therapist.

I am well aware of both Nvidia and AMD's power efficiency, thank you very much. It shows both the RX 5600 and 5700 doing fine, by the way. You'd better pick the 4K performance if you really want to demonstrate the RTX 3080's true efficiency. But then, we're not talking about TDP and price, so you're evading part of the problem. Oh, and everybody knows the RX 500 series was flawed, but do you want to talk about GTX 8000's? Or GTX 5000's? Or even GeForce 4 vs Radeon 9700? Who's counterproductive now?

    Does the fact that I pick my parts from whoever fits me the most make me a "fanatic?"
    Who is screaming in this thread?
Why do you want people to exercise "modesty" when you brag about your many years using Linux and your contributions to open source software as a means to have people keep their mouths shut?

I was only stating my views on the subject, and they don't need your approval.
    Last edited by omer666; 19 October 2020, 06:11 PM.



    • Originally posted by omer666 View Post
      Where did you sense any sympathy in my words?

You are telling us that hating a company is counterproductive, while writing hateful words directed at actual people doesn't seem to worry you in the least. Check that with your therapist.
      I quoted your words about sympathy - make sure you at least remember what you said earlier. I don't have a therapist yet.

Speaking of hateful words: are you talking about me calling people "fanatics" on several occasions? But the people I've referred to hate the company with a passion for absolutely no reason. Again, I've repeated it three times already: does NVIDIA make anyone buy their GPUs? No? Do they impede Linux development? No? Do they do anything bad aside from providing closed-source drivers for a God-forsaken OS? No? Then why so much unwarranted hatred, as if they'd killed your children?

      Originally posted by omer666 View Post
I am well aware of both Nvidia and AMD's power efficiency, thank you very much. It shows both the RX 5600 and 5700 doing fine, by the way. You'd better pick the 4K performance if you really want to demonstrate the RTX 3080's true efficiency. But then, we're not talking about TDP and price, so you're evading part of the problem. Oh, and everybody knows the RX 500 series was flawed, but do you want to talk about GTX 8000's? Or GTX 5000's? Or even GeForce 4 vs Radeon 9700? Who's counterproductive now?
Look how you've carefully chosen good AMD products and carefully ignored all their fiascos. And you've carefully ignored all the great NVIDIA products starting with the GeForce4 Ti 4200. How nice of you. The RX 5000 series is the first one to truly feature a good performance/TDP ratio in what, eight years?

      Radeon HD 7000 series: decent, oh, wait it was in 2012
      Radeon 200 series: OC'ed rebrandeon with bad thermals
      Radeon 300 series: OC'ed rebrandeon with bad thermals
      Radeon RX 400 series: not good
      Radeon RX Vega series: not good, cost AMD a fortune to produce, deprecated fast because margins were horrible.
      Radeon RX 500 series: bad
Radeon VII: not good; a top-tier product released two years after the GTX 1080 Ti, for the same price, featuring the same performance and performance/TDP ratio, except the GTX 1080 Ti was produced on a 16 nm node and the Radeon VII on a 7 nm node.

      Originally posted by omer666 View Post
      Does the fact that I pick my parts from whoever fits me the most make me a "fanatic?"
      Who is screaming in this thread?
Why do you want people to exercise "modesty" when you brag about your many years using Linux and your contributions to open source software as a means to have people keep their mouths shut?

I was only stating my views on the subject, and they don't need your approval.
It depends on how you choose these parts. If you go around screaming "NVIDIA fuck you" and buy anything by AMD just to avoid buying better NVIDIA products, that makes you nothing but a fanatic. Rational people buy products based only on their merits. Fanatics will buy everything from their company because ... they have tribalism in their genes instead of reason. And this forum and lots of comments here are a prime example of that: "we love our Linux tribe and we hate everything which is not Open Source with a vengeance". Look where it's taken Linux in the past 30 years. Practically nowhere. 99% of people have never heard of it.

      There's one thing which Open Source fans will never see in Linux (in its current form and current development model): native AAA games. And Open Source AAA games will not happen ever. Good luck having tons of fun with Wine/Proton (and don't get me started on various anticheat solutions which just do not work under Linux).

Lastly, I don't gag people. You're now making things up.
      Last edited by birdie; 19 October 2020, 08:44 PM.



      • Originally posted by birdie View Post
        Much touted RX 500 series cards are horrible in terms of power consumption, thermals, noise and effectiveness.
        Dear Birdie,

those graphs do not mean much -- I would say they are completely misleading.
What they show is the top end of the power consumption scale at maximum FPS; they do not represent typical power consumption (or rather, power consumption for a given scenario/usage).
Are you familiar with the power vs rpm plots used to characterise engines?
A GPU is more or less like an engine: it can scale its power usage up or down depending on what is asked of it.
The important thing is to understand that characteristic and where your usage lands on the power curve.

To give you an example, I ran Dota 2 on an RX 480 using Vulkan (RADV+ACO), 1080p, full screen, logging the output of 'sensors' every second for 30 seconds while changing the MAX FPS setting.
I just left the game running while watching a live event.
The results are as follows:
MAX FPS    Average power (W)
60         56
90         60
240*       90
*: my card does not achieve 240 FPS -- here it essentially means FPS uncapped

In my use case, Dota 2 maxed out with VSync on does not consume more than 60 Watt on average.
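For anyone who wants to repeat that kind of logging without parsing the 'sensors' output, here is a minimal sketch that samples the card's reported power draw straight from sysfs and averages it. It assumes an amdgpu card whose hwmon node exposes power1_average in microwatts; check your own /sys/class/hwmon layout before relying on it.

#!/usr/bin/env python3
# Sample average GPU power draw via the amdgpu hwmon interface.
# Assumption: a hwmon node named "amdgpu" exposing power1_average
# in microwatts (paths may differ on other setups).
import glob
import time

def find_amdgpu_power_file():
    """Return the power1_average path of the first amdgpu hwmon node."""
    for node in glob.glob("/sys/class/hwmon/hwmon*"):
        try:
            with open(f"{node}/name") as f:
                if f.read().strip() == "amdgpu":
                    return f"{node}/power1_average"
        except OSError:
            continue
    raise RuntimeError("no amdgpu hwmon node found")

def average_power(seconds=30, interval=1.0):
    """Sample power draw once per interval and return the average in watts."""
    path = find_amdgpu_power_file()
    samples = []
    for _ in range(int(seconds / interval)):
        with open(path) as f:
            samples.append(int(f.read()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"average power over 30 s: {average_power():.1f} W")

Run it while the game is going with a given FPS cap, note the average, change the cap, repeat.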

        Now, the above is just an exercise, but it would be useful to understand how the different GPUs (across vendors) scale their power at LOCKED FPS.
I would rather buy a GPU that gives me a locked 60 FPS (MAX/HIGH/ULTRA settings) at less than 60 Watt than a GPU that gives me 300+ FPS at 300+ Watt.

Unfortunately I do not know which GPU is which, as I have yet to come across a report on the above across the different 'I give you the numbers to buy your new GPU' websites.

TBH, I would not be surprised (given what is shown in the article on FPS per Watt) if in the end all the GPUs (red vs green, open vs closed) turned out to be equal at mainstream settings (1080p, 60 FPS) in terms of power (and thus thermals and, effectively, noise).
        Last edited by Grinness; 19 October 2020, 07:04 PM.



        • Originally posted by cyni
          In what sense is it not ready? In many ways it is simpler to use than the competition.
          Did you semi-quote my name on purpose so I would not find your post?

          How is it "simpler" to use if it requires you to read a thousand manuals, use a terminal and even potentially learn programming just to do a simple task?!

          Originally posted by cyni
          Outside of specialized proprietary software made for professionals with no open-source equivalents, it seems to have covered all the important ground
          Video editing?



          • Originally posted by Grinness View Post
            Dear Birdie,
            Dear Grinness,

            I don't understand what you're trying to say here. That certain not exactly power efficient GPUs work OK'ish in relatively simple games with VSync turned on? But what about more complex titles? Should people buy their GPUs, play carefully selected games or enable VSync or even decrease image quality to achieve good thermals? Sorry, very few people do that (btw I also do it for certain non-demanding games like King's Bounty). Most people however just install their video cards, enable maximum image quality and play. Lastly if you're frugal you've got a chance of buying the GTX 1650 with a 75W TDP or the RX 570 which has mostly the same performance at 167W. The choice is yours.

            And lastly, a thing which people choose not to discuss here: with NVIDIA I can choose which drivers to use. With AMD your drivers are part of your kernel and Mesa. You cannot easily shuffle these packages around when regressions occur: in fact most people have no idea how to downgrade the Mesa package or roll back the kernel.



            • Originally posted by birdie View Post

              Dear Grinness,

              I don't understand what you're trying to say here. That certain not exactly power efficient GPUs work OK'ish in relatively simple games with VSync turned on? But what about more complex titles? Should people buy their GPUs, play carefully selected games or enable VSync or even decrease image quality to achieve good thermals? Sorry, very few people do that (btw I also do it for certain non-demanding games like King's Bounty). Most people however just install their video cards, enable maximum image quality and play. Lastly if you're frugal you've got a chance of buying the GTX 1650 with a 75W TDP or the RX 570 which has mostly the same performance at 167W. The choice is yours.

              And lastly, a thing which people choose not to discuss here: with NVIDIA I can choose which drivers to use. With AMD your drivers are part of your kernel and Mesa. You cannot easily shuffle these packages around when regressions occur: in fact most people have no idea how to downgrade the Mesa package or roll back the kernel.
              Birdie,

what I mean is very simple: TDP (as with CPUs) and (max FPS)/(max power) do not mean a thing if you do not contextualise them (e.g. low/medium/high grade GPU, type of usage).
In fact, what you call 'not exactly power efficient' is perfectly in line with expectations for the majority of cards of the same era/price/grade (the RX 480 is from 2016, on a 14 nm process node).
              The article you link in fact states the following, w.r.t. the GTX 1650 OC:

              "Another great alternative is the Radeon RX 5500 XT, which starts at $180 with a similar performance uplift and much quieter coolers, especially the Sapphire Pulse is a good choice here. If you are willing to buy used, then the Radeon RX 570 or RX 580 are definitely worth a look as they are being sold off at bargain prices because people are upgrading. Yes, they don't have the latest tech and draw more power, but they are still compatible with all new games, which would help you pass the time until next-gen becomes available."
              https://www.techpowerup.com/review/g...-gddr6/35.html

With the huge list of cards in the image you link, it is like arguing which car is better between a city car (any type/any brand) and a sports car (any type/any brand) without knowing the use you are going to make of it (city driving, highway, Sunday racing, ...).

Finally, with AMD you have 3 drivers to choose from: Mesa, AMD-PRO, AMD-Open -- no need to recompile anything (and people even complain about the 'too many options').
              With NVidia you have 1 driver (unfortunately nouveau is not a real option, until NVidia ....)



              • Wow, downgrading a package, such a feat!

                No, really, you do the same thing you are blaming others for. You are such an Nvidia fanatic that you compare GPUs that aren't even the same generation (cf RX 570 vs GTX 1650). You're clever enough to note that I ignored AMD's failures on purpose, but not enough to realise I was just mimicking your behavior.

And I am sorry to say this, but the RX 200, 300 and 400 series were in fact quite good; they had a higher power draw, but not by much compared to Nvidia's offerings at the time. For around the same power draw and price as a GTX 960, you could have an RX 470 which outperformed it nicely, for example.



                • Originally posted by birdie View Post
                  Wayland is still a toy, incomplete and doesn't offer benefits for most users out there while still having rough edges. I don't understand why NVIDIA has to support or "comply" with it. What if you created a brand new yet another graphics server tomorrow? Should NVIDIA also support it?
                  This is because you have not read the Wayland requirements.


Wayland compositors want KMS support. Interestingly enough, we are pretty much at the point where all X11 graphics drivers bar Nvidia's support KMS. So in reality Nvidia does not support the x.org X11 server properly. This is getting more important as distributions are running the x.org X11 server without root privileges, so it cannot fall back to the old VESA driver.
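If you want to check this on your own machine, here is a rough sketch that lists which kernel driver is bound to each DRM card and which KMS connectors it exposes. It assumes the usual /sys/class/drm layout, and that connector entries only show up when the bound driver actually does kernel mode setting.

#!/usr/bin/env python3
# Rough check of which DRM cards have an active KMS driver: show the
# kernel driver bound to each card and the connectors it exposes.
import glob
import os

for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
    name = os.path.basename(card)                       # e.g. "card0"
    driver_link = os.path.join(card, "device", "driver")
    if os.path.islink(driver_link):
        driver = os.path.basename(os.readlink(driver_link))  # e.g. "amdgpu"
    else:
        driver = "unknown"
    # Connector nodes are siblings named like card0-HDMI-A-1.
    connectors = [os.path.basename(c).replace(name + "-", "")
                  for c in glob.glob(f"{card}-*")]
    if connectors:
        print(f"{name}: driver={driver}, KMS connectors: {', '.join(connectors)}")
    else:
        print(f"{name}: driver={driver}, no KMS connectors exposed")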

The hard reality here is that Nvidia not supporting Wayland also means Nvidia not properly supporting the x.org X11 server in its most secure configuration. Moving from user mode setting to kernel mode setting is a security change. It is also about bringing kernel-managed memory and GPU-managed memory into alignment to reduce privilege exploits.

The reality is that Nvidia needs to get their driver working correctly in KMS mode. EGLStreams is Nvidia's attempt to avoid KMS, and the result of EGLStreams (and why no one else is going that route) is that GPU-managed memory sits under one system and CPU-managed memory under another, allowing privilege errors.

Yes, I would like Nvidia to support future graphics servers. But the reality is that Nvidia does not properly support the current x.org X11 server or the Linux framebuffer. Proper Wayland support from Nvidia would just be icing on the cake once the x.org X11 and Linux framebuffer issues are fixed correctly. And yes, this requires Nvidia to accept that the way forward is KMS.



                  • Originally posted by mdedetrich View Post
                    Its actually hilarious because if Linux had a Hybrid/Micro Kernel design this wouldn't even be a discussion. The NVidia blob would be sitting in userspace (or Ring 0 environment) like any other program and would communicate with the kernel via some interface.
You forget that X11, as x.org or XFree86, had a thing called user mode setting. That was graphics drivers running fully in user space. Linux can, like a microkernel, have drivers in userspace.

                    https://en.wikipedia.org/wiki/Hybris_(software)

Android has also had userspace graphics drivers on a Linux kernel. Now, there is a performance overhead to having drivers in userspace. So the Nvidia blob could be sitting fully in userspace today with Linux, if Nvidia were happy with the overhead.

Nvidia absolutely want to be in ring 0; if they were happy in userspace they could have been using something like user mode setting under Linux.

                    Originally posted by mdedetrich View Post
                    This issue has less to do with GPL than people think, its more to do with Linux sticking with an arguably archaic technical design for their kernel (every other kernel out there that has significant usage is either micro or hybrid kernel).
There is an interesting point here: the Mach kernel, which is called a microkernel today, started out monolithic. Yes, the base of OS X and of what runs on iPhones is a hybrid between the Mach microkernel and the BSD monolithic kernel. In common usage you don't find pure microkernels that often.

The Linux kernel and the FreeBSD kernel are often called monolithic kernels, but when you look closer they are not a neat fit.


                    Generic PCI UIO driver

                    The generic driver is a kernel module named uio_pci_generic. It can work with any device compliant to PCI 2.3 (circa 2002) and any compliant PCI Express device. Using this, you only need to write the userspace driver, removing the need to write a hardware-specific kernel module.
Windows' NT design is a hybrid between microkernel and monolithic ideas. OS X, you could say, is another form of hybrid. The horrible reality is that, with FUSE, UIO and other things, Linux is yet another implementation of a hybrid between microkernel and monolithic.
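To make that concrete, here is a minimal sketch of what the userspace side of a uio_pci_generic driver looks like once the device has been bound to it: map a BAR through the device's sysfs resource file and wait for interrupts by reading the UIO device node. The PCI address, UIO node and register offset below are placeholders for illustration only; substitute the values for your own device and run as root.

#!/usr/bin/env python3
# Skeleton of a userspace PCI driver on top of uio_pci_generic.
# PCI_DEV, UIO_DEV and the register offset are hypothetical examples.
import mmap
import struct

PCI_DEV = "/sys/bus/pci/devices/0000:03:00.0"   # placeholder PCI address
UIO_DEV = "/dev/uio0"                            # node created once bound

# Map BAR0 through the sysfs resource file and read a 32-bit register
# (assumes the BAR is at least one page long).
with open(f"{PCI_DEV}/resource0", "r+b") as f:
    bar0 = mmap.mmap(f.fileno(), 4096)
    reg0 = struct.unpack("<I", bar0[0:4])[0]     # register at offset 0x0
    print(f"BAR0[0x0] = {reg0:#010x}")
    bar0.close()

# Block until the device raises an interrupt; the read returns the
# total interrupt count as a 32-bit integer (standard UIO behaviour).
with open(UIO_DEV, "rb", buffering=0) as uio:
    count = struct.unpack("<I", uio.read(4))[0]
    print(f"interrupts so far: {count}")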



                    • Originally posted by birdie View Post
                      And lastly, a thing which people choose not to discuss here: with NVIDIA I can choose which drivers to use. With AMD your drivers are part of your kernel and Mesa. You cannot easily shuffle these packages around when regressions occur: in fact most people have no idea how to downgrade the Mesa package or roll back the kernel.
Needing to roll back the kernel when an issue happens turns out to be an equal risk with AMD and Nvidia drivers. I have run both Nvidia and AMD over the years. With Nvidia you quite often hit the case where a new kernel will not take the Nvidia third-party module, so you need to roll back the kernel or, in worse cases, install a completely different kernel so the Nvidia driver will install (there are kernel build options distributions can use that break the Nvidia driver). With AMD you don't have the problem of the driver not going into the kernel, but the same steps to roll back the kernel version or install a new kernel that you use with Nvidia you use with AMD in exactly the same way; the only difference is that in the AMD case it is to change the driver, while in the Nvidia case it is so you can install the driver at all.

On the kernel side, like it or not, AMD and Nvidia are much the same. The skills of building a kernel from source and rolling back a kernel to get around graphical issues are required with Linux, whether you are using Nvidia or AMD, if you run into trouble. The big difference is security. With AMD, to fix a graphical issue you will normally be moving to a newer kernel. With Nvidia, to fix a graphical issue you will find yourself stuck on older kernel versions so that the older driver installs.

Also, the AMD drivers provided with the distribution's kernel and Mesa are not the only option with AMD either.

There are the unified Linux drivers as well. So as an AMD user I can choose the kernel-provided AMD driver with distribution-provided Mesa, or go with the unified driver. So I have more driver choice than you do, because the kernel driver actually works.

Downgrading/upgrading Mesa is normally not done by AMD users on Linux; those are the cases where we fall back to the unified driver.

Basically, people don't raise the point you just did because it's not a real difference that disadvantages AMD. Really, all you have shown is that you have not used the AMD GPU stack on Linux long enough to know that this area is not a major difference. Worse, in most cases the advantage is in AMD's direction once you take security risks into account.



