NVIDIA GeForce RTX 3080 Linux Gaming Performance


  • #21
    Looking at the performance per Watt, it looks like NVIDIA is just pumping up power consumption to get more frames, essentially linearly.
    With the exception of Mad Max, the performance per Watt is equivalent for all cards (within about 0.05).
    I am not sure that this is a great advancement, apart from the chip (NVIDIA) being able to eat all that power (300+ Watts!).

    Overall I am also not sure that FPS per Watt alone, as currently calculated, means much.
    TBH it would be great to see performance per Watt at different capped framerates (@phoronix), e.g. 30, 60, 90, 120 FPS.
    That should reveal how power consumption scales; much like a car's torque curve, it would show the real power draw of those cards (AMD & NVIDIA) across the usage spectrum.

    If I had to pick a card now for 1080p-1440p, looking at FPS and FPS per Watt, I would pick a Radeon 5700 XT (often better minimum FPS than the 3080), but I am not sure what the power consumption would look like with (e.g.) vsync on (60 FPS).
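
    To make the point concrete, here is a minimal sketch of the capped-framerate efficiency table I have in mind. Every wattage number below is invented for illustration, not a measurement:

        # Sketch: FPS-per-Watt at capped framerates (all numbers hypothetical).
        # A real test would measure average board power at each cap while the
        # game runs, e.g. via nvidia-smi or AMD's hwmon sysfs interface.

        measurements = {
            "RTX 3080":   {30: 95, 60: 150, 90: 210, 120: 270},
            "RX 5700 XT": {30: 70, 60: 115, 90: 170, 120: 220},
        }

        for card, readings in measurements.items():
            for cap, watts in sorted(readings.items()):
                print(f"{card}: {cap:3d} FPS cap -> {watts:3d} W, {cap / watts:.2f} FPS/W")

    If the FPS/W column stays flat as the cap rises, the card scales linearly; if it drops off, you are paying disproportionately for the last frames.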



    • #22
      Originally posted by birdie View Post

      This couldn't be further from the truth as I've been using NVIDIA GPUs on Linux for almost 20 years now and I cannot report any serious issues.

      Yeah, Wayland? No, sorry. I like to use my PC, not to run experimental features.

      Join Date: Dec 2019, Posts: 16. Looks like you're new around here and already have a very valuable opinion. LMAO. I guess I've been using Linux longer than you've been alive.
      I find the NVIDIA Linux driver worse than terrible.
      It keeps crashing when I have less than 4 GB of free RAM, and that's without even using Wayland (which I would like to try).
      This useless stream of ones and zeros also prevents me from configuring my kernel the way I want.

      PS0. The only reason I have an NVIDIA card is because I didn't know what I was getting into when I bought it.
      PS1. Counting posts in order to win an argument, instead of proving your point, is proof that you are a loser...
      Last edited by tildearrow; 12 October 2020, 08:03 PM.



      • #23
        I'm not an Nvidia fanboy by any means; I've used both AMD/ATI and Nvidia cards over the past 25 years, but I have a much better experience with Nvidia on Linux, and usually Windows too. I've yet to get an AMD card to work with my Linux From Scratch build, even when I tried purposely building it for AMD, and on Linux Mint I've had a better experience with Nvidia as well. I do remember having to calculate back porch and front porch settings for XF86Config, and I don't know whether that would have been any different with AMD, but it was a pain back in 2004 to set up a 24" 1920x1200 monitor with Nvidia and Red Hat; we were required to use Nvidia cards at work.
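
        For anyone who never had to do that by hand: the porch values end up in an XF86Config modeline. As a sketch (the numbers here are a standard CVT reduced-blanking calculation for 1920x1200@60Hz, not my actual 2004 config, and the monitor identifier is made up):

            Section "Monitor"
                Identifier "DVI-0"
                # Modeline "name" clock  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal
                # horizontal front porch = hsyncstart - hdisp = 1968 - 1920 = 48 pixels
                # horizontal back porch  = htotal - hsyncend  = 2080 - 2000 = 80 pixels
                Modeline "1920x1200R" 154.00  1920 1968 2000 2080  1200 1203 1209 1235 +hsync -vsync
            EndSection

        These days `cvt -r 1920 1200` prints that line for you; back then you did the arithmetic yourself from the monitor's datasheet.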

        I love my RTX 2080 with Linux and gaming on Linux.

        And unless AMD pulls a rabbit out of their hat, I'll probably be buying a 4080 card in a couple of years.

        I do support open source and help contribute where I can, both with my wallet and with feedback and an occasional bug report or patch submission, but if binary blobs give the performance with less headache, I don't really care that they're closed source. It seems like a good way to protect your ideas and product.



        • #24
          Originally posted by birdie View Post
          This couldn't be further from the truth as I've been using NVIDIA GPUs on Linux for almost 20 years now and I cannot report any serious issues.
          My claim is not that Nvidia is impossible to use, but rather that it requires more pointless side quests (out-of-the-box troubleshooting). I own Nvidia cards because of CUDA; it is the only option. But on the desktop side, out of the box, I experience more issues, such as screen tearing on video playback and other assorted glitches, than I do with even low-end integrated graphics.
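
          For what it's worth, the usual workaround suggested for the tearing is forcing the composition pipeline, something along these lines (it helps on many setups, though I won't claim it fixes all of them):

              nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

          It trades a little latency for tear-free output, which is exactly the kind of side quest I mean.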

          Originally posted by birdie View Post
          I guess I've been using Linux longer than you've been alive.
          I started using Linux circa 2000, from a Red Hat for Dummies book. I am not saying that to present myself as some kind of authority on Linux, because I admittedly spent more time on the Windows side as a developer. But my recommendation is not about the last 20 years; it is about the current point in time: save money, skip Nvidia (unless you need CUDA).



          • #25
            I own pure AMD gear now (consumer stuff, not server/workstation), and my situation has been like that for over 18 months. It's solid, performant, and priced right for general usage, and I will likely continue to invest in AMD in the near future, as are other people around me.

            So there, birdie

            And I've been abusing Linux since 1998. Compiled my first kernel on a Compaq Cyrix MediaGX 166 machine. Took a while...



            • #26
              Originally posted by skeetre View Post
              I'm not an Nvidia fanboy by any means; I've used both AMD/ATI and Nvidia cards over the past 25 years, but I have a much better experience with Nvidia on Linux, and usually Windows too. I've yet to get an AMD card to work with my Linux From Scratch build, even when I tried purposely building it for AMD, and on Linux Mint I've had a better experience with Nvidia as well. I do remember having to calculate back porch and front porch settings for XF86Config, and I don't know whether that would have been any different with AMD, but it was a pain back in 2004 to set up a 24" 1920x1200 monitor with Nvidia and Red Hat; we were required to use Nvidia cards at work.

              I love my RTX 2080 with Linux and gaming on Linux.

              And unless AMD pulls a rabbit out of their hat, I'll probably be buying a 4080 card in a couple of years.

              I do support open source and help contribute where I can, both with my wallet and with feedback and an occasional bug report or patch submission, but if binary blobs give the performance with less headache, I don't really care that they're closed source. It seems like a good way to protect your ideas and product.
              Buy a Quadro, lol



              • #27
                Originally posted by rene View Post
                Nobody serious in Linux and Open Source touches this binary-only blob; they avoid it like it's Covid. And AMD's equal-performance Big Navi RDNA2 is just around the corner, soooo, .... yolo.
                Guess I'm not 'serious in Linux and Open Source', whatever that means...



                • #28
                  Originally posted by Bobby Bob View Post

                  Guess I'm not 'serious in Linux and Open Source', whatever that means...
                  That means you support the status quo of vendors hiding their stuff, hindering innovation, and neither encouraging nor contributing to an open source 3D and graphics stack. You are also at the mercy of Nvidia to develop and fix the driver, and you can never fix a bug or add a feature yourself. If you want to run binary-only Windows drivers, well, there is Windows for that... And last but not least, want to move to something more modern, like Haiku or RustOS? There is no binary-only Nvidia driver for you. And then Joe users like you complain to the devs: "Your OS sucks and is unusable, go support Nvidia GPUs first!" Sad.
                  Last edited by rene; 13 October 2020, 03:46 AM.



                  • #29
                    Originally posted by birdie View Post

                    Does not running binary blobs make you happier? Enrich you? Make the world a better place? This "I hate closed source software" line is just crap when you don't actively support open source, and most people here on Phoronix have done nothing for the movement aside from leaving salty self-righteous comments.
                    You truly did not understand open source. If you want to run binary-only stuff, there is Windows and macOS for you. "I hate closed source" is why people like Linus, and all the others like me, wrote this whole ecosystem up out of nothing over 30 years. And now you want to tell us we should accept undebuggable binary-only drivers of questionable security in our kernel? There is nothing self-righteous about wanting register-level specifications so you can write software and drivers for your own hardware. Getting away from undebuggable, buggy vendor drivers and OSes is exactly what open source, Linux & BSD are about. Funny how fanboy users repeatedly try to sugar-coat that to the actual freaking developers.



                    • #30
                      Originally posted by rene View Post

                      That means you support the status quo of vendors hiding their stuff, hindering innovation, and neither encouraging nor contributing to an open source 3D and graphics stack. You are also at the mercy of Nvidia to develop and fix the driver, and you can never fix a bug or add a feature yourself. If you want to run binary-only Windows drivers, well, there is Windows for that... And last but not least, want to move to something more modern, like Haiku or RustOS? There is no binary-only Nvidia driver for you. And then Joe users like you complain to the devs: "Your OS sucks and is unusable, go support Nvidia GPUs first!" Sad.
                      Yeah, I'm actually OK with expecting a GPU manufacturer to be the one responsible for developing the GPU driver and fixing bugs or adding features... They make the hardware; I expect them to make the drivers...

                      I think you should go outside more often if you're this worked up about an issue like this.

