
NVIDIA GeForce GTX 1080 Ti On Linux: Best Linux Gaming Performance


  • #31
    Originally posted by andrei_me View Post
    Michael, what do you think about acquiring a Radeon R9 Fury X now? The price should be much more affordable by now, and it would be nice to have the higher-end version running on RadeonSI.
    Depends on the price you can get... I found my second Nano card recently for about $199 in a local store. If you can find something like that, then I would say it is worth it.

    Comment


    • #32
      Originally posted by Pawlerson View Post
      Nvidia is shit. I switched from a GeForce GTX 660 Ti to a Radeon RX 480 OC and the difference is huge. Not just the performance and the open source drivers, which is obvious, but everything is much smoother: Firefox on Linux and Windows, 2D acceleration on Linux, The Witcher 3 running at full HD and uber settings (there was terrible tearing at medium quality and 60 FPS with the GeForce). Keep up the good work, AMD, and my next CPU will be Ryzen (even though I'm a very happy owner of an i7).
      Yeah, I'm looking forward to the Raven Ridge APUs myself. Naples looks good too; if Naples gets it right (which it looks like it will), it will be a good revenue stream for AMD.

      Comment


      • #33
        Hi everyone. I'm new here and very curious about this process... The article says that NVIDIA gave the 1080 card to you guys.

        1. When they do this, what is their intention? Do they just want you to review it? Or are they expecting you to reverse engineer it and create open source drivers for it?

        2. You guys seem upset over this, so I suspect that AMD or Intel go out of their way to provide the FOSS community (you fine folks) with better tools or support, or even provide open source drivers themselves?

        3. Do you normally have to approach NVIDIA to get hardware to write drivers for? Or tools and documentation, for that matter?

        Comment


        • #34
          Originally posted by ElderSnake View Post
          How the heck do you quote posts on the mobile interface?
          There is a light-gray line under each post. When you tap below that line, at the right edge, you get a menu.

          Originally posted by ElderSnake View Post
          Anyway, yeah indepe, being able to use Wayland no doubt makes a difference. But even in Xorg sessions, the difference is still quite noticeable to me in desktop responsiveness, repainting stuff on the screen, etc., even in minimal WMs. I always found GNOME really sluggish and prone to memory leaks on proprietary NVIDIA, too.

          That said, that's just my experience. Yet I've seen other guys, like the LinuxGameCast guy Venn, run pure NVIDIA for years and never complain about screen tearing or anything, and that was even before the CompositionPipeline switch thingy. So I don't geddit. But my desktop experience with NVIDIA has always been subpar, although KWin used to run nicely circa KDE 4.6.
          Huh. (Hopefully) I'll soon be able to run AMD, NVIDIA and Intel, each on Wayland and Xorg... then I'll see...
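The "CompositionPipeline switch thingy" mentioned above refers to the proprietary NVIDIA driver's ForceFullCompositionPipeline metamode option, which routes output through the driver's internal compositor to eliminate tearing at the cost of a little latency. A minimal xorg.conf sketch of enabling it (the Identifier and mode string here are illustrative assumptions, not taken from the thread):

```
Section "Screen"
    Identifier "Screen0"
    # Hypothetical metamode: one display, auto-selected mode, at offset +0+0.
    # ForceFullCompositionPipeline=On forces tear-free output.
    Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
EndSection
```

The same option can also be toggled at runtime from the nvidia-settings GUI's advanced display settings, without editing xorg.conf.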

          Comment



            • #36
              Strong is the nVidia on Linux!

              Hard will be the nVidia fall when Vega dawns!

              :-)
              Last edited by hoohoo; 10 March 2017, 01:55 AM.

              Comment


              • #37
                Originally posted by torsionbar28 View Post

                I'm no fan of blobs, but I've not had many complaints using NVIDIA cards on Linux. That said, I agree that AMD on Linux is much smoother looking, with far less visible tearing, for both 2D and 3D. Intel is too, for that matter. I don't know what NVIDIA is doing wrong there, but the difference is very noticeable regardless of what vsync settings I use. Up-to-date distro, latest drivers, etc.; nothing seems to make a difference.

                The open source AMD driver has come a loooong way, and I'll be looking at AMD whenever I eventually replace my original GTX Titan card.
                Can I plug in my TV and get sound over HDMI? I'm looking forward to the day this happens so I can sell off all of my Nvidia hardware.

                Comment


                • #38
                  Originally posted by torsionbar28 View Post

                  I'm no fan of blobs, but I've not had many complaints using NVIDIA cards on Linux. That said, I agree that AMD on Linux is much smoother looking, with far less visible tearing, for both 2D and 3D. Intel is too, for that matter. I don't know what NVIDIA is doing wrong there, but the difference is very noticeable regardless of what vsync settings I use. Up-to-date distro, latest drivers, etc.; nothing seems to make a difference.

                  The open source AMD driver has come a loooong way, and I'll be looking at AMD whenever I eventually replace my original GTX Titan card.
                  Since I started using 3D accelerators (about 17 years ago), the only NVIDIA GPUs I was satisfied with were the GeForce 3 Ti and GeForce 4 MX. Since the FX debacle they have used different tactics. With the DX9-era GPUs it was the first well-known case of a vendor brutally lying to customers about a GPU's specifications: later, when ATI introduced the 9000 series with proper DX9 support and games started using it (such as Half-Life 2), it was more than obvious that NVIDIA had lied (a similarly priced and specced FX GPU struggled at 5 FPS while the ATI card was doing well over 30). After that, NVIDIA took the path of "software tricks" (e.g. in the drivers) to gain an edge over the competition, and a few more times they lied about specifications (not as badly as with the FX series, but still). The last NVIDIA GPUs I had were from the 9000 and 200 series. Everyone loved the 9800GT's "performance"; it was a nightmare for me: high FPS, lots of input lag, and later in its life lots of stutters. The reason I had problems with it was not the hardware (the 9800GT was a good GPU) but the software tricks NVIDIA used (and still uses) to get those high FPS numbers. In the end, FPS was never what mattered to me; what matters is, when I press a button, how long it takes for the action to happen. And for that reason, I haven't bought NVIDIA GPUs since 2008-09. I mean, I "upgraded" to a 9800GT after using a weak ATI 9550, and you can imagine how bad things were when that "upgrade" turned out to be a giant waste of money: lag I hadn't experienced in years came back, microstutter (not even close to as smooth as the old ATI), and so on.

                  My point is, some people might care about high FPS and not need fast response times (where "ATI" still dominates, at least it did a few years ago... I haven't been in the GPU market for a while, but if the pattern holds from ~2002, I doubt it has changed in the last few years), but I know I will never recommend or buy a green GPU again, even if the competition (AMD or Intel) offers less performance for the same money.

                  Comment


                  • #39
                    Things that can make a detectable difference in how Linux (at least Mint) interrupts video generation are: Cinnamon vs. MATE desktop environments, desktop special effects vs. no special effects, a generic kernel vs. a real-time kernel, and running parallel tasks such as BitTorrent. Similar effects may apply to other distros and DEs. In an earlier Mint release than I use now, a lot of processor power was being expended on X.Org and Cinnamon even when they weren't visibly doing anything.
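A quick way to check for the kind of idle CPU burn described above is to rank running processes by CPU usage. A minimal sketch (process names such as Xorg, cinnamon, or mate-* vary by distro and desktop; this simply ranks whatever is currently running):

```shell
# List the top ten CPU consumers; run this while the desktop is
# supposedly idle and look for Xorg or the desktop shell near the top.
ps -eo pcpu,pmem,comm --sort=-pcpu | head -n 10
```

Watching the same list over a few seconds (e.g. with `top`) gives a better picture than a single snapshot, since short spikes come and go.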

                    Comment


                    • #40
                      Hi Michael & y'all,

                      Thanks for testing the 1080 Ti. Amazing work by NVIDIA.

                      "NVIDIA GeForce GTX 1080 Ti On Linux: Best Linux Gaming Performance." And that is a fact. :-)

                      GreekGeek :-)

                      Comment
