Initial NVIDIA GeForce RTX 2080 Ti Linux Benchmarks

  • #51
    Originally posted by duby229 View Post

    Because performance isn't the be-all and end-all. Nvidia may have their driver configured for maximum FPS, but Nvidia FPS does not equal AMD FPS. All of AMD's drivers, and especially the OSS drivers, deliver superior visual quality. It seems to me that AMD's hardware and drivers were designed to deliver the best possible visual quality, while Nvidia's hardware and drivers were designed to trade visual quality for power efficiency and FPS. Personally I think AMD made the right choice. I didn't buy a high-end video card to get worse visual quality; I bought it to crank up the details and get better visual quality, even if it takes a bit more power to achieve. Neither Nvidia nor AMD has the right to lower quality for the sake of FPS.
    Stuff like this ^ is the reason why I say buy AMD, so we can at least know AMD's edge firsthand. Even on my PS4, I swear there is something to it I can't quite pin down. The closest thing I can think of is that people are missing even digging into digital vibrance for Nvidia, but I know there is more to it than that, because AMD seems to do things differently. Digital vibrance does help in the Nvidia driver, though. For anyone familiar with the old 3dfx stuff and separate drivers for games like Quake 2, this stuff is very real.
    Last edited by creative; 21 September 2018, 03:03 PM.

    Comment


    • #52
      Of course, while the PS4 is running on a form of BSD, its AMD hardware shows a very good example of what I described above in games such as Killzone Shadow Fall. The hardware is really low-end compared to this i7 and GTX 1070, yet it still shines despite its locked FPS and lower overall LOD.

      There is something to it, and it's not just the fact that it's an optimized console game.

      So yeah, I do feel like I totally have the right to say "buy AMD", because I have a gaming system that is completely AMD, and for what it does do? It does very, very well.
      Last edited by creative; 21 September 2018, 03:30 PM.

      Comment


      • #53
        AMD might not have won the PC gaming GPU competition; it's really debatable anyway. But as far as overall gaming is concerned globally, it has seriously kicked in Nvidia's teeth; actually, every tooth and piece of jaw is on the pavement.

        Honestly, it took a while to sink in for me, but now I have finally realized it.

        AMD = Intel and Nvidia, be afraid, be very afraid.

        That being said, I still game like a mad mofo on my i7/GTX 1070 rig.
        Last edited by creative; 21 September 2018, 03:45 PM.

        Comment


        • #54
          Originally posted by reavertm View Post
          Impressed? They are slow (especially Vega) compared to their Windows counterparts for some reason. Yes, they are open source, one redeeming quality.
          My R9 380 and 390 are both faster on Linux in the limited testing I've done. In my case, it's the game's inefficient use of the CPU that causes a performance hit compared to Windows.

          Comment


          • #55
            Originally posted by msotirov View Post
            Care to offer screenshots proving said differences in visual fidelity?
            Go on, take a look:



            https://www.youtube.com/watch?v=a2IIM9fncqc (http://i.imgur.com/vG2RNKo.jpg)

            https://www.youtube.com/watch?v=i9ZG7pA5INc&t=2m15s (carpet)

            Comment


            • #56
              Originally posted by RussianNeuroMancer View Post
              All I can see is slightly better anisotropic filtering of the textures on AMD. I wouldn't say that it's worth the much worse performance, like duby229 was suggesting. I'd always prefer stable FPS over better anisotropic filtering.

              Comment


              • #57
                Originally posted by msotirov View Post
                All I can see is slightly better anisotropic filtering of the textures on AMD. I wouldn't say that it's worth the much worse performance, like duby229 was suggesting. I'd always prefer stable FPS over better anisotropic filtering.
                Anisotropic filtering takes such a small amount of processing power that 2xAF vs 16xAF is more a matter of aesthetics than anything. Anisotropic filtering at its highest setting has almost no performance hit compared to plain trilinear filtering. What does have a performance hit, as far as graphics settings go, are things like AO, overall LOD, and heavier texture-mapping techniques like parallax occlusion mapping, cast shadow detail, and light mapping.
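                For context, anisotropic filtering is just a per-texture sampler parameter; on desktop OpenGL it is exposed through the EXT_texture_filter_anisotropic extension. A minimal sketch, assuming a live GL context and an already-bound texture (this fragment will not run standalone):

```c
/* Sketch: enable the maximum supported anisotropic filtering level
 * for the currently bound 2D texture. Assumes a valid GL context
 * and that EXT_texture_filter_anisotropic is available. */
#include <GL/gl.h>
#include <GL/glext.h>

static void enable_max_anisotropy(void)
{
    GLfloat max_aniso = 1.0f;  /* 1.0 == plain trilinear filtering */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    /* Typically 16.0 on desktop GPUs. The driver only takes the extra
     * samples on surfaces viewed at oblique angles, which is why the
     * cost over trilinear filtering is usually negligible. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}
```

                That one call is the entire cost knob being argued about here, which is why 2x vs 16x rarely shows up in benchmarks.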

                Comment


                • #58
                  Originally posted by creative View Post

                  Anisotropic filtering takes such a small amount of processing power that 2xAF vs 16xAF is more a matter of aesthetics than anything. Anisotropic filtering at its highest setting has almost no performance hit compared to plain trilinear filtering. What does have a performance hit, as far as graphics settings go, are things like AO, overall LOD, and heavier texture-mapping techniques like parallax occlusion mapping, cast shadow detail, and light mapping.
                  I know; I wasn't implying that AF is heavy. The guy I was replying to was suggesting that Nvidia has better performance because their drivers supposedly "cheat". These screenshots with the slightly shittier AF in Hitman on an Nvidia GPU are all the proof they provided.

                  Comment


                  • #59
                    Guest OK, I see.

                    Comment


                    • #60
                      Originally posted by msotirov View Post
                      I know; I wasn't implying that AF is heavy. The guy I was replying to was suggesting that Nvidia has better performance because their drivers supposedly "cheat". These screenshots with the slightly shittier AF in Hitman on an Nvidia GPU are all the proof they provided.
                      Sorry, I didn't have a chance to reply.

                      It's definitely true.

                      Interesting article. The GeforceFX's can get up to a 27% increase in scores in 3dmark03 with a whole list of discovered cheats: http://www.theinquirer.org/?article=9648 Also interesting: Ati's 9800 pro was also caught cheating in one test, although it made a less than 2% difference in the...

                      That was the first known case.

                      There seems to be a major problem with NV's default settings in the control panel. Under default, it's supposed to "Let Application Decide" on AF. Yet it overrides in-game settings to **** quality for texture filtering. It started with a guy on OCUK who benched Titan X & Fury X, people noticed...



                      Really, I could go on and on; there are literally hundreds. This is just the front page of one simple Google search you could have done yourself days ago. I'm not gonna front, AMD gets caught too.

                      Comment
