NVIDIA R565 vs. Linux 6.13 + Mesa 25.0 Git AMD/Intel Graphics For Linux Gaming

  • skeevy420
    Senior Member
    • May 2017
    • 8635

    #31
    Originally posted by Quackdoc View Post

    The large issue is how bad modern games are for automated benchmarking. Even if you get MangoHud or DXVK HUD or whatever to dump frametimes, many games you cannot just automate with inputs because:

    A) The games are non-deterministic (RNG is a bitch for automated benchmarking)
    B) They have a myriad of load-time issues
    C) Input emulation can be hard, as it may need uinput for a generic solution

    etc.

    Some programs may be "TAS friendly" and those could be benchmarked, but finding them is hard.
    Yep. Games like Elden Ring require a person to actually play the game to get those numbers. With the right setup, someone playing the games could be an interesting segment on a Phoronix stream. Granted, 30 minutes over 11 GPUs is 5.5 hours of playing the same damn section in Elden Ring over and over again. Physically testing 6 games and then reporting on that is literally a full-time job.

    What sucks is "the right setup" is like $30K in hardware alone and that's one PC with 11 GPU swaps.

    It'd be cool if someone like Bridgman or agd5f could talk someone at AMD into hooking Michael up with eleven 7800X3Ds. That takes it all from "only" $30K to $100K in PC hardware. Then there's needing in-line graphics capture so you're not skewing benchmarks with video capture software, you'd need a 12th PC for doing actual work, a 13th PC to do video rendering and whatnot so the 12th PC can still be used for other work... then there's all the other PCs and servers that have to be powered for all the other Phoronix testing.

    And to do all that would require some electrician work because you don't run that many PCs without upgrading your power grid. Wiring, breakers and boxes, power conditioners, etc.

    Speaking of one and multiple PCs, I've often wondered how much caching plays a role in benchmark results. Not just here, everyone's results. A lot of benchmarkers use the same PC and swap GPUs between tests. That makes me wonder: how is the caching handled? Does the last tested GPU have an easier time due to more cache built up? Is the first GPU at a disadvantage? Is the cache wiped between runs and, if yes, should the tests be run at least three times with the first two results discarded since they're cache-training runs? Like they're the gaming equivalent of a PGO build/test.
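
    To make that concrete, here's a minimal sketch of a cache-wiping step between runs, assuming the usual per-user shader cache locations. The paths vary by driver and distro, and this isn't necessarily how any benchmark suite actually handles it:

    Code:
    #!/usr/bin/env python3
    # Sketch: wipe common per-user shader cache directories between
    # benchmark runs so each GPU starts "cold". Paths are typical
    # defaults, not guaranteed on every driver/distro combination.
    import shutil
    from pathlib import Path

    CACHE_DIRS = [
        Path.home() / ".cache" / "mesa_shader_cache",   # Mesa (RADV/ANV/radeonsi)
        Path.home() / ".cache" / "nvidia" / "GLCache",  # NVIDIA proprietary
    ]

    for d in CACHE_DIRS:
        if d.is_dir():
            shutil.rmtree(d, ignore_errors=True)
            print(f"cleared {d}")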

    There's a lot of shit that's necessary for professional, full-time benchmarking: gaming, general software, and operating systems. Probably more than most people realize or consider.

    Michael, just wanted to say thanks and let you know that I appreciate all the work you put into everything. It's amazing how much content you put out.


    • skeevy420
      Senior Member
      • May 2017
      • 8635

      #32
      Originally posted by ET3D View Post

      I understood this, and I think it was clear from my message. However, this doesn't invalidate the point that these benchmarks aren't really useful for gamers. The question is whether there's any real point in running these benchmarks. In some sense, sure, always nice to see some numbers. But I can't tell if they reflect actual gaming scenarios in any way, which makes them academic at best.
      They do and they don't. They don't, because almost every PC gamer uses mixed settings tailored to their GPU, monitor, and visual preferences like DOF and bloom. They do, because they give you a baseline to start your settings tinkering from and comparisons between GPU models in the fairest way possible.


      • ET3D
        Phoronix Member
        • Jun 2020
        • 73

        #33
        @skeevy420, you missed the point, but it did lead me to an idea.

        It would be interesting to have the Windows versions of these games tested with Proton alongside the native Linux versions. Hopefully this will be as easy to automate as the current benchmarks. While this would still limit testing to a subset of games which isn't representative of modern gaming, and therefore won't really solve the problem of these benchmarks not being that helpful to gamers, it would still provide more interesting info.

        By the way, there are a number of games with built-in benchmarks, but I don't know how Michael automates the tests and therefore whether these benchmarks can fit into them. For example, Black Myth: Wukong.


        • Michael
          Phoronix
          • Jun 2006
          • 14308

          #34
          Originally posted by ET3D View Post
          @skeevy420, you missed the point, but it did lead me to an idea.

          It would be interesting to have the Windows versions of these games tested with Proton alongside the native Linux versions. Hopefully this will be as easy to automate as the current benchmarks. While this would still limit testing to a subset of games which isn't representative of modern gaming, and therefore won't really solve the problem of these benchmarks not being that helpful to gamers, it would still provide more interesting info.

          By the way, there are a number of games with built-in benchmarks, but I don't know how Michael automates the tests and therefore whether these benchmarks can fit into them. For example, Black Myth: Wukong.
          For games like Wukong, AFAIK there are only GUI knobs for triggering it and no way from the command line... I'm all ears for games that can be fully automated for benchmarking. I routinely check out new games for benchmarks, but usually they fall short, with AutoHotkey and similar really not being reliable, especially on Linux.
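
          For what it's worth, one generic route around toolkit-level automation is injecting input at the kernel evdev layer, which behaves the same under X11 and Wayland. A minimal sketch using python-evdev (needs access to /dev/uinput; the F9 key is an arbitrary placeholder, not any particular game's benchmark hotkey):

          Code:
          #!/usr/bin/env python3
          # Sketch: press a key via a virtual kernel input device
          # (python-evdev). Needs read/write access to /dev/uinput,
          # usually root or a udev rule.
          import time
          from evdev import UInput, ecodes as e

          ui = UInput({e.EV_KEY: [e.KEY_F9]}, name="benchmark-input")
          time.sleep(1)                      # let the compositor register the device
          ui.write(e.EV_KEY, e.KEY_F9, 1)    # key down
          ui.write(e.EV_KEY, e.KEY_F9, 0)    # key up
          ui.syn()                           # flush the event batch
          ui.close()
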
          Michael Larabel
          https://www.michaellarabel.com/


          • Leopard
            Senior Member
            • Oct 2017
            • 348

            #35
            Originally posted by pyrex777 View Post
            You and Leopard, are you being serious right now? Just because Michael didn't have a 7900 XTX at hand doesn't make this test unfair. All you have to do is literally compare GPUs at the same price: 7800 XT vs. 4070 (which, VRAM aside, is still faster while consuming 50 W less), or maybe the 4070 Super. Why are you looking at the 4070 Ti Super? You're trying to pass yourself off as objective when your bias seems to be clear as water. This is not a marketing slide from Nvidia, this is a test from Michael for Linux users who in theory should know better than the average consumer. He probably never even thought this would be used as an argument against the test.

            RDNA3 has been pretty bad, there is no way around it. AMD on the software side is stuck in 2017, while Nvidia has lossless magic that adds +20-30% for free and RT that's several times faster than AMD's.
            Even Intel in the span of one generation has caught up to DLSS with XeSS 2 and XeLL. If you've bought RDNA3 because of better Linux compat or a good deal, that's a valid reason, but strictly gaming-wise it's not close. Plus, they are solid cards for casual compute work thanks to CUDA.
            Yes, I'm being serious.

            I've gone ahead and run his PTS suite (7900 XTX).

            [Embedded OpenBenchmarking.org result link]


            This is the result.

            [Embedded OpenBenchmarking.org result link]



            So that 289 FPS result for the 7800 XT doesn't make sense at all; despite the 7900 XTX being a bigger GPU, the disparity cannot be like that.

            His results are off the charts, completely weird.
            Last edited by Leopard; 10 December 2024, 02:10 PM.


            • Quackdoc
              Senior Member
              • Oct 2020
              • 5063

              #36
              Originally posted by skeevy420 View Post
              It'd be cool if someone like Bridgman or agd5f could talk someone at AMD into hooking Michael up with eleven 7800X3Ds. That takes it all from "only" $30K to $100K in PC hardware. Then there's needing in-line graphics capture so you're not skewing benchmarks with video capture software, you'd need a 12th PC for doing actual work, a 13th PC to do video rendering and whatnot so the 12th PC can still be used for other work... then there's all the other PCs and servers that have to be powered for all the other Phoronix testing.
              That would be one clunky setup lol

              Originally posted by Michael View Post
              For games like Wukong, AFAIK there are only GUI knobs for triggering it and no way from the command line... I'm all ears for games that can be fully automated for benchmarking. I routinely check out new games for benchmarks, but usually they fall short, with AutoHotkey and similar really not being reliable, especially on Linux.
              I'm not sure how sophisticated OpenBenchmarking is, but GStreamer's templatematch element (via the OpenCV plugins) may work for automatic GUI presses. templatematch broadcasts messages over the bus, so it probably wouldn't be too hard to wire up. I actually had a POC somewhere in the past. I'll see if I can dig it up; should I just make an issue ticket on OpenBenchmarking if I find it?
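
              For reference, the underlying technique, stripped of the GStreamer plumbing, is plain OpenCV template matching: search a captured frame for a cropped image of the button and report where to click. A rough sketch (the filenames and the 0.9 confidence threshold are made-up placeholders):

              Code:
              #!/usr/bin/env python3
              # Sketch: locate a GUI button in a screenshot with OpenCV
              # template matching, the same technique GStreamer's
              # templatematch element uses.
              import cv2

              screenshot = cv2.imread("frame.png")        # current game frame
              template = cv2.imread("start_button.png")   # cropped image of the button

              result = cv2.matchTemplate(screenshot, template, cv2.TM_CCOEFF_NORMED)
              _, max_val, _, max_loc = cv2.minMaxLoc(result)

              if max_val > 0.9:                           # confidence threshold
                  h, w = template.shape[:2]
                  cx, cy = max_loc[0] + w // 2, max_loc[1] + h // 2
                  print(f"button centre at ({cx}, {cy}), confidence {max_val:.2f}")
              else:
                  print("button not on screen")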


              • Michael
                Phoronix
                • Jun 2006
                • 14308

                #37
                Originally posted by Quackdoc View Post

                I'm not sure how sophisticated OpenBenchmarking is, but GStreamer's templatematch element (via the OpenCV plugins) may work for automatic GUI presses. templatematch broadcasts messages over the bus, so it probably wouldn't be too hard to wire up. I actually had a POC somewhere in the past. I'll see if I can dig it up; should I just make an issue ticket on OpenBenchmarking if I find it?
                If you want to email me at michael at phoronix.com, I will check it out, but yeah, I'm not familiar with GStreamer templatematch. There would also be issues with some games only outputting the benchmark results graphically, among other headaches.
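
                One conceivable (untested) answer to the graphical-output problem is OCR on the results screen, for instance with pytesseract; in this sketch the crop coordinates are pure placeholders:

                Code:
                #!/usr/bin/env python3
                # Sketch: read a score that a game only displays on screen by
                # cropping the results region and running it through Tesseract OCR.
                import cv2
                import pytesseract

                frame = cv2.imread("end_screen.png")
                roi = frame[100:160, 400:700]                 # score region (placeholder coords)
                gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
                text = pytesseract.image_to_string(gray, config="--psm 7 digits")
                print("parsed result:", text.strip())
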
                Michael Larabel
                https://www.michaellarabel.com/


                • pyrex777
                  Junior Member
                  • Jun 2024
                  • 12

                  #38
                  Originally posted by Leopard View Post

                  Yes, I'm being serious.

                  I've gone ahead and run his PTS suite (7900 XTX).

                  [Embedded OpenBenchmarking.org result link]


                  This is the result.

                  [Embedded OpenBenchmarking.org result link]



                  So that 289 FPS result for the 7800 XT doesn't make sense at all; despite the 7900 XTX being a bigger GPU, the disparity cannot be like that.

                  His results are off the charts, completely weird.
                  The difference is ~51%. While TechPowerUp is unreliable for most compute scores since they just apply the same architectural multiplier, FP32 is OK, and it says 61 TFLOPS vs. 37, so the 7900 XTX is ~65% more powerful on paper.
                  And in their relative benchmark table ("Performance Summary" at 1920x1080, 4K for 2080 Ti and faster) the 7900 XTX is listed as 51% better. This checks out almost perfectly.
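
                  Spelling out the arithmetic with the figures cited above:

                  Code:
                  # Sanity check of the FP32 figures cited above.
                  xtx_tflops, xt_tflops = 61.0, 37.0
                  advantage = xtx_tflops / xt_tflops - 1
                  print(f"paper advantage: {advantage:.0%}")   # -> 65%, vs. ~51% in gaming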

                  Also, considering Michael didn't use a 7800X3D but rather ARL, which at best matches RPL, I really don't see how these values are weird.

                  This is me ignoring your sidestep to this topic, when you clearly said stuff I can only classify as obtuse, such as "One would think Michael would responsibly test equal performers on both vendors but for some reason he wanted to shine NV here i guess lol" and "See, that is how marketing machine works.

                  That result is expected, he does purposefully picked 4070 Ti Super vs 7800 XT.

                  7800 XT is a 500 dollar gpu, 4070 Ti Super is a 800 dollar gpu..."

                  I'd suggest you confine your extreme bias to places like r/AyyMD.


                  • wertigon
                    Senior Member
                    • Jan 2020
                    • 301

                    #39
                    Michael, is the 4060 tested the 8GB or 16GB version? I am assuming 16GB for as fair a test as possible, but it's hard to be sure.
                    Last edited by wertigon; 10 December 2024, 07:39 PM.


                    • pyrex777
                      Junior Member
                      • Jun 2024
                      • 12

                      #40
                      Originally posted by wertigon View Post
                      Michael, is the 4060 tested the 8GB or 16GB version? I am assuming 16GB for as fair a test as possible, but it's hard to be sure.
                      The 4060 is only available with 8GB, mate; it's the 4060 Ti that comes in a 16GB version. But most of the games benchmarked aren't VRAM hungry, so it really shouldn't matter when looking at the 1440p scores.
                      Plus, this card competes with the 8GB 7600, and considering they both cost what they do thanks to VRAM cuts, I don't really see why you'd consider it unfair. Like, that's their characteristic; the point of the test wasn't to measure the performance of AD107, if that makes sense.
                      Last edited by pyrex777; 11 December 2024, 05:21 AM.

