The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming


  • The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming

    Phoronix: The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming

    Given the AMDGPU changes building up in DRM-Next to premiere with Linux 4.21, on top of the AMDGPU performance boost already seen with Linux 4.20, here are some benchmarks of Linux 4.19 vs. 4.20 Git vs. DRM-Next (Linux 4.21 material) with the Radeon RX Vega 64, compared against the relevant NVIDIA GeForce competition.

    http://www.phoronix.com/vr.php?view=27179

  • #2
    Typo:

    Originally posted by phoronix View Post
    GTX 1070 now bto being faster than



    • #3
      Interesting, so now the Vega 64 sits between the 1070 Ti and the 1080 (but closer to the former than the latter).
      ## VGA ##
      AMD: X1950XTX, HD3870, HD5870
      Intel: GMA45, HD3000 (Core i5 2500K)



      • #4
        The Padoka PPA ships an old xserver-xorg-video-amdgpu driver; it does not have the VRR patches. Use the Oibaf PPA instead, which has all the bells and whistles and is updated daily. One way to check which PPA your installed driver actually comes from is sketched below.
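        A minimal sketch (not from the thread) of how you could verify which PPA provides the driver on an apt-based Ubuntu system; the "oibaf" and "paulo-miguel-dias" (Padoka) origin strings are assumptions and may differ on your setup.

        ```python
        # Query `apt-cache policy` for the xserver-xorg-video-amdgpu package
        # and report which PPA the candidate/installed version comes from.
        import subprocess

        def apt_policy(package: str = "xserver-xorg-video-amdgpu") -> str:
            """Return the `apt-cache policy` output for the given package."""
            result = subprocess.run(
                ["apt-cache", "policy", package],
                capture_output=True,
                text=True,
                check=True,
            )
            return result.stdout

        if __name__ == "__main__":
            policy = apt_policy()
            print(policy)
            # Assumed origin strings: "oibaf" for the Oibaf PPA and
            # "paulo-miguel-dias" for the Padoka PPA; adjust as needed.
            for line in policy.splitlines():
                if "oibaf" in line.lower() or "paulo-miguel-dias" in line.lower():
                    print("PPA origin line:", line.strip())
        ```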



        • #5
          I've read somewhere that Nvidia's performance is often inflated by reducing the quality of colors and textures. I.e., while AMD renders everything as prescribed, Nvidia cheats and skips computationally intense parts, which produces a worse image but makes it look like the card is performing better. Is that correct?



          • #6
            I wish AMD would make an insane compute-unit GPU. I realize HBM2 is expensive as hell, but when you crack open a Vega card, there is enough space to solder a Pi motherboard onto it. They have the real estate. They just need to kick it up a notch.

            And with 7nm, there will be even more empty no-man's-land on top of what we see today. Their Radeon division really needs a proper roshambo, throat-grabbing, shin-kicking BFG card. From what I'm hearing, that isn't to be. But for once, just once, I would like to see them beat Nvidia in the performance metrics. F the cost.



            • #7
              Originally posted by ThoreauHD View Post
              Their Radeon division really needs a proper roshambo, throat-grabbing, shin-kicking BFG card. From what I'm hearing, that isn't to be. But for once, just once, I would like to see them beat Nvidia in the performance metrics. F the cost.
              Major performance gains are expected only in their post-Navi architecture, which has been dubbed "super-SIMD". It would be interesting to see how that works out.

              See https://forum.level1techs.com/t/supe...acement/132791



              • #8
                Originally posted by shmerl View Post
                I've read somewhere, that Nvidia's performance is often faked by their reduced quality of colors and textures. I.e. while AMD render everything as prescribed, Nvidia cheat and skip computationally intense parts, which produces worse image, but looks like it's performing better. Is that correct?
                That is a Windows thing from 15 years ago. Nvidia’s first Direct3D 9 graphics card was the GeForce FX 5800 Ultra. They made it by bolting Direct3D 9 onto their GeForce 4 architecture in a way that was not performant, presumably under the assumption that it would never matter. When this made their hardware perform terribly, well before the GeForce 6 series was ready, they resorted to cheating to improve the performance of Direct3D games. As far as I know, the practice died out due to backlash. I have never heard of them doing this in their Linux drivers, or in OpenGL games for that matter.

                For what it is worth, ATI has been caught cheating on numerous occasions:

                https://forums.anandtech.com/threads...eating.638328/
                https://forums.anandtech.com/threads...-2003.1087045/
                http://www.tomshardware.com/forum/75...2001se-figures
                http://www.tomshardware.com/forum/80...aught-cheating
                http://www.tomshardware.com/forum/298524-33-cheating-benchmarks-degrading-game-quality-nvidia

                Anyway, all of this nonsense is Windows specific. It was never a factor on Linux as far as I know. If anything, the lack of cheating on Linux is yet another reason why performance has always been lower on Wine than on Windows.
                Last edited by ryao; 12-02-2018, 03:38 PM.



                • #9
                  Originally posted by ryao View Post

                  They did it more than 15 years ago to compensate for the architectural deficiencies that kept the GeForce FX 5800 Ultra from performing well in Direct3D 9 games. As far as I know, the practice died out due to backlash. I have never heard of them doing this in their Linux drivers.
                  Of course you have not heard; they would not tell you when they are cheating. Test with a GT8400, which works with both the open and closed source drivers; maybe you could see the difference and the cheating. I did see amdgpu-pro cheating at rendering with an A8-7600 in Tomb Raider 2013 (under wine-staging), and amdgpu-pro was faster than Mesa at the time.



                  • #10
                    Originally posted by ryao View Post

                    That is a Windows thing from 15 years ago. Nvidia’s first Direct3D 9 graphics card was the GeForce FX 5800 Ultra. They made it by bolting Direct3D 9 onto their GeForce 4 architecture in a way that was not performant, presumably under the assumption that it would never matter. When this made their hardware perform terribly, well before the GeForce 6 series was ready, they resorted to cheating to improve the performance of Direct3D games. As far as I know, the practice died out due to backlash. I have never heard of them doing this in their Linux drivers or in OpenGL games for that matter.

                    For what it is worth, I recall that ATI was caught cheating after their positions reversed when the GeForce 6 series was out. If you were to ignore the cheating and pick which of them renders things properly most often, that would be Nvidia. Their drivers have always been higher quality than ATI’s drivers.
                    Did anyone perform tests recently based on image-quality comparison rather than raw framerate? The Nvidia blob is shared between Windows and Linux, and such cheating could be one of the reasons they staunchly refuse to open up their drivers. A minimal way to script such a comparison is sketched below.
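                    A minimal sketch (not from the thread) of how an image-quality check could be scripted, assuming two screenshots of the same frame captured on each driver stack; the file names are hypothetical, and a perceptual metric such as SSIM would be a better choice than plain RMSE.

                    ```python
                    # Compare two screenshots pixel-for-pixel with a simple RMSE metric
                    # (requires Pillow and NumPy); 0.0 means the captures are identical.
                    import numpy as np
                    from PIL import Image

                    def rmse(path_a: str, path_b: str) -> float:
                        """Root-mean-square error between two same-sized RGB images."""
                        a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
                        b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
                        if a.shape != b.shape:
                            raise ValueError("screenshots must have identical dimensions")
                        return float(np.sqrt(np.mean((a - b) ** 2)))

                    if __name__ == "__main__":
                        # Hypothetical file names: the same frame captured on each driver stack.
                        print(f"RMSE: {rmse('frame_mesa.png', 'frame_nvidia.png'):.2f}")
                    ```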

