NVIDIA GeForce vs. AMD Radeon Linux Gaming Performance At The Start Of 2018


  • #11
    Seems the RX 580 is the best-buy card atm, and Vega is much of a muchness. Here the Vega 64 costs $865 AUD while the 1080 Ti costs $1027 AUD; the choice is clear if you're willing to pay a little bit extra. The Vega 64 IMO should be a $600 AUD card, and should have a red warning label on the box telling people about the radioactive hazards of the card!



    • #12
      Originally posted by theriddick View Post
      Seems the RX 580 is the best-buy card atm, and Vega is much of a muchness. Here the Vega 64 costs $865 AUD while the 1080 Ti costs $1027 AUD; the choice is clear if you're willing to pay a little bit extra. The Vega 64 IMO should be a $600 AUD card, and should have a red warning label on the box telling people about the radioactive hazards of the card!
      I'm just curious about the "radioactive hazards". What do you mean?



      • #13
        Originally posted by duby229 View Post
        I'm just curious about the "radioactive hazards". What do you mean?
        I was curious too - don't remember seeing a bugzilla ticket for that.



        • #14
          Just a quick jab at the Vega 64's huge TDP draw; 300-350 W for 35% slower than a 250 W TDP 1080 Ti is not fun. Also, once a card gets into the 300 W TDP range, the heat output is quite high and will cause some issues for certain cases/locations.
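
          For anyone who wants to sanity-check that, here is a quick back-of-the-envelope perf-per-watt comparison in Python, using the rough figures from this post (~300 W and ~35% slower versus a 250 W 1080 Ti; these are the thread's estimates, not measured numbers):

          ```python
          # Back-of-the-envelope perf-per-watt using the rough figures quoted above
          # (Vega 64 at ~300 W board power, ~35% slower than a 250 W TDP 1080 Ti).
          # Illustrative only -- these are not benchmark results.

          def perf_per_watt(relative_perf: float, watts: float) -> float:
              """Relative performance (1.0 = baseline) divided by board power."""
              return relative_perf / watts

          gtx_1080_ti = perf_per_watt(1.00, 250)  # baseline card
          vega_64 = perf_per_watt(0.65, 300)      # ~35% slower at ~300 W

          print(f"GTX 1080 Ti: {gtx_1080_ti:.4f} perf/W")
          print(f"Vega 64:     {vega_64:.4f} perf/W")
          print(f"Vega 64 efficiency relative to the 1080 Ti: {vega_64 / gtx_1080_ti:.0%}")
          ```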



          • #15
            Originally posted by theriddick View Post
            Just a quick jab at the Vega 64's huge TDP draw; 300-350 W for 35% slower than a 250 W TDP 1080 Ti is not fun. Also, once a card gets into the 300 W TDP range, the heat output is quite high and will cause some issues for certain cases/locations.
            The same thing used to be true about 25 W CPUs too, but heatsinks evolved. I'm certain that's what Vega is currently going through. You can expect even higher wattage in the future; it's what's gonna happen. And heatsinks are going to evolve, it's inevitable. These blower heatsinks aren't gonna cut it anymore. Closed-loop water is probably the most economical solution that already has some commercial precedent. And believe me, sooner or later nVidia will have to make the same move.

            EDIT: A big part of the heat comes from electrons leaking through a transistor's gate, and the hotter the gate gets, the more they leak. Closed-loop water is gonna have to happen. Removing heat fast enough actually causes the chip to produce less heat.
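
            A toy model of that feedback loop, assuming the common rule of thumb that leakage power roughly doubles for every ~12 C of die temperature (the numbers are illustrative assumptions, not Vega measurements):

            ```python
            # Toy model: assume leakage power doubles roughly every ~12 C of die
            # temperature (a common rule of thumb, not a measured Vega figure), so a
            # cooler that holds the die at a lower temperature also cuts total power.

            def leakage_power(base_watts: float, base_temp_c: float, temp_c: float,
                              doubling_interval_c: float = 12.0) -> float:
                """Estimated leakage power at temp_c, given base_watts at base_temp_c."""
                return base_watts * 2 ** ((temp_c - base_temp_c) / doubling_interval_c)

            # Hypothetical card leaking 40 W at 85 C under a blower cooler.
            for temp_c in (85, 70, 55):
                print(f"{temp_c} C -> {leakage_power(40, 85, temp_c):5.1f} W of leakage")
            ```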

            EDIT: Those crystal heatsinks are badass too and probably have a place in the future as well. So I'm thinking a crystal heatsink inside a closed-loop water cooler, perhaps even sitting on a Peltier pad. That's probably gonna be the standard GPU cooler sometime soon.
            Last edited by duby229; 10 January 2018, 11:02 PM.



            • #16
              It is a really interesting benchmark. Right now Vega cards really stand where they should against the nvidia side, especially on Feral's heavier ports. Older titles might still need a bit more tweaking, but really nice stuff. I'm looking forward to saving enough money to switch to a full AMD rig, with Vega and the open-source drivers. Thanks Michael once again for keeping all the benchmarks up to date. I was full of hope for 2017, and it has been a great year for gaming on our platform, but at the end of the year I was a bit disappointed by the number of Linux users, despite all the progress made over the last three years; on top of that, Vega didn't show much evolution in your benchmark figures. This is really a nice way to start 2018, and I once again have better hopes for our beloved platform.

              You are really a cornerstone of our community, Michael, and your good work on Meltdown and the other CPU vulnerabilities has even been quoted by hardware.fr, which is regarded as the state of the art of hardware journalism for its rigor.

              Big thanks to you Michael!



              • #17
                Originally posted by gsedej View Post
                AMD FineWine (R) RX 580 constantly beating 1060. The "almighty nvidia linux driver" is not so strong now, eh?

                But there might still be some "choppiness" problems in the AMD drivers, like the "EGEE" findings here: https://www.youtube.com/watch?v=QlRyRkmQv70
                That video is pure bullshit/garbage and so is that "EGEE". I'll just copy some of the points a user (Samsai) made there:

                "This video is FUD. Sorry to call you out but that's how it is. Basically what you've done here is make a whole bunch of ill-informed statements that only work to perpetuate the decade-old "AMD GPUs suck on Linux" meme.
                1. No mention of Mesa and kernel versions. For all we know you could be using Mesa 13.x and fricking kernel 3.6. You need to keep these up to date if you intend to play. You also failed to mention whether you used AMDGPU (lower performance on GCN 1.0 hardware than radeon). Hell, you don't even mention what CPU you are using. I guess I could try to extract that information by scanning through your channel but it's common courtesy to mention these things when benchmarking.
                2. Your examples, which include War Thunder, Tomb Raider and Shadow of Mordor are inherently flawed and are not broad enough to make claims like "AAA games do not support AMD GPUs". How about you look at Feral's new products and check out how many of those support AMD GPUs either on launch or post-launch? Not to mention Feral actively fixes AMD issues and even submits patches to the god-damn drivers to make their games run. They also get people to test their stuff on AMD rigs before launch which wouldn't make sense if they never intended to support those GPUs.
                3. You are extrapolating data from a 6 year old GPU with 1GB of RAM. Even pitting it against a 730 isn't going to make it a fair comparison, particularly if your 730 is a 2GB model. None of which you disclosed in the video, might I add. Basically your point boils down to "there was bad performance therefore be scared". FX CPUs had bad performance therefore Ryzen's performance also sucks. Nvidia's Fermi GPUs burned hot, therefore Pascals also burn hot. Be very afraid.
                4. Your video entirely fails to take into account the convenience of keeping open source drivers up to date. Distro upgrades won't break them unless something is really wrong, and provided you use a rolling distro with sensible software versions, you don't have to mess around with third-party repositories. These drivers also have the potential benefit of extended support. FGLRX died a while back but even the R600G driver keeps getting updates.
                5. While Wayland is no clear advantage yet and it's uncertain what direction it will go, it's a thing on AMD GPUs along with XWayland support. It is not a thing on Nvidia. Also, Wine people get Gallium Nine with AMD GPUs. Nvidia users don't. Also, the general 2D desktop performance on open source drivers is and has been superior for a good while. Even an Intel iGPU beats an Nvidia dGPU in pure desktop tasks.
                So good job. Now we have newbies who will never go into Linux gaming because they are too scared due to owning an AMD GPU and people who will just keep on buying Nvidia till the day they die and support the ever-more tilting GPU monopoly which will bless us with shitty products at shitty price-points using GPU drivers that will be forever broken and will drop support for old products whenever the corporation feels they need to extract just that little extra money out of their sheep."

                I don't think the point he made about AMDGPU vs radeon is true nowadays. At least that's what some users have said.
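
                On Samsai's first point, for anyone posting their own numbers, here is a minimal sketch of how to grab the kernel and Mesa info first (it assumes glxinfo from the mesa-utils package is installed):

                ```python
                # Minimal sketch: print the kernel and Mesa/OpenGL driver info that should
                # accompany any benchmark numbers. Assumes 'glxinfo' (mesa-utils) is present.
                import platform
                import subprocess

                def report_driver_versions() -> None:
                    print("Kernel:", platform.release())
                    try:
                        out = subprocess.run(["glxinfo"], capture_output=True,
                                             text=True, check=True).stdout
                    except (FileNotFoundError, subprocess.CalledProcessError):
                        print("glxinfo not found; install mesa-utils to query the Mesa version")
                        return
                    for line in out.splitlines():
                        if "OpenGL renderer string" in line or "OpenGL version string" in line:
                            print(line.strip())

                if __name__ == "__main__":
                    report_driver_versions()
                ```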



                • #18
                  Originally posted by VikingGe View Post
                  I think if it was that bad in reality, people would have noticed and complained about it already. That guy is just spreading nonsense. "AAA games don't even support the damn cards" when a game from 2015 - a Feral port of all things - pops a message? Must have been living under a rock the past two and a half years (and probably hasn't updated his drivers in the meantime). Tomb Raider runs like shit? Yes. Does he test it on an Nvidia card just to find out that it runs equally shit on those as well? No, of course not, must be the AMD drivers.

                  Where are the valid complaints about the Vulkan driver mess, Vega not having had out-of-the-box display support for half a year, VCE being next to unusable, the lack of a GUI configuration utility, having to keep up with the newest driver versions, etc.? Nowhere to be found. Where does he tell us which distro and driver version he's using? Well, he doesn't.

                  That entire video is just wrong on so many levels.
                  It's way worse. He used a shit-tier card (with only 1GB of VRAM, might I add) from 2012... SIX YEARS AGO! That card doesn't even meet the MINIMUM requirements for Tomb Raider, and he put the settings on medium. Oh, and he used an FX 6300, I think.

                  Nonsense doesn't even begin to describe such a shit video. I'm ashamed of ever having subscribed to that idiot.



                  • #19
                    @Michael: You were a little bit too 'fast' with the Vulkan numbers =>
                    [Mesa-dev] [PATCH] radv: don't emit unneeded vertex state.

                    https://lists.freedesktop.org/archiv...ry/181677.html

                    And Dave has more new stuff handy:
                    [Mesa-dev] [PATCH 2/2] radv: inline push constants where possible.

                    https://lists.freedesktop.org/archiv...ry/181675.html

                    [Mesa-dev] [PATCH] radv/winsys: replace bo list searchs with a hash table.

                    https://lists.freedesktop.org/archiv...ry/181683.html

                    Last edited by nuetzel; 11 January 2018, 12:44 AM.



                    • #20
                      Originally posted by theriddick View Post
                      Seems the RX 580 is the best-buy card atm, and Vega is much of a muchness. Here the Vega 64 costs $865 AUD while the 1080 Ti costs $1027 AUD; the choice is clear if you're willing to pay a little bit extra. The Vega 64 IMO should be a $600 AUD card, and should have a red warning label on the box telling people about the radioactive hazards of the card!
                      Was thinking the same!

                      Though I'd like to switch parties immediately, I'm still worried that some games might not run on AMD because of a lack of OpenGL standards compliance and/or forced incompatibility compared to nVidia...

                      Also I'll wait for a Gen bump because money ;-)

