NVIDIA GeForce vs. AMD Radeon Linux Gaming Performance At The Start Of 2018


  • #21
    Originally posted by duby229

    The same thing used to be true about 25 W CPUs too, but heatsinks evolved....
    A room/case/climate can only take so much heat. Here in South Australia we see temperatures as high as 47°C; while on those days I tend to run an air conditioner, it's not economical to keep the air-con running all the time due to the extremely high price of power here.

    Also, that heat needs to go somewhere, such as your room; heatsinks are not a solution forever. And let's not forget GeForce GPUs used to eat up 300 W TDP with the 680 series if I remember correctly (or was it the 580s), so while some increases in temperatures can happen, overall we have been seeing reduced TDP (or watt usage) in CPUs and GPUs, NOT increased. Your idea of never-ending increases in TDP is hilarious, thanks for the laugh.

    The 14nm Vega is just a bit of a bellyflop for AMD because they had to push a mobile chip so hard to get it even remotely competitive with the 1080 card, but once they move Vega to 7nm, and IF they release a consumer version, it should be a much more acceptable product (I expect at least a 100 W TDP drop).

    PS. Lower TDP = more stable performance; the undervolting of Vega's core voltages showed this to be the case. IMO 300 W TDP is pushing the limit of what heatsink tech can do, and how much heat a typical case can realistically handle (on top of the other components). No magic nano heatsink from the future will change that: 300+ watts MUST go somewhere, it doesn't just vanish into the heatsink!
    Last edited by theriddick; 11 January 2018, 04:05 AM.



    • #22
      About Egee: I watched the video just out of curiosity, and honestly I can't really judge his other videos because I had never seen any of them before.
      But it comes across almost like an enraged teen fanboying against AMD and defending NVIDIA, with a very, very silly tone and no proof.

      Keep in mind that I'm a Windows user and I'm not supposed to be here, but I like to keep track of the benchmarks. I also have a 7850 2GB, and if for some reason I wanted to play on Linux with a non-rolling distro and a GCN 1.0 card, what would be the best steps to take so I could get the best performance?
      From what I can see, GCN 1.0 is not supported by RadeonSI, right? RadeonSI looks excellent by now.



      • #23
        Originally posted by ObscureAngelPT
        About Egee: I watched the video just out of curiosity, and honestly I can't really judge his other videos because I had never seen any of them before.
        But it comes across almost like an enraged teen fanboying against AMD and defending NVIDIA, with a very, very silly tone and no proof.

        Keep in mind that I'm a Windows user and I'm not supposed to be here, but I like to keep track of the benchmarks. I also have a 7850 2GB, and if for some reason I wanted to play on Linux with a non-rolling distro and a GCN 1.0 card, what would be the best steps to take so I could get the best performance?
        From what I can see, GCN 1.0 is not supported by RadeonSI, right? RadeonSI looks excellent by now.
        All GCN products are supported by the RadeonSI OpenGL implementation (SI stands for Southern Islands, aka GCN 1.0). By default it uses the radeon kernel driver, but there is also experimental support for the amdgpu kernel driver; you have to boot with some extra kernel parameters (see the example below).
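
        As a rough sketch of what that looks like on a GRUB-based distro (this is not from valici's post, just the commonly documented parameters for handing a Southern Islands card like the 7850 to the experimental amdgpu driver; it also assumes your kernel was built with amdgpu SI support enabled, and file locations vary per distro):

            # /etc/default/grub -- disable SI handling in radeon, enable it in amdgpu
            GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.si_support=0 amdgpu.si_support=1"

            # then regenerate the GRUB config (Debian/Ubuntu style) and reboot
            sudo update-grub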



        • #24
          valici And so that radeon kernel driver is quite outdated, right?



          • #25
            Originally posted by ObscureAngelPT
            valici And so that radeon kernel driver is quite outdated, right?
            No, performance is similar (very slightly lower for radeon), but it has features that are not yet implemented in AMDGPU, like video decode. You are fine with it; it's what AMD recommends for your card at the moment.
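
            If you want to double-check which of the two kernel drivers a card is actually using, a standard check (not specific to this thread) is to ask lspci; the exact output layout varies, but it includes a "Kernel driver in use:" line showing radeon or amdgpu:

                # list the VGA device and the lines that follow it, including "Kernel driver in use:"
                lspci -nnk | grep -iA3 vga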



            • #26
              Michael, it would be interesting to see a CPU usage comparison between drivers with Dota 2 using OpenGL at 2560x1440, say between a Fury and a 1080 Ti.



              • #27
                I want to see CPU usage with vsync on.



                • #28
                  Originally posted by MrCooper
                  Michael, it would be interesting to see a CPU usage comparison between drivers with Dota 2 using OpenGL at 2560x1440, say between a Fury and a 1080 Ti.
                  That, and it wouldn't hurt to add a couple of resolutions above and below as well, like Full HD and 4K.
                  Last edited by dungeon; 11 January 2018, 07:30 AM.



                  • #29
                    Michael, on page 4 you wrote:
                    But for the Vulkan renderer in this other Feral Linux game port, NVIDIA is running better currently than RADV on Mesa 17.4-dev.
                    but the benchmarks clearly show that the opposite is true. The RX 580 is much better than the GeForce 1060, and the GeForce 1070 Ti can hardly compete with the GeForce 1070. The other cards line up as expected, with the typical Vega weakness.



                    • #30
                      Thanks! It's excellent to have a test that covers most GCN generations.

