Mesa 17.1-dev vs. AMDGPU-PRO 16.60 vs. NVIDIA 378 Linux Gaming Tests


  • #21
    I think the underlying cause is this: they also just *don't care*™ when they use an i7 6700K to advertise an eSports gaming card



    And why a comparison against the 750 Ti at this point, when the thing is 3 years old? Bunch of ignoramuses
    Last edited by dungeon; 31 January 2017, 02:09 AM.



    • #22
      Originally posted by debianxfce View Post
      .... Also, Phoronix benchmarks are out of touch with the real world with a $640 CPU.
      Perhaps trying to avoid benchmarks showing only CPU limits?



      • #23
        Originally posted by indepe View Post
        Perhaps trying to avoid benchmarks showing only CPU limits?
        I think debianxfce wants to see a real-world scenario, something like a $100 CPU and a $100 GPU... That sounds more likely to happen than a $600 CPU and $100 GPU combo



        • #24
          Originally posted by dungeon View Post

          I think debianxfce wants to see a real-world scenario, something like a $100 CPU and a $100 GPU... That sounds more likely to happen than a $600 CPU and $100 GPU combo
          That's gonna make the $100 GPU look really good....



          • #25
            Originally posted by indepe View Post
            That's gonna make the $100 GPU look really good....
            Yup, and then it's useful reading only for driver developers of some sort... not for users who want to buy that GPU hardware, because if they buy just the GPU after reading this, they might be surprised how much slower things are on their average CPU
            Last edited by dungeon; 31 January 2017, 02:39 AM.



            • #26
              Originally posted by dungeon View Post

              Yup, and then it's useful reading only for driver developers of some sort... not for users who want to buy that GPU hardware, because if they buy just the GPU after reading this, they might be surprised how much slower things are on their average CPU
              Are you proposing a test matrix with CPU-cost vs GPU-cost so that everyone at every cost level can find the sweet spot of how to split money between CPU and GPU?



              • #27
                Well, on the PC market every mix is possible. There are no rules there, of course, but pairing the two from equal or nearby price ranges is more likely to happen than anything else.

                So on average, if someone picks a $100 GPU, I don't think he will get a CPU of more than $200 or something like that.
                Last edited by dungeon; 31 January 2017, 02:56 AM.



                • #28
                  Originally posted by dungeon View Post
                  Well, on the PC market every mix is possible. There are no rules there, of course, but pairing the two from equal or nearby price ranges is more likely to happen than anything else.

                  So on average, if someone picks a $100 GPU, I don't think he will get a CPU of more than $200 or something like that.
                  Unfortunately the best combination depends on the game as well... so I'd think you'd find a GPU using a test like this, and then look at some other tests to try to figure out which CPU you might need, for your favorite game, to support the same frame rate. Maybe.

                  And if that turns out to be too expensive together, you go back to picking a less expensive GPU.
                  Last edited by indepe; 31 January 2017, 03:09 AM.



                  • #29
                    Originally posted by smitty3268 View Post
                    My guess as to the real underlying cause is this: http://www.anandtech.com/show/10536/...ation-analysis which presumably lets them "reduce the memory bandwidth for rendering" which would lead to better performance, lower power usage, and in turn higher clocks (and more performance).
                    Tile-based rendering is a software implementation that divides one frame into pieces which are rendered simultaneously to achieve higher utilization. AMD uses TBR on Windows as well, as far as I know, and when the game engine supports it, it also works with OpenGL. Sadly, TBR is not infinitely scalable.
                    Here is a web demo of TBR via WebGL. You can switch TBR on by hitting 6 on your keyboard and turn it off with 7. Of course, that only changes the FPS when you are below the frame rate cap, so you have to increase the number of lights first. In my case the frame rate nearly doubles.

                    https://github.com/tiansijie/Tile_Ba...DeferredShader

                    I don't know whether RadeonSI has a corresponding OpenGL workaround that forces TBR for immediate rendering, but the proprietary NVIDIA driver quite likely does.
                    So pointing that out was a good idea in my opinion, and perhaps when someone has the time they could take a look at the current state of the implementation. It might also be useful to the people porting games, as a way to improve performance. Implementing it directly in the engine is of course the best way.
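
                    For anyone curious what that "hit 6" mode is doing under the hood, here is a minimal CPU-side TypeScript sketch of the light-binning step that tiled deferred shading relies on. This is not the demo's actual code; the names (`Light`, `TILE_SIZE`, `binLights`) and the screen-space bounds are made up for illustration. The idea: split the screen into fixed-size tiles and give each tile a short list of only the lights that can touch it, so the shading pass loops over a few lights per tile instead of all lights per pixel.

                    ```typescript
                    // Sketch of tiled light binning (illustrative names, not the demo's API).
                    const TILE_SIZE = 16; // pixels per tile side; 16 or 32 are common choices

                    interface Light {
                      x: number;      // screen-space center x, in pixels
                      y: number;      // screen-space center y, in pixels
                      radius: number; // screen-space radius of influence, in pixels
                    }

                    /** For each tile, collect the indices of the lights whose bounds overlap it. */
                    function binLights(lights: Light[], width: number, height: number): number[][] {
                      const tilesX = Math.ceil(width / TILE_SIZE);
                      const tilesY = Math.ceil(height / TILE_SIZE);
                      const bins: number[][] = Array.from({ length: tilesX * tilesY }, () => []);

                      lights.forEach((light, i) => {
                        // Conservative screen-space bounding box of the light, clamped to the screen.
                        const minX = Math.max(0, Math.floor((light.x - light.radius) / TILE_SIZE));
                        const maxX = Math.min(tilesX - 1, Math.floor((light.x + light.radius) / TILE_SIZE));
                        const minY = Math.max(0, Math.floor((light.y - light.radius) / TILE_SIZE));
                        const maxY = Math.min(tilesY - 1, Math.floor((light.y + light.radius) / TILE_SIZE));

                        for (let ty = minY; ty <= maxY; ty++) {
                          for (let tx = minX; tx <= maxX; tx++) {
                            bins[ty * tilesX + tx].push(i);
                          }
                        }
                      });

                      return bins;
                    }

                    // Example: 200 random lights on a 1280x720 screen. Each tile ends up with a
                    // short light list, which the per-tile shading loop would then iterate over.
                    const lights: Light[] = Array.from({ length: 200 }, () => ({
                      x: Math.random() * 1280,
                      y: Math.random() * 720,
                      radius: 40 + Math.random() * 60,
                    }));
                    const bins = binLights(lights, 1280, 720);
                    console.log(`tile 0 is touched by ${bins[0].length} of ${lights.length} lights`);
                    ```

                    In the actual demo this binning runs on the GPU, but the win is the same: memory traffic per pixel drops because each pixel only reads its tile's short light list, which is why the FPS gain only shows up once the light count is high enough to matter.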
                    Last edited by oooverclocker; 31 January 2017, 03:12 AM.



                    • #30
                      Originally posted by indepe View Post

                      Unfortunately the best combination depends on the game as well... so I'd think you'd find a GPU using a test like this, and then look at some other tests to try to figure out which CPU you might need, for your favorite game, to support the same frame rate. Maybe.

                      And if that turns out to be too expensive together, you go back to picking a less expensive GPU.
                      Or, don't forget, you might want a CPU with an iGPU, or one of those with eDRAM. Well, I dunno; GPUs well below $100 are always combined with CPUs much cheaper than $200, since if you go up to $300 for both, you might as well forget about the very cheap dGPU option

                      It sounds irregular for Michael's Xeon, or even the i7 6700K from that AMD ad, to be combined with an RX 460. That might be only for marketing provocation reasons, since that dGPU is faster than *any* iGPU... and whatever iGPU you have does not matter; whether the CPU costs $200 or $2000, that RX 460 is still better, so it applies anywhere
                      Last edited by dungeon; 31 January 2017, 03:39 AM.

