AMD Radeon R9 Fury X Launches Today, Initial Results A Bit Of A Let Down


  • #41
    I just bought the PowerColor Fury X from TigerDirect about 40 minutes ago for $649.99; they had an instant $20 off as well. They still had the VisionTek model available too, for $30 more. It's going to be a nice silent upgrade from my two reference 7970's.
    Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.
    Ben Franklin, 1755



    • #42
      Originally posted by andresdju
      Has any site done OpenCL benchmarks?
      Sweclockers did: http://www.sweclockers.com/test/2073...y-x/16#content
      King of OpenCL, but not really a surprise.



      • #43
        Originally posted by xeekei
        Sweclockers did: http://www.sweclockers.com/test/2073...y-x/16#content
        King of OpenCL, but not really a surprise.
        Why is the Fury X just a bit faster than the R9 390X, but this much faster than the R9 290X? Isn't the 390X a rebranded 290X?



        • #44
          Originally posted by andresdju

          Why is the Fury X just a bit faster than the R9 390X, but this much faster than the R9 290X? Isn't the 390X a rebranded 290X?
          Not sure what happened there, but you could call the R9 390X a refreshed chip. "Rebrand" sounds more like a copy/paste of the chip, which is not entirely correct here. They are made on the same but more mature process, and the GPU/memory clocks are different, as are bandwidth and fillrate, so graphics and compute performance are also expected to differ from the 290X.

          Last edited by dungeon; 24 June 2015, 06:13 PM.



          • #45
            Originally posted by Marc Driftmeyer
            Your lack of professionalism is showing. An unfinished driver that hasn't been unified, optimized, and tested matches the performance of Nvidia cards that have 12 months of maturity. Grow up already.
            A Radeon with 4000 shaders vs. a GeForce with 3000 shaders at the same frequency, and the GeForce wins. That has nothing to do with drivers. It's usually that 1/3 of Nvidia's shaders, from Fermi onward, are 64-bit, and the second 1/3 (32-bit) can be used to run double precision with the first 1/3 at 64-bit; that is usually +1/3 gaming performance. Then you must multiply that 1.33x again by +10-15% for hidden units like the SFUs, and you get 1.5x. So 3000 GeForce shaders = 4500 normal shaders.
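
            To spell out that arithmetic, here is a back-of-the-envelope sketch; the shader counts and both factors are the claims above, not measured figures:

            # Rough check of the shader-equivalence arithmetic claimed above.
            # Both factors are this post's assumptions, not measured data.
            geforce_shaders = 3000
            pairing_factor = 4 / 3    # claimed +1/3 from the 64-bit shader pairing
            hidden_units = 1.125      # claimed +10-15% from SFUs; midpoint used
            effective = geforce_shaders * pairing_factor * hidden_units
            print(round(effective))   # ~4500 "normal" shader equivalents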



            • #46
              Why the bad overclocks? Maybe:
              1: There are timing issues between the rest of the GPU and the new memory controller. Later revisions might help with this.
              2: The memory can't handle the heat. What happens if you change the fan profile to allow the card to hit 80C+ at reference clocks? Will the card crash under those conditions? (A rough monitoring sketch follows below.)
              3: The memory is a little higher/thicker than the GPU, meaning there's more thermal compound between the GPU and the cooler than there would normally be, and parts of the GPU are overheating faster. A little work on the baseplate of the cooler would fix that. Maybe a custom EKWB part would do it a lot better.
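
              For anyone wanting to test hypothesis 2, here is a minimal monitoring sketch, assuming a Linux box where the radeon driver exposes a standard hwmon temperature node; the sysfs glob below is a guess and may need adjusting per system. Run a GPU stress load separately and watch whether the card stays stable past 80C:

              import glob
              import time

              # hwmon reports temperatures in millidegrees Celsius in temp*_input
              nodes = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/temp1_input")
              if not nodes:
                  raise SystemExit("no hwmon temperature node found for card0")

              # Log once per second; stop with Ctrl-C.
              while True:
                  with open(nodes[0]) as f:
                      temp_c = int(f.read().strip()) / 1000
                  print("GPU temp: %.1f C" % temp_c)
                  time.sleep(1)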

              I would still much prefer a GTX 980 over this card, or over the Titan X/980 Ti: it's much less power-hungry (165W vs. 250/275W), delivers way more performance than I'll need for quite a while, and stays at very reasonable noise levels with the blower cooler. If I wanted to play shiny AAA titles in 4K right now, though, I'd buy this thing in a heartbeat, based on what I've seen from reviews and AMD's general open-source policy.



              • #47
                Many hoped that the GTX 980 Ti would be slower than the Fury X, but that is only the case in some games at 4K. Generally speaking, it might be fine with a DP 1.2a (FreeSync) 4K monitor, which is a little cheaper than the Nvidia card plus a G-Sync one. But I would not buy it for Linux gaming. Extra funny is that AMD wants to sell 2x Fury X together with the fastest Intel CPU (well, you can get a variant with an AMD one too) as Project Quantum. The performance of AMD's latest card is not that bad, but it's basically a year too late. Most money is made with mainstream cards, though, and there I don't think it helps much if you increase the stock frequencies by 2.5% and use a new name...



                • #48
                  Originally posted by artivision

                  A Radeon with 4000 shaders vs. a GeForce with 3000 shaders at the same frequency, and the GeForce wins. That has nothing to do with drivers. It's usually that 1/3 of Nvidia's shaders, from Fermi onward, are 64-bit, and the second 1/3 (32-bit) can be used to run double precision with the first 1/3 at 64-bit; that is usually +1/3 gaming performance. Then you must multiply that 1.33x again by +10-15% for hidden units like the SFUs, and you get 1.5x. So 3000 GeForce shaders = 4500 normal shaders.
                  AMD shaders != Nvidia shaders



                  • #49
                    Originally posted by xeekei
                    AMD shaders != Nvidia shaders
                    Nvidia shaders > AMD shaders, forever. Regardless of that, I use Radeon and Gallium_Nine. A small sacrifice: -20% performance, and D3D9 games only, but I use only Linux, and that's the right thing to do.



                    • #50
                      Originally posted by artivision

                      A Radeon with 4000 shaders vs. a GeForce with 3000 shaders at the same frequency, and the GeForce wins. That has nothing to do with drivers. It's usually that 1/3 of Nvidia's shaders, from Fermi onward, are 64-bit, and the second 1/3 (32-bit) can be used to run double precision with the first 1/3 at 64-bit; that is usually +1/3 gaming performance. Then you must multiply that 1.33x again by +10-15% for hidden units like the SFUs, and you get 1.5x. So 3000 GeForce shaders = 4500 normal shaders.
                      Seriously? I can easily find block diagrams and describe them for you, but do I have to? You should look at several block diagrams for yourself. My objective is usually to find the most detailed one I can. My advice is that you should do that too.

