I just bought the PowerColor Fury X from TigerDirect about 40 minutes ago for $649.99; they had an instant $20 off as well. They still had the VisionTek model available too, for $30 more. It's going to be a nice, silent upgrade from my two reference 7970s.
AMD Radeon R9 Fury X Launches Today, Initial Results A Bit Of A Let Down
-
Originally posted by xeekei View Post
Sweclockers did: http://www.sweclockers.com/test/2073...y-x/16#content
King of OpenCL, but not really a surprise.
Comment
-
Originally posted by andresdju View Post
Why is the Fury X just a bit faster than the R9 390X, but so much faster than the R9 290X? Isn't the 390X a rebranded 290X?
Last edited by dungeon; 24 June 2015, 06:13 PM.
Comment
-
Originally posted by Marc Driftmeyer View Post
Your lack of professionalism is showing. An unfinished driver that hasn't been unified, optimized, and tested matches the performance of Nvidia cards that have had 12 months of driver maturity. Grow up already.
Comment
-
Why the bad overclocks? Maybe:
1: There are timing issues between the rest of the GPU and new memory controller. Later revisions might help this.
2: The memory can't handle the heat. What happens if you change the fan profile to allow the card to hit 80C+ at reference clocks? Will the card crash under those conditions?
3: The memory is a little higher/thicker than the GPU, meaning there's more thermal compound between the GPU and the cooler than there would normally be, and parts of the GPU are overheating faster. A little work on the baseplate of the cooler would fix that. Maybe a custom EKWB part would do that a lot better.
I would still much prefer a GTX 980 over this card, or even the Titan X/980 Ti: it's much less power-hungry (165W vs. 250/275W), delivers far more performance than I'll need for quite a while, and stays at very reasonable noise levels with the blower cooler. If I wanted to play shiny AAA titles in 4K right now, though, I'd buy this thing in a heartbeat, based on what I've seen from reviews and AMD's general open-source policy.
Comment
-
Many hoped the GTX 980 Ti would be slower than the Fury X, but that is only the case in some games at 4K. Generally speaking it might be fine paired with a DP 1.2a (FreeSync) 4K monitor, which comes out a little cheaper than an Nvidia card plus a G-Sync monitor. But I would not buy it for Linux gaming. Extra funny is that AMD wants to sell 2x Fury X together with the fastest Intel CPU (well, you could get a variant with an AMD one too) as Project Quantum. The performance of AMD's latest card is not that bad, but it is basically a year too late. Most money is made with mainstream cards, though, and there I don't think it helps much to increase the stock frequencies by 2.5% and use a new name...
Comment
-
Originally posted by artivision View Post
A Radeon with 4000 shaders vs. a GeForce with 3000 shaders at the same frequency, and the GeForce wins. That has nothing to do with drivers. The claim is that, from Fermi onward, 1/3 of Nvidia's shaders are 64-bit, and a second 1/3 (32-bit) can pair with that first 1/3 to run double precision at 64-bit; that usually adds about +1/3 gaming performance. Then multiply that 1.33x by another +10-15% for hidden units like the SFUs, and you get roughly 1.5x. So 3000 GeForce shaders ≈ 4500 normal shaders.
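For what it's worth, the arithmetic in that claim does work out to 4500, if you accept its premises. A minimal sketch (the 1/3 pairing figure and the 12.5% SFU uplift are the commenter's assumptions, not official Nvidia specs):

```python
def effective_shaders(physical, pairing_boost=1.0 + 1/3, sfu_boost=1.125):
    """Effective shader count under the quoted post's assumed boosts:
    +1/3 from 64-bit/32-bit shader pairing, +12.5% (midpoint of the
    claimed 10-15%) from hidden units like SFUs."""
    return physical * pairing_boost * sfu_boost

# 3000 * 1.333... * 1.125 ~= 4500, matching the post's figure
print(round(effective_shaders(3000)))
```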
Comment