AMD Radeon RX 6600 XT Launching For 1080p RDNA2 Gaming At ~$379 USD


  • torsionbar28
    replied
    Originally posted by tildearrow View Post
    Come to think of it... The high end RX 480 was ~$379 in 2016...
Pretty sure my 4GB RX 480 was $199 brand new. That's USD.



  • bridgman
    replied
    Originally posted by humbug View Post
    All true and valid points. Do you think that for next gen it will be worthwhile to keep the gaming GPUs a bit longer on 7nm rather than the cutting edge node where they are competing for wafers with all the other products?
Tough question - I imagine that if we had known that COVID-19 and/or the semiconductor shortage was coming we might have hedged our bets a bit, but I don't think we had many options other than maybe carving out a bigger chunk of GF 12LP/12LP+ capacity and spinning Polaris one more time. That would have worked out pretty well, but we would also have needed to know that another mining boom was coming.

Going forward there are more options (5nm should be ramped up for a while before 7nm becomes obsolete), so I think we will have a better chance of spreading products across fab processes than we could in the last year.

    Originally posted by humbug View Post
    There is a big fear right now amongst PC gamers (windows and Linux alike) that their hobby is increasingly becoming more expensive and niche, something for the rich. And that the USD 200 mainstream GPU will never again be a thing.
Yeah, one thing that nags at me is that while we are addressing that market pretty well via increasingly capable iGPUs, an APU doesn't give a solution for someone who only wants to upgrade their GPU right now. The downside is that it seems to be a fairly small market these days, and one which was driven more by fab processes pushing cost per transistor down and making upgrades attractive (which isn't happening much right now) than by demand for new cards.

We do have the 5500 XT in the sub-$200 market today, but IIRC it launched while 580s were still being dumped by both miners and retailers, so it seemed expensive at the time despite being both cheaper and faster than the 580 at launch.

    Originally posted by humbug View Post
It also seems like GPUs by their very nature take a lot of die size and transistors. So obviously when there is a chip shortage it makes more sense for companies like AMD to prioritize CPUs, which are more profitable.
    Yep, the lower market price per transistor for GPUs has been a challenge for as long as I can remember, and I don't see it changing any time soon (other than mining booms and chip shortages, of course). NVidia worked around it by pushing into the ultra high end, and we worked around it by joining up with a CPU vendor.

    That said, having to prioritize fab capacity between CPUs and GPUs is something new - I don't remember it being a big issue until the last year or so.

    Fab capacity was always something that had to be managed carefully, and there were times when you were unable to sell as many of a hot product as you would like, but we all planned things out with the fabs far enough in advance that shortages could either be identified and eliminated (by bringing more capacity on line) or worked around (by shifting product schedules).

The last year was different. I had a much longer paragraph typed, but "the last year was different" sums it up pretty well.
    Last edited by bridgman; 30 July 2021, 02:27 PM.



  • perpetually high
    replied
Shameless plug for my RX 480... best purchase I've made for my desktop, still going strong. Bridgman's link to Amazon is pretty hilarious.

Got mine back in Feb 2017 (see Phoronix join date). AMD Fine Wine at its finest. Mined 1.5 ETH with it a couple of years ago and blew it on online poker. Should've held on to it.

Also - any love for cable management around here, or are you guys just a bunch of animals?? And no, we don't do RGB. We leave that to the Gen Z crowd (no offense).



  • bridgman
    replied
    Originally posted by tildearrow View Post
    Wait what, 32GB of cache? H- Oh, it was 32MB...
Yeah, sorry - typo, now fixed. Even I was impressed by the 32GB cache.

    Originally posted by tildearrow View Post
    I almost thought that AMD managed to wire a massive SRAM chip to the processing core...
    We probably have something like that in one of the labs... I bet there's something with a rack of GPUs and an octopus (or maybe a dolphin) if you look hard enough.
    Last edited by bridgman; 30 July 2021, 01:46 AM.



  • bridgman
    replied
    Originally posted by idash View Post
    Ignoring the current ripoff market prices, nowadays the cheapest offering from the latest generation from each vendor is $350 MSRP? What the hell?

    Assuming that you work for AMD, sir, is there any chance of getting a low-cost (at least in terms of MSRP) RDNA2 discrete card targeted at 1080p 30 FPS gaming? Or at the very least an RDNA1 card? I mean the RX 550 from just a few years ago is a perfect example.
    Or is targeting the modest 1080p 30 FPS gamers not feasible anymore?
The tricky part is that we are now putting that kind of performance into our integrated GPUs - a 5700G with decent memory arguably makes a pretty good 1080p/30fps gaming system all in one chip, for only a bit more than the cost of an equivalent CPU.

    https://www.tomshardware.com/news/am...7-5700g-review

    There used to be a market for slightly-higher-than-APU-class dGPUs back when OEMs would pair an APU with an inexpensive dGPU in a Crossfire/SLI pair, but that has largely gone away (to the relief of engineers at both companies). Go back a few more years, and the GPU market was primarily driven by demand for entry-level GPUs going into entry-level and mid-range PCs.

System builders all use APUs these days rather than a discrete CPU/GPU pair, so the only remaining market for entry-level dGPUs seems to be the upgrade market for customers with older GPUs in the same range. That used to be a somewhat viable market when each new fab process brought improvements in price/performance through lower per-transistor costs, but right now it's tough to improve much over something like the RX550 at the same price.

NVidia seems to be in much the same boat - I believe their latest card in that range is the GDDR5 1030 from the same era. There is still a gap between a top-end APU and a bottom-end dGPU, but it gets smaller every year.

    We did introduce an RX580 replacement in the form of the 5500XT, but that got a bad rap because it was how everyone discovered that 7nm parts could not give the same kind of price-performance increase that had been seen with previous fab process jumps. It was a bit faster and a bit cheaper than the RX580 at launch but not enough so to get a lot of attention.

    Sorry I don't have a better answer.
    Last edited by bridgman; 30 July 2021, 02:20 AM.



  • tildearrow
    replied
    Originally posted by bridgman View Post

    We do badly at dollars per number of letters in the product name as well, at least against the 3060Ti, although we do very well against the 3060. One could ask how NVidia thinks a GPU with <9MB of cache is worth $399 when our 6600XT with 32GB of cache is only $379.

Too many possibilities. Perhaps we could find something more generic to compare, like performance?
    Wait what, 32GB of cache? H- Oh, it was 32MB... I almost thought that AMD managed to wire a massive SRAM chip to the processing core...



  • bridgman
    replied
    Originally posted by phoronix_is_awesome View Post
I don't even want to waste time to figure out how AMD thinks a 128bit GPU is worth $379 when nvidia's 3060Ti's 256bit GPU is $399. I mean the 6700XT 192bit GPU at $479 is already outrageous. Dollar per bitness or GB/s bandwidth is higher at the lower end tier even if you compare AMD's own RDNA2 series.
    We do badly at dollars per number of letters in the product name as well, at least against the 3060Ti, although we do very well against the 3060.

    One could ask how NVidia thinks a GPU with <9MB of cache is worth $399 when our 6600XT with 32MB of cache is only $379.

Too many possibilities. Perhaps we could find something more generic to compare, like performance?
    Last edited by bridgman; 30 July 2021, 01:07 AM.

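For concreteness, the "dollars per spec" arithmetic being joked about in the exchange above can be put in one place. A minimal Python sketch using the MSRPs and bus widths quoted in this thread; the bandwidth and cache figures (448/384/256 GB/s, roughly 4 MB of L2 on the 3060 Ti, 96 MB of Infinity Cache on the 6700 XT) are commonly published launch specs rather than numbers from the thread, so treat them as assumptions:

```python
# Back-of-the-envelope "dollars per spec" comparison for the cards named in this
# thread. MSRPs and bus widths come from the posts above; the bandwidth and
# cache figures are commonly published launch specs, used here as assumptions.
cards = {
    # name: (MSRP in USD, memory bus in bits, memory bandwidth in GB/s, on-die cache in MB)
    "RTX 3060 Ti": (399, 256, 448, 4),    # "<9MB of cache" per the thread; ~4 MB L2 assumed
    "RX 6700 XT":  (479, 192, 384, 96),   # 96 MB Infinity Cache assumed
    "RX 6600 XT":  (379, 128, 256, 32),   # 32 MB Infinity Cache per the thread
}

print(f"{'card':<12} {'$/bit':>7} {'$/(GB/s)':>9} {'$/MB cache':>11}")
for name, (price, bus_bits, bw_gbps, cache_mb) in cards.items():
    print(f"{name:<12} {price / bus_bits:>7.2f} {price / bw_gbps:>9.2f} {price / cache_mb:>11.2f}")
```

Which ratio you pick flips the conclusion - the 6600 XT looks worst per bit of bus width but far better than the 3060 Ti per MB of cache - which is rather the point of the reply above: almost any single spec can be made to win, so performance is the only comparison that really travels.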


  • idash
    replied
    Originally posted by bridgman View Post
    We may get lucky and find another sweet spot like 14nm where the increase in density (from finer process pitch) outstrips the increase in cost (from more complex fab process and cost of building yet another fab) and allows per-transistor pricing to head down some more, but I don't think we know yet.
I really miss the days when "1080p gaming"-class cards were $150, with significantly lower TDPs too.

    If you're fine with 1080p 30 FPS (which I actually prefer), the GTX 750 Ti is still holding up very well today, and I guess the same applies to the R7 260X too (but they're not gonna hold up much longer, so don't take that as a recommendation to buy a used one).
Those were $150 cards from 2014 that served modest gamers for at least six years, which is incredible value compared to how things are today.

    Ignoring the current ripoff market prices, nowadays the cheapest offering from the latest generation from each vendor is $350 MSRP? What the hell?

    Assuming that you work for AMD, sir, is there any chance of getting a low-cost (at least in terms of MSRP) RDNA2 discrete card targeted at 1080p 30 FPS gaming? Or at the very least an RDNA1 card? I mean the RX 550 from just a few years ago is a perfect example.
    Or is targeting the modest 1080p 30 FPS gamers not feasible anymore?

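The "sweet spot" description in the quoted reply - density gains outstripping wafer-cost increases so that per-transistor pricing keeps falling - is easy to illustrate with a toy calculation. A minimal sketch; the 1.6x density and 1.4x/1.8x wafer-cost multipliers below are invented purely for illustration, not real foundry figures:

```python
# Illustrative only: per-transistor cost is (wafer cost) / (transistors per wafer),
# so a node shrink pays off when density grows faster than wafer cost rises.
# All multipliers below are made up for the sake of the example.
def cost_per_transistor(wafer_cost, transistors_per_wafer):
    return wafer_cost / transistors_per_wafer

baseline = cost_per_transistor(wafer_cost=1.0, transistors_per_wafer=1.0)  # normalized old node
sweet    = cost_per_transistor(wafer_cost=1.4, transistors_per_wafer=1.6)  # density outruns cost
sour     = cost_per_transistor(wafer_cost=1.8, transistors_per_wafer=1.6)  # cost outruns density

print(f"sweet spot: {100 * (sweet / baseline - 1):+.1f}% per transistor")  # -12.5%
print(f"no payoff:  {100 * (sour / baseline - 1):+.1f}% per transistor")   # +12.5%
```

In the first case the shrink still drives per-transistor pricing down, which is what used to make $150-class upgrade cards attractive; in the second the new node is denser but no cheaper per transistor, roughly the situation described for 7nm earlier in the thread.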


  • arQon
    replied
    Originally posted by bridgman View Post
    I do sometimes wonder if we should port the RX580/90 to GF 12LP+ and use that to try to keep the miners fed.
I'm with skeevy: I think that is absolutely still a viable *graphics* card once you give it that sort of node shrink. Hell, even a GCN1-era 7970 would still comfortably hit 60 FPS in a LOT of games at High+ with the clocks available there: it just doesn't have the feature-set capabilities for today, but IIRC that's not a problem for Polaris.

Given that ALL the hugely popular games of the last several years - Fortnite, Overwatch, CSGO, DOTA, TF2, GTA V, PUBG, you name it - are either literally running on engines over a decade old or on new engines equivalent to that, GPU "power" is practically irrelevant from a sales standpoint, other than on the marketing front.



  • smitty3268
    replied
    Originally posted by phoronix_is_awesome View Post
I don't even want to waste time to figure out how AMD thinks a 128bit GPU is worth $379 when nvidia's 3060Ti's 256bit GPU is $399. I mean the 6700XT 192bit GPU at $479 is already outrageous. Dollar per bitness or GB/s bandwidth is higher at the lower end tier even if you compare AMD's own RDNA2 series. I guess that's the reason why ROCm still can't be enabled for RDNA cards, if the people at AMD can't even do elementary-level math.

Here is a bad joke for AMD GPU fans: considering that the 6700XT had to bust the Geforce 3070's 8GB of VRAM at 1440p gaming to convince people that an inferior 192bit GPU should be $479, how about we compare the 3060 and 6600XT at 1440p to "intentionally" bust the 6600XT's 8GB 128bit VRAM? Oh wait, this time the 6600XT is now "designed for 1080p gaming".
Bitness doesn't matter, only performance does. It remains to be seen what that will be - you certainly can't trust vendor-provided slides.

