Hands On With The AMD Radeon RX 6600 XT
Last edited by user1; 07 August 2021, 04:57 PM.
Originally posted by leipero View Post:
Usually at the office, Intel CPUs are used (mostly the i5 range or so). Why? Because office computers don't need that much compute power as long as they run the OS and applications (office suites, browsers, etc.) well enough, so people don't upgrade them for years, even a decade. But another place AMD is potentially missing out is that for every business owner, less investment = better, and all of those Intel CPUs have iGPUs, while none of those machines have (or need) a dedicated GPU. So the overall cost of, say, 20 PCs, even with more expensive boards and CPUs, would be lower in aggregate for that purpose compared to the equivalent X with similar compute power and a dedicated GPU (the cheapest one).
Another thing: iGPUs can come in handy for troubleshooting, and can make the difference for average users between a functional and a non-functional PC when the dGPU is under RMA or something like that. Something like a "Vega 1-3" would be sufficient, I guess: some basic iGPU that requires little space and can be produced and included in the price without a net loss at the very least, and without overcomplicating the motherboard, keeping the price as it is. Arguments to consider.
This isn't always obvious because we did not sell Renoir into the DIY desktop market (qualification takes a long time, and Cezanne was running right behind Renoir), but we did sell a lot of 4xxxG parts to OEMs, which went into the same office systems you describe.
We are shipping 5xxxG parts into the DIY market now, although the "most office-y version" (5300G) is currently OEM only.
What we don't have today is "tiny little GPUs for troubleshooting a gaming/workstation system", whether integrated or discrete, and opinions vary on how important those are. Most DIYers tend to have an old dGPU sitting around, but not all of them do. The sad thing is that tens of thousands of those cards are probably scrapped every year while still functional.
On the other hand, my old VESA bus dGPUs are probably running off the end of their usefulness.
Last edited by bridgman; 07 August 2021, 04:29 PM.
Originally posted by bridgman View Post:
Agree that the HD 7750/7770 (Cape Verde, IIRC) was faster than the iGPUs of the time, but I expect that the integrated GPUs in Renoir and Cezanne would be faster than the HD 7750 these days, even with shared and slightly slower memory.
There has been a lot of discussion about the lowest tier cards in other threads so I won't repeat it all but:
#1 - yes there is still a slot between APU and "great 1080p dGPU"
#2 - problem is that the market for those boards is much smaller these days and barely makes sense financially
The product would basically be "laptop dGPU stuck on a PCIE card" but that seems to be a tough sell to the board partners. I like the idea of a no-power-connector mini-ITX size card and own a few of them but that seems to be a minority view.
In fairness, the RX 5500 XT was a bit faster, a bit cheaper, and used less power. It just wasn't exciting, I guess.
That's an even smaller market, unfortunately... while the cost of making new chips is going through the roof.
What kind of use cases do you see for graphics acceleration on the server? My impression was that the GPUs integrated into BMCs were doing a decent job of covering user needs.
I think cache for iGPUs is an interesting option (as long as it's not expensive), because you don't need super-fast memory like the "gaming cache" (L3 on Zen 3). Assuming games properly optimize their texture loading, such a cache could work wonders, and because it doesn't need to be super fast, a cheaper (and larger) one can be used. Even in systems with DDR5 and so on, there would be a benefit from it, IMO.
Such APUs would solve the issue of low-end GPUs, of course with power limits on how much one could do, so as not to stress motherboard manufacturers. Games have different requirements in 2021 compared to 2011; drop the resolution and you solve most of the problems, add a good upscaling solution on top of that, and you are done.
Usually at the office, Intel CPUs are used (mostly the i5 range or so). Why? Because office computers don't need that much compute power as long as they run the OS and applications (office suites, browsers, etc.) well enough, so people don't upgrade them for years, even a decade. But another place AMD is potentially missing out is that for every business owner, less investment = better, and all of those Intel CPUs have iGPUs, while none of those machines have (or need) a dedicated GPU. So the overall cost of, say, 20 PCs, even with more expensive boards and CPUs, would be lower in aggregate for that purpose compared to the equivalent X with similar compute power and a dedicated GPU (the cheapest one).
Another thing: iGPUs can come in handy for troubleshooting, and can make the difference for average users between a functional and a non-functional PC when the dGPU is under RMA or something like that. Something like a "Vega 1-3" would be sufficient, I guess: some basic iGPU that requires little space and can be produced and included in the price without a net loss at the very least, and without overcomplicating the motherboard, keeping the price as it is. Arguments to consider.
Originally posted by DanL View Post:
You all missed the point of my post, so I'll be more clear. Where is the Radeon RX 6400 with 75W TBP? Or even better, an RX 6200 with 35-50W TBP?
Originally posted by puleglot View Post:
1080p GPU with 3 fans... When will I have a chance to replace my R9 380 ITX Compact?
And depending on your preferences you may not like this notion, but I can tell you from first-hand experience that an Nvidia GTX 1650 (GDDR6) is actually faster than my R9 380.
I know this because of my custom mpv.conf, where I had to step down to a slightly lighter 'error-diffusion' algorithm on the AMD card because it couldn't keep up where the 1650 wasn't struggling, even though the 1650 draws considerably less power while delivering the better performance of the two cards.
On both setups I used the OpenGL output, because RadeonSI would consistently outperform the Vulkan option with the RADV driver the last time I tested this, which must be around half a year ago by now.
So as long as you are fine with running a binary blob, Nvidia right now has the only viable option for a 75-watt GPU, unfortunately.
Hopefully Intel really delivers with their dGPU line-up, since competition is direly needed here!
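For anyone curious what the mpv tuning mentioned above looks like, here is a minimal mpv.conf sketch. The specific kernel names are my assumption; the poster only says the error-diffusion setting was slightly lowered on the AMD card:

```ini
# mpv.conf sketch (hypothetical values)
vo=gpu
gpu-api=opengl            # RadeonSI (OpenGL) beat RADV/Vulkan in the tests above
dither=error-diffusion    # enable error-diffusion dithering
# Heavier kernels (e.g. floyd-steinberg, sierra-3) cost more shader time;
# stepping down to a lighter kernel such as sierra-lite is the kind of
# "slightly lower" setting that keeps a weaker GPU from dropping frames.
error-diffusion=sierra-lite
```

The heavier the error-diffusion kernel, the more GPU compute the dithering pass needs, which is why the same config can be fine on one card and stutter on another.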
Originally posted by user1 View Post:
Actually, that's not correct. Low-TDP (75W and under) cards can actually be way more powerful than iGPUs. For example, a long time ago I had the 2012 Radeon HD 7750, which has a TDP of around 60-70W, and it could easily run many triple-A games of that time at high settings, which was way beyond what iGPUs could achieve, at least then. I think there is no reason Nvidia/AMD can't make a sub-75W card today that is way better than an iGPU. It's simply that both companies completely neglected the midrange/entry-level markets. I mean, after 4+ years there is still no worthy successor to the midrange RX 580 / GTX 1060.
1080p GPU with 3 fans... When will I have a chance to replace my R9 380 ITX Compact?
Originally posted by user1 View Post:
But didn't both companies neglect that market well before the chip shortage? I mean, it started with the release of Nvidia Turing.
At the same time, AMD's GPU group has been in shambles for the last few years. They've just been trying to hang on long enough to ship new architectures that are halfway competitive with Nvidia. They've finally done so, but they now have very limited ability to supply the market with silicon, since their limited wafer supply is much better spent on Epyc CPUs and other higher-margin parts.
Nvidia, I think, has been happy to be seen as a higher-end brand that charges a premium, and as long as people have been willing to keep paying the prices they've been asking, they've decided they don't really care about the low end anymore. There's always the GT 1030 for those people, but Nvidia isn't going to care much about that market.
I think hope for you may be coming next year in the form of Intel. They are supposedly going to flood the market with cheap GPUs in order to gain market share. It remains to be seen how AMD or Nvidia will respond, if at all.
Last edited by smitty3268; 06 August 2021, 06:04 PM.
Originally posted by DanL View Post:
You all missed the point of my post, so I'll be more clear. Where is the Radeon RX 6400 with 75W TBP? Or even better, an RX 6200 with 35-50W TBP?
Originally posted by CaptainLugnuts View Post:
AMD is never going to make a card like that ever again because they're pointless. If all you need is a low-end GPU, their integrated GPUs have that covered.
Originally posted by user1 View Post:
Actually, that's not correct. Low-TDP (75W and under) cards can actually be way more powerful than iGPUs. For example, a long time ago I had the 2012 Radeon HD 7750, which has a TDP of around 60-70W, and it could easily run many triple-A games of that time at high settings, which was way beyond what iGPUs could achieve, at least then. I think there is no reason Nvidia/AMD can't make a sub-75W card today that is way better than an iGPU. It's simply that both companies completely neglected the midrange/entry-level markets.
There has been a lot of discussion about the lowest tier cards in other threads so I won't repeat it all but:
#1 - yes there is still a slot between APU and "great 1080p dGPU"
#2 - problem is that the market for those cards seems to be fairly small these days, essentially "entry level DIY gaming PC" since even the OEMs don't seem to be doing much in that area.
The product would basically be "laptop dGPU on a PCIE card" but that seems to be a tough sell to the board partners, at least while there are shortages in many of the components they need.
Originally posted by user1 View Post:
I mean, after 4+ years there is still no worthy successor to the midrange RX 580 / GTX 1060.
One thing that muddies the water a bit is that games have become a lot more demanding over the last 5 years, so "a great 1080p card" today has maybe twice the performance of the 1080p cards back in 2016, more like a GTX 1080 Ti than a GTX 1060.
Originally posted by Luzipher View Post:
Uhm... you do realize that there are quite a few processors without integrated GPUs, right? Mostly from AMD (integrated graphics are even scarce there), but even Intel has some of those again. I'd really like an extremely low-powered GPU for my home server (currently running on the unaccelerated AST2500 VGA adapter included in the BMC), just to use a graphical desktop and browser.
What kind of use cases do you see for graphics acceleration on the server? My impression was that the GPUs integrated into BMCs were doing a decent job of covering user needs.
Last edited by bridgman; 07 August 2021, 03:35 PM.