AMD Announces Navi 14 Based Radeon RX 5500 Series
Originally posted by tuxd3v:
Maybe to be more competitive with the NVIDIA GTX 1600 series, with low power consumption?
Anyway, I think the RX 500 is a very nice piece of hardware with very good OpenCL 2.0 support. The problem is that it is not a universal card, because it needs PCIe 3.0 atomic operations supported in both the CPU and the motherboard to work. This requirement (PCIe 3.0 atomics) is not good. Maybe the RX 5500 series doesn't have that limitation?
After all, NVIDIA cards are universal: you can throw a recent NVIDIA card into a PCIe 1.1/2.0/3.0 slot and it will work there. That is something very valuable when people buy hardware; NVIDIA got it right.
Sometimes you have old machines around, and if they work well, why should you need to change them? (Sometimes they are even part of a bigger installation, and you don't want to mess around too much there, or you end up needing a whole new set of projects, with downtime and losses guaranteed.) Right, you only change the hardware that breaks (and here is where NVIDIA got it right).
PCIe atomics are AFAIK available on AMD since Ryzen (2017). That surely covers fewer systems, but let's face it: AMD was really lagging behind in that period. Bulldozer never performed especially well. I don't think it's worthwhile designing around such old platform requirements when you're building hardware right now; building chips for today's standards is just more sensible.
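For anyone wanting to check this on their own box: on Linux, lspci reports a device's PCIe AtomicOps capability bits in the DevCap2 line. A rough sketch — the bus address and the sample line below are illustrative examples, not output from any specific card:

```shell
# Hedged sketch: detect PCIe AtomicOps capability bits in lspci output.
# On a real system you would run (as root, with your GPU's bus address
# taken from `lspci | grep -i vga`):
#
#   sudo lspci -vvv -s 03:00.0 | grep -i AtomicOpsCap
#
# A capable endpoint prints a DevCap2 line such as this sample:
sample='DevCap2: Completion Timeout: Not Supported, AtomicOpsCap: 32bit+ 64bit+ 128bitCAS+'

case "$sample" in
    *'64bit+'*) echo "64-bit PCIe AtomicOps advertised" ;;
    *)          echo "64-bit PCIe AtomicOps not advertised" ;;
esac
```

Note that ROCm's requirement is about the whole path (root port and endpoint), so the root port's DevCap2 matters as much as the GPU's.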
bridgman: Is Navi 14 (aka Radeon RX 5500) capable of 8k for desktop use? You pointed out that Navi is optimized for 4k and 8k, but nothing is given except Full HD, as if it were a Polaris card from nearly four years ago. After the product announcement, I hope such basic info can be given?
What about the mobile version: is it a Navi APU for the desktop too, and if so, is it fully capable of 4k and 8k, or is it a chipset-limited version like Raven Ridge and its (too many) descendants?
It would be great to buy an AMD system this year, but without 8k my Haswell with iGPU would do the same job... so I really hope these are cards/APUs still usable in 2020+.
And by the way, any hope of getting Navi running 8k on Linux this year?
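Whether an 8k mode actually works ends up visible as a mode line on the DRM connector, which you can inspect without any extra tools. A sketch — the sysfs root is parameterised only so the logic can be exercised against a fake tree; on a real box it is /sys/class/drm:

```shell
# Hedged sketch: list the display modes each DRM connector exposes in sysfs.
# Connector names (card0-HDMI-A-1, card0-DP-1, ...) vary per machine.
list_modes() {
    root="${1:-/sys/class/drm}"
    for conn in "$root"/card*-*; do
        [ -f "$conn/modes" ] || continue
        echo "== ${conn##*/} =="
        cat "$conn/modes"   # lines like 3840x2160; a 7680x4320 line means 8k is offered
    done
}

list_modes
```

An empty or low-resolution mode list can mean either a driver limit or simply that the attached display/cable doesn't advertise the mode in its EDID.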
Originally posted by Hibbelharry: PCIe atomics are AFAIK available on AMD since Ryzen (2017). That surely covers fewer systems, but let's face it: AMD was really lagging behind in that period. Bulldozer never performed especially well. I don't think it's worthwhile designing around such old platform requirements when you're building hardware right now; building chips for today's standards is just more sensible.
But when I buy hardware, I buy it with longevity in mind.
Besides, there is still a lot of production hardware out there that is much older, and you need to replace spare parts from time to time. How will you buy new products for those machines if the products don't support those technologies?
Because if you only build for today or tomorrow, you don't support yesterday's products (and that is a very big problem): you only have one small market, and you are not playing for the bigger slice of the pie.
NVIDIA and others will capitalise on that, selling hardware that is supported today and also supports past technologies, so they have all the markets, without competition.
I think you will sell a couple of products, but you will not get much further in your business. Having no longevity plan for hardware will turn you into a very low-tier company.
Originally posted by Neuro-Chef: Well, it most likely won't handle AV1; at least the RX 5700 (XT) does not.
And with only RX 570-like performance it should be a little bit cheaper.
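On a running Linux system you can check which codecs the GPU's video block actually advertises via VA-API, where AV1 decode shows up as a profile (libva names it VAProfileAV1Profile0). A sketch assuming `vainfo` from libva-utils is installed:

```shell
# Hedged sketch: ask VA-API whether an AV1 profile is advertised.
# Assumes `vainfo` (libva-utils) plus a working VA driver for the card;
# the expectation from this thread is that Navi 10/14 would show none.
has_av1() {
    # grep the reported profile list for libva's AV1 profile names
    vainfo 2>/dev/null | grep -q 'VAProfileAV1'
}

if ! command -v vainfo >/dev/null 2>&1; then
    echo "vainfo not installed"
elif has_av1; then
    echo "AV1 profile advertised"
else
    echo "no AV1 profile advertised"
fi
```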
Originally posted by tuxd3v: Because if you only build for today or tomorrow, you don't support yesterday's products (and that is a very big problem): you only have one small market, and you are not playing for the bigger slice of the pie.
NVIDIA and others will capitalise on that, selling hardware that is supported today and also supports past technologies, so they have all the markets, without competition.
I think you will sell a couple of products, but you will not get much further in your business. Having no longevity plan for hardware will turn you into a very low-tier company.
You just can't have it all.
Originally posted by wizard69: Well, that would suck, because AV1 was on my mind here. Hopefully details can be scraped up soon, but there is little sense in buying a GPU that can't do AV1 decode in hardware.
The only ASIC I know up to now is this one: