Anyone else waiting on, I'm gonna call it, a mid-range 6-8GB 6400XT in the $175-$250 range? I like that they're taking on Nvidia's best, but all I really need is something equivalent to my 580 with better power usage and thermals and the 5300XT just isn't it.
Linux Support Expectations For The AMD Radeon RX 6000 Series
-
Originally posted by usta View Post
I am really pretty far away from AMD (on the GPU side); could someone tell me some good reasons to give it a try?
For example, does it have AV1 hardware encode and decode? Does it have H.265/H.264 hardware encode/decode? Does it have VP9 hardware encode/decode?
I am asking because, more than gaming, I am interested in streaming, and I need hardware acceleration for encode/decode.
(And I know it is a bit hard in here not to be trolling, but pretty please, just this once <troll off>)
Honestly, if you are mainly concerned with video, I'd say just stick with NVidia. AMD ticks the same boxes on paper now (though note AV1 is decode-only on the RX 6000 series, with no AV1 encode), but in practice I'd argue video isn't really the strong point of their hardware, and NVidia has focused more on the software side of it so far.
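For reference, this is roughly what a hardware H.264 encode through VA-API looks like with ffmpeg on an AMD card. This is a hedged sketch: the render node path, the input file, and the bitrate are placeholders, and it assumes an ffmpeg built with VA-API support. The command is printed rather than executed so the snippet is safe to run anywhere:

```shell
# Sketch of a VA-API hardware H.264 encode with ffmpeg on amdgpu/Mesa.
# input.mp4, the 6M bitrate, and /dev/dri/renderD128 are placeholder values;
# the command is echoed instead of run so this works without a GPU present.
cmd="ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 -vf format=nv12,hwupload -c:v h264_vaapi -b:v 6M output.mp4"
echo "$cmd"
```

To see which codecs your driver actually exposes for encode and decode, `vainfo` (from libva-utils) lists the advertised VA-API profiles and entrypoints.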
- Likes 1
-
Originally posted by smitty3268 View Post
Summary:
AMD has cards with approximately the same performance as the 3090, 3080, and 3070 Ti available for a bit cheaper, a bit less power usage, and a bit more memory. Ray tracing is slower, and a DLSS-like option (Super Resolution?) is "coming in a driver update"....
- Likes 1
-
Originally posted by ssokolow View Post
I'm still on a GeForce GTX750 and I always spec my CPU to 65W TDP or less because I have no air conditioning. This is the main detail I care about aside from whether the drivers will match nVidia's for stability when a desktop session is left open for months on end.
The only issue is, there's no decent 50-75W GPU on the market that is worth buying (I feel that for 2020, roughly 40 GPixel/s and 80+ GTexel/s with about 3-4+ TFLOPS at around 50W TDP should be the standard for low-end GPUs).
For about $80 (or at least under $100), such GPUs would be selling like crazy IMO, for all sorts of purposes and low-end "gaming". I'm not sure if it's feasible from a technological/financial perspective, but if it isn't in 2020, that would suggest technology didn't really advance that much in the last decade or so, because those are similar specs to the high-end GPUs from almost a decade ago.
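On the power-draw point: with the amdgpu driver you can read the board power the card reports through the standard hwmon sysfs interface, which is handy for checking real consumption against the advertised TDP. A hedged sketch (the exact hwmon path varies per system, so this probes for it and falls back gracefully on machines without an AMD GPU):

```shell
# Read the amdgpu-reported GPU power draw via the hwmon sysfs interface.
# power1_average is exposed in microwatts; the hwmon index varies, so we
# glob for it and fall back cleanly when no AMD GPU sensor is present.
f=$(ls /sys/class/drm/card*/device/hwmon/hwmon*/power1_average 2>/dev/null | head -1)
if [ -n "$f" ]; then
    uw=$(cat "$f")
    echo "GPU power draw: $((uw / 1000000)) W"
else
    echo "no amdgpu hwmon power sensor found"
fi
```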
- Likes 1
-
Originally posted by smitty3268 View Post
Summary:
AMD has cards with approximately the same performance as the 3090, 3080, and 3070 Ti available for a bit cheaper, a bit less power usage, and a bit more memory. Ray tracing is slower, and a DLSS-like option (Super Resolution?) is "coming in a driver update"....
Seems competitive, at least. Hopefully they can keep the cards in stock, unlike NVidia so far with their launch, and hopefully it actually works on Linux right from the start this time.
But yeah, AMD is spreading like COVID in the US. (Before anyone gets offended: US numbers are far from what the media makes them look like. The fact of the matter is, it's bad everywhere, and the US isn't even close to being "the worst", as the media wants to present it.)
-
Originally posted by pal666 View Post
"nvidiot"
I've even expressed interest in using AMD, just that it's not feasible commercially until the software vendors for the industry support it. Here's a popular open-source project with AMD feature request:
I just reported previously the impossibility to render with Meshroom, probably because, despite having an NVidia GPU, Nvidia does not provide any CUDA package for openSUSE 15.1. I use Blender, GIMP .....
Originally posted by pal666 View Postsubj are gaming videocards.
You're thinking of workstation Quadros, I guess? The extra VRAM is nice, but the other benefits aren't as important; mostly you just need solid CUDA performance with enough VRAM for the given dataset.
Originally posted by pal666 View Postcuda is vendor lock-in, i.e. you have to be not very smart to depend on it for starters.
Originally posted by pal666 View Postamd has converter from cuda to their chips
The Meshroom project had trouble here, and has since made progress toward HIP support, but the manual porting work is apparently what has been holding it back for over 9 months:
I realize this is a lot to just ask without the time and ability to implement and maintain the necessary changes myself. Hence, I’ll spare you the long-winded political speech. The gist of it is: Y...
If it's so easy, though, why not contribute to the project to promote AMD some more and help others avoid being stuck with CUDA, at least in the open-source photogrammetry world?
Or are you just going to deflect with some BS and name-calling again?
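For what it's worth, the "converter" being referenced is AMD's HIPIFY tooling (hipify-perl / hipify-clang), which mechanically rewrites CUDA runtime calls into their HIP equivalents. A toy approximation of the textual part of that translation, using sed on a made-up snippet (the real tool also handles headers, kernel launch syntax, and library calls, which is where the manual work comes in):

```shell
# Toy approximation of hipify-perl's textual pass: rename CUDA runtime
# calls to their HIP equivalents. demo.cu is a made-up fragment; real
# HIPIFY does far more than a prefix substitution.
cat > demo.cu <<'EOF'
cudaMalloc(&d_x, n * sizeof(float));
cudaMemcpy(d_x, x, n * sizeof(float), cudaMemcpyHostToDevice);
cudaFree(d_x);
EOF
sed 's/cuda/hip/g' demo.cu
# prints hipMalloc / hipMemcpy (with hipMemcpyHostToDevice) / hipFree lines
```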
-
Originally posted by leipero View Post
But yeah, AMD is spreading like COVID in the US . (Before anyone gets offended, US numbers are far away from what media makes it look like, the matter of fact is, it's bad everywhere, and US for sure isn't even close to be "the worst" as media wants to present it).
- Likes 4
-
From: https://www.anandtech.com/show/16202...ovember-18th/2
"As things currently stand, Super Resolution is still under development, so it won't be available to game developers (or gamers) at the time of the RX 6000 series launch. But once it is, like AMD's other FidelityFX graphics libraries, it will be released as an open source project on GPUOpen, and AMD is explicitly noting that it's being designed as a cross-platform solution."
DLSS -> Super Resolution: open source and cross-platform.
That sounds good to me.
- Likes 5