AMD Details The MI300X & MI300A, Announces ROCm 6.0 Software
Originally posted by Panix View Post
Missteps? AMD is a major disaster in GPU compute; HIP-RT is still 'experimental' in Blender.
I'm not denying that AMD's GPU Compute situation is bad. The only point of contention is exactly what they should've done, instead. From the outside, I think it's impossible to know the exact dimensions of the solution space they were working within. That's not to say there was nothing they could've done better, but I just don't think we're well positioned to say exactly what they realistically could've done better or differently.
I'm not making excuses or saying you shouldn't complain. It's just the "Monday morning quarterbacking" that bothers me.
Originally posted by coder View Post
Blender is an interesting case, because its developers are reportedly Nvidia fanboys who chronically neglected the OpenCL backend. HIP was basically the best AMD could do in a situation like that, short of straight-up cloning CUDA, which they probably worried would make them vulnerable to copyright infringement.
Hello, did AMD remove OpenCL for Hawaii GPUs? Wikipedia says they support OpenCL 2.0+, and I can remember using it at some point. Now I wanted to use OpenCL for Adobe Premiere Pro again and was very disappointed. NO OPENCL, WTF AMD? How do I get those OpenCL drivers? The AMD APP SDK? Gone. Newest drive...
I understand it's been problematic or 'broken' for years, and the investment/priority has been extremely lacking. The lack of focus on addressing the problem has been demonstrated in fields like Blender and other areas that require compute (and the OpenCL backend). They switched to HIP, and that looks like a similar situation: perhaps a little bit of progress, but really slow, with mediocre performance when using it and anything related, i.e. the ray-tracing library.
Originally posted by Panix View Post
I totally understand they couldn't clone or copy CUDA.
AMD claims you can write HIP code and run it on Nvidia hardware, though I'm sure that route comes with more bugs, lower performance, and fewer features than native CUDA code. I'm guessing HIP's Nvidia support exists only as a way to claim you're not locked in with HIP the way you are with CUDA. In practice, I don't really foresee anyone switching CUDA codebases over to HIP.
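For context, the single-source portability claim is easy to illustrate. Below is an untested, purely illustrative sketch of a HIP vector add: swap the hip* calls for their cuda* counterparts and you essentially have the CUDA version. In principle, the same file builds with hipcc against ROCm on AMD, or against the CUDA toolkit on an Nvidia box (via the CUDA backend), which is exactly the lock-in escape hatch being described:

```cpp
// Illustrative HIP vector add (untested sketch). The API mirrors CUDA's
// almost 1:1 (hipMalloc <-> cudaMalloc, hipMemcpy <-> cudaMemcpy, etc.).
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    // Allocate device buffers and copy inputs over.
    float *da, *db, *dc;
    hipMalloc(&da, n * sizeof(float));
    hipMalloc(&db, n * sizeof(float));
    hipMalloc(&dc, n * sizeof(float));
    hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Launch with HIP's portable launch macro (CUDA's <<<...>>> also works).
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    hipLaunchKernelGGL(vec_add, dim3(blocks), dim3(threads), 0, 0,
                       da, db, dc, n);

    hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("hc[0] = %f\n", hc[0]);  // 1.0f + 2.0f, so 3 on a working stack

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```

That near-mechanical correspondence is also why AMD ships a "hipify" translation tool for existing CUDA sources, though as noted above, how well the Nvidia path actually works in practice is another question.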
Originally posted by coder View Post
Let's be clear, though: HIP is basically a CUDA clone. They just changed all of the names to avoid getting slapped with copyright infringement. However, that, plus being slightly out of sync with CUDA, is enough to force a separate Blender backend for HIP. And that translates into more maintenance burden, which I'm guessing falls squarely on AMD, thus denying them some key benefits you'd want from making a CUDA clone.
My point before is that HIP is still a WIP, and they still haven't fixed RT (ray-tracing) acceleration in Blender, which would be the only reason to even consider an AMD card for that. Nvidia is evil, but most people still use it for these tasks: GPU compute, Blender, ML/AI, etc. The only reasons people are buying AMD cards right now are that the price is a bit lower and, in the Linux world, gaming, since there are (supposedly) fewer issues to deal with. But it's my impression that it's mostly people who don't do much with their PC, or haven't done much in Linux other than gaming. I guess the community here has a lot of those?
Because when you try to do more than just gaming in Linux, you encounter stress and problems. Getting ROCm configured with HIP is often a hassle, and it's my understanding that OpenGL can be implemented with open-source components but OpenCL cannot, which is where AMD users get into trouble: trying to 'add' proprietary components into the FOSS world of their AMD software/files. But correct me if I'm mistaken there? I don't think so.
There are a lot of examples of people who wanted to use AMD hardware but had to switch to evil Nvidia hardware, even if it meant spending a lot more, either on a specific GPU that might have less VRAM, or spending a lot more because they need the VRAM and only Nvidia's expensive cards offer more. They did this because the AMD ROCm/HIP components they have to use are a nightmare to install and configure, since a good portion is not included in the free/open-source ecosystem. Linux gamers who don't use these programs have no idea, or don't care. Heck, AMD doesn't care, either.
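For what it's worth, a fair chunk of that install-and-configure nightmare comes down to a few checkable preconditions. A minimal sketch of the sanity checks worth running first — the tool names (rocminfo, clinfo, hipconfig) are real ROCm utilities, while the paths and group names are the common Linux defaults, so adjust per distro:

```shell
# Quick sanity checks before fighting with ROCm/HIP (a sketch; paths and
# group names are the usual Linux defaults, adjust for your distro).
for tool in rocminfo clinfo hipconfig; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found: $tool"
    else
        echo "missing: $tool (is ROCm installed and /opt/rocm/bin on your PATH?)"
    fi
done

# GPU compute access usually requires membership in the render/video groups:
if id -nG | tr ' ' '\n' | grep -qE '^(render|video)$'; then
    echo "group membership: ok"
else
    echo "consider: sudo usermod -aG render,video \$USER (then re-login)"
fi
```

On officially unsupported consumer GPUs, people also commonly end up experimenting with the HSA_OVERRIDE_GFX_VERSION environment variable, which is exactly the kind of hardware-support fragmentation being complained about here.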
Panix coder There was a great interview with an AI researcher on Moore's Law is Dead today, touching on this very subject with regard to AMD/Nvidia/Intel and their compute/AI offerings: .
tl;dr: Some well-known problems with ROCm need to be fixed in order to gain more traction (e.g. hardware/capability fragmentation across Linux/Windows, plus the usability problems that Panix also mentioned). Also, Ryzen AI is Windows-only for now and not available on Linux (AMD needs more complete support for their features to make them appealing for developers to use). The interview also goes into more detail about the other companies.
Originally posted by ms178 View Post
Also, I'd really prefer getting an AMD GPU for Linux, but for the software I wish to use, it's apparently not a good match: video editing and 3D modeling/GPU compute, with AI/ML being something I want to look at, though I'll concentrate on the first two the most. Gaming is something I dabble in, and I already know it's sufficient with AMD. Or so I think.
But examining all this, I find it difficult to seriously pick an AMD GPU over an Nvidia one. The video should be helpful and informative.
Originally posted by Panix View Post
Thanks, looks really interesting, ms178. Pretty long video, though. :-) I just want to mention, briefly, that I think AI is ultimately a bad thing. It's already big and will be huge, but I think we will look back at it and say, 'damn...' :-(
Originally posted by ms178 View Post
Yeah, you can run it as a podcast in the background, though. I found the unique perspective quite informative.
It is perfectly valid to look at what works best for your specific workloads today; there might be no alternative to Nvidia in your case if you need to purchase within this generation. It's a terrible market situation to be in, as Nvidia milks all these people, but they are the undisputed leader in specific areas that might be important to you. (I'm not following these workloads too closely myself, though.) AMD still has a lot of catching up to do on the software side, which unfortunately needs much more time. They also need buy-in from the software vendors, which might be harder if their target audience is running on Nvidia anyway. Intel might be a good contender long-term, but their first GPU generation is still a beta product and also needs much more maturing.
Yes, it's a terrible market situation in general, but especially regarding Nvidia hardware these days. I did buy one card, but I sold it and then bought another used GPU, which I sold as well. The latest was a 3080 10 GB; I want more VRAM. At the moment I'm borrowing an older card, a 1660 Ti (only 6 GB), until I save enough for a higher-tier card. So I plan on seeing what experience I get with Fedora, Ubuntu, Debian, openSUSE, etc., and with Wayland. That should be fun?!?
I'm not in a rush, but I am worried about GPU availability/prices in the near term, with AI taking off and all sorts of market surprises (see China). The last time I 'waited', crypto craziness happened. I think Intel is still a ways off from really becoming a viable(?) alternative to the other big two. At least, that's my impression so far.
For now, I'm keeping an eye on how the 7900 series shapes up as a potential option in these specific areas. Or should I just hunt for a used 3090 or something?