Radeon R600 Gallium3D NIR Backend Continues Advancing
-
Out of the two HD 6950 cards that served quite well back in the good old bitcoin mining days, one is still in operation to this day (no longer mining with it); the other one is sitting on the shelf.
-
Originally posted by gerddie
R600_DEBUG=nir. Defaulting to the nir backend is really not an option at this time. Some features are missing, some things are buggy, and currently I can only test on an HD 5450 card. Dave Airlie contributed some patches to support some specifics of Cayman, but it will not be enabled by the above flag.
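For anyone wanting to try it, Mesa debug flags like this are set as environment variables per process. A minimal sketch of how that usually looks (glxgears and glxinfo are just stand-in test programs, not anything gerddie mentioned):

R600_DEBUG=nir glxgears                  # run a single program on the NIR backend
R600_DEBUG=nir glxinfo | grep renderer   # sanity-check which renderer is active

The flag only affects processes that inherit the variable, so there is nothing persistent to undo afterwards.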
-
Originally posted by ermo
I still have a running HD 5770 card with a slightly modified air-cooling solution (the original fan became noisy) that I use with Linux.
One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.
Does anyone know why the r600g OpenGL implementation is so slow (and apparently buggy as well) compared to other OSes?
I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
What I do find interesting is that the HD 5770 (and probably the 6770 as well) performs vastly better on OS X Sierra. During my testing with an R7 250 passed through to the VM, it was buggy as hell on High Sierra and significantly better on Catalina, but still far from the state of things on Linux. And even though the R7 250 is GCN 1.0, I did not see any notable improvements over the HD 6770 on Linux, aside from one thing LLVM does on GCN (shader loop unrolling); the 6770 suffers quite terribly without it (99% GPU usage and a slideshow).
Unfortunately, I don't have any of those GPUs to test now, but it's interesting to see that Juniper XT works much better on OS X in your experience.
-
I have an HD 3650 AGP collecting dust somewhere, but I also recently got a PC for free that has (I believe) an A4-3400 in it. I'm planning to run Linux on it, maybe even test out the NIR stuff.
-
Still running a Radeon HD 6650M paired with the HD 6480G graphics of the A4-3300M APU. This Asus laptop is 8+ years old and still running very well (bar the weak CPU).
-
Still crawlin' along with a passively-cooled, modded HD 4650 AGP in an old P4 DAW, which is about a month away from retirement. Browsing overly bloated websites is a nightmare, but it works great as a guitar amp sim with guitarix.
A few weeks ago I installed Ubuntu 20.04 for an elderly person on an E-450 laptop with a 6470M GPU. It seems to be working okay with VA-API-accelerated Firefox on Wayland.
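For anyone replicating that setup, a rough sketch of the usual steps for Firefox of that era (the pref name is from memory and worth verifying against your version):

MOZ_ENABLE_WAYLAND=1 firefox   # start Firefox as a native Wayland client

Then flip media.ffmpeg.vaapi.enabled to true in about:config, and run vainfo to confirm the driver actually exposes the codecs you care about.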
-
Originally posted by bridgman
There was a back-end optimizer written for the VLIW parts, but I don't remember if it was enabled by default; it did improve shader performance quite a bit IIRC.
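For reference, that optimizer is the "sb" backend in r600g mentioned later in the thread, and (assuming a Mesa build where it is on by default for the VLIW parts; check your version) it can be toggled through the same debug variable as the NIR backend:

R600_DEBUG=nosb glxgears   # disable the sb optimizer for an A/B comparison
R600_DEBUG=sb glxgears     # force it on if a build has it off

Comparing frame rates between the two runs gives a rough sense of how much the optimizer buys on a given shader workload.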
-
Originally posted by staggerlee
Does the NIR backend need to be switched on, or will it be switched on by default? If it needs switching on, how?
-
Originally posted by ms178
Unfortunately, that generation of cards is not that well supported on Linux, probably because AMD wasn't putting much effort into performance back then. On Linux there were constant issues with the SB shader optimizer, OpenCL support is still at the 1.1 level to this day with Clover, and we are missing out on all the AMDGPU, RadeonSI, and Vulkan efforts targeting GCN and newer.
IIRC back then the main priority from our users was improving functionality, primarily higher GL levels in order to support new games.
Originally posted by ms178
AMD put much more effort into GCN on Linux, as that was the time they wanted to enter the HPC GPU market and needed to invest more in their Linux software stack.
** although we had been building up the open source team incrementally since 2007, and are still expanding it even today.