Radeon R600 Gallium3D NIR Backend Continues Advancing


  • lowflyer
    replied
    Out of two HD 6950 cards that did serve quite well during the good old bitcoin mining days, one is still in operation (no longer mining with it) to this day. The other one is sitting on the shelf.

  • staggerlee
    replied
    Originally posted by gerddie View Post

    R600_DEBUG=nir. Defaulting to the NIR backend is really not an option at this time: some features are missing, some things are buggy, and currently I can only test on an HD 5450 card. Dave Airlie contributed some patches to support some specifics of Cayman, but it will not be enabled by the above flag.
    Thank you...

  • leipero
    replied
    Originally posted by ermo View Post
    I still have a running HD5770 card with a slightly modified aircooling solution (the original fan became noisy) that I use w/Linux.

    One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

    Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

    I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
    That's interesting. I didn't have any issues with an HD 6770 (basically the same GPU, Juniper XT). I also changed the cooling solution (to make it almost completely silent) and modded the BIOS to add GOP support and undervolt the GPU to 1.13 V, I think, and it was working fine on Linux. Unfortunately that GPU "died": it would no longer boot into Linux reliably, while it would boot Windows without any issue. Maybe after a year of being undervolted it finally reached the point where I should have bumped the voltage up a bit, since it was on the edge to begin with (1.10 V would render it unstable).
    What I do find interesting is that the HD 5770 (and probably the 6770 as well) performs vastly better on OS X Sierra. During my testing with an R7 250 passed through to a VM, it was buggy as hell on High Sierra and significantly better on Catalina, but still far from the state of things on Linux. And even though the R7 250 is GCN 1.0, I did not see any notable improvements over the HD 6770 on Linux, aside from one thing that LLVM handles on GCN (shader loop unrolling), which the 6770 coped with quite terribly (99% GPU usage and a slideshow).

    Unfortunately, I don't have any of those GPUs to test now, but it's interesting to see that Juniper XT works much better on OS X in your experience.

  • moriel5
    replied
    My HD6950 is waiting for that fresh breath of Vulkan air.

  • StandaSK
    replied
    I have an HD 3650 AGP collecting dust somewhere, but I also recently got a PC for free that has (I believe) an A4-3400 in it. I'm planning to run Linux on it, maybe even test out the NIR stuff.

  • Mez'
    replied
    Still running a Radeon HD 6650M coupled with the 6480G integrated graphics of the A4-3300M APU. This Asus laptop is 8+ years old and still running very well (bar the weak CPU).

  • [TV]
    replied
    Still crawlin' along with a passively modded HD 4650 AGP on an old P4 DAW, which is about a month away from retirement. Browsing overly bloated websites is a nightmare, but it works great as a guitar amp sim with guitarix.

    A few weeks ago I installed Ubuntu 20.04 for an elderly person on an E-450 laptop with a 6470M GPU. It seems to be working okay with Firefox's VA-API support on Wayland.

  • gerddie
    replied
    Originally posted by bridgman View Post
    There was a back-end optimizer written for the VLIW parts but I don't remember if it was enabled by default - it did improve shader performance quite a bit IIRC.
    The optimizer *sb* is enabled by default for supported shaders, but it is disabled for tessellation, compute, and any shader that uses image I/O or atomics, and its code is in a state that makes it very difficult to fix things, let alone add the missing pieces.

  • gerddie
    replied
    Originally posted by staggerlee View Post
    Does the NIR backend need to be switched on, or will it be enabled by default? If it needs to be switched on, how?
    R600_DEBUG=nir. Defaulting to the NIR backend is really not an option at this time: some features are missing, some things are buggy, and currently I can only test on an HD 5450 card. Dave Airlie contributed some patches to support some specifics of Cayman, but it will not be enabled by the above flag.
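
    To be clear, R600_DEBUG=nir is just an environment variable, so the backend can be enabled per process rather than system-wide. As a minimal sketch (the launcher below and the glxgears target are only illustrative, not part of Mesa; it is simply the standard setenv/exec pattern, equivalent to running R600_DEBUG=nir glxgears from a shell):

    /* Hypothetical launcher: select the r600g NIR backend for one child
     * process via the R600_DEBUG environment variable, then exec it. */
    #include <stdlib.h>
    #include <unistd.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        const char *target = (argc > 1) ? argv[1] : "glxgears";

        if (setenv("R600_DEBUG", "nir", 1) != 0) {
            perror("setenv");
            return 1;
        }

        execlp(target, target, (char *)NULL);
        perror("execlp"); /* only reached if exec fails */
        return 1;
    }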

  • bridgman
    replied
    Originally posted by ms178 View Post
    Unfortunately that generation of cards is not that well supported on Linux, probably because AMD wasn't putting much effort into performance back then. On Linux there were constant issues with the SB shader optimizer, OpenCL support is still at the 1.1 level to this date with Clover, and we are missing out on all the AMDGPU, RadeonSI and Vulkan efforts targeting GCN and newer.
    Ahh, SB - that's the name I was trying to remember. Thanks!

    IIRC back then the main priority from our users was improving functionality, primarily higher GL levels in order to support new games.

    Originally posted by ms178 View Post
    AMD put much more effort with GCN on Linux as it was the time they wanted to enter the HPC GPU market and needed to invest more in their Linux software stack.
    It wasn't "much more effort"** as much as getting agreement to largely combine the open source and closed source driver efforts into a single all-open stack with a couple of optional closed-source components, giving us a big increase in the number of people working on the upstream driver code.

    ** although we had been building up the open source team incrementally since 2007 and are still expanding it even today
    Last edited by bridgman; 21 July 2020, 04:57 AM.
