AMD Radeon RX 6400 On Linux


  • gukin
    replied
    If you're like me, and I know you are, you're going to buy this card for your small form-factor computer that's using integrated graphics (in my case a 5700g).

    I've found that there is very little performance loss from using the HDMI output of my 5700g and setting an environment variable to route rendering through the 6400. Here are the pertinent variables:

    DRI_PRIME for OpenGL
    MESA_VK_DEVICE_SELECT for Vulkan
    DXVK_FILTER_DEVICE_NAME for DXVK

    To determine what to use for DRI_PRIME, run "xrandr --listproviders" and you should get something like:
    Providers: number : 2
    Provider 0: id: 0x55 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 2 associated providers: 1 name:Unknown AMD Radeon GPU @ pci:0000:0f:00.0
    Provider 1: id: 0xa3 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 2 outputs: 2 associated providers: 1 name:Unknown AMD Radeon GPU @ pci:0000:03:00.0

    You can then determine which card is which by running this: "lspci | grep VGA" which returns something like this:
    lspci | grep VGA
    03:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Navi 24 [Radeon RX 6400 / 6500 XT] (rev c7)
    0f:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Cezanne (rev c8)

    So provider 0 at pci:0000:0f maps to 0f:00.0, the Cezanne, and provider 1 at pci:0000:03 maps to the 6400.
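
    If you want to double-check the mapping, lspci can take the bus address from the xrandr output directly (the -s option selects a single slot):
    lspci -s 03:00.0
    03:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Navi 24 [Radeon RX 6400 / 6500 XT] (rev c7)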

    To run an OpenGL application on the 6400, do this: DRI_PRIME=1 <command>

    An example would be "DRI_PRIME=1 glxgears -info | grep RENDER" which yields this:
    GL_RENDERER = AMD BEIGE_GOBY (LLVM 11.0.1, DRM 3.42, 5.15.35-desktop-2.mga8)

    Setting DRI_PRIME to zero yields this (DRI_PRIME=0 glxgears -info | grep RENDER):
    GL_RENDERER = AMD RENOIR (LLVM 11.0.1, DRM 3.42, 5.15.35-desktop-2.mga8)
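
    If you have glxinfo installed, a quicker check (just a habit of mine, not required) is:
    DRI_PRIME=1 glxinfo -B | grep "OpenGL renderer"
    which should name BEIGE_GOBY, and RENOIR again with DRI_PRIME=0.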

    For Vulkan, you determine your arguments this way:
    MESA_VK_DEVICE_SELECT=list vkcube
    selectable devices:
    GPU 0: 1002:1638 "AMD RADV RENOIR" integrated GPU 0000:00:00.0
    GPU 1: 1002:743f "AMD RADV BEIGE_GOBY" discrete GPU 0000:00:00.0
    GPU 2: 10005:0 "llvmpipe (LLVM 11.0.1, 256 bits)" CPU 0000:00:00.0

    To pick a device, use the vendor:device ID after the "GPU X:". For example, to use the Cezanne GPU in GravityMark, do this:
    MESA_VK_DEVICE_SELECT=1002:1638 ./run_windowed_vk.sh
    You should see the BEIGE_GOBY mostly idle and the Cezanne under high usage. To run on the 6400, do this:
    MESA_VK_DEVICE_SELECT=1002:743f ./run_windowed_vk.sh
    and you should see the 6400 under high load and Cezanne under low load.
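
    One caveat: some Vulkan applications enumerate devices themselves and may ignore the default ordering. If I'm reading the Mesa docs right, appending "!" to the ID forces the selection, e.g.:
    MESA_VK_DEVICE_SELECT=1002:743f! vkcube
    so that the application only sees the 6400.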

    Lastly, to pick the device for DXVK, run vulkaninfo | grep "GPU id :" | sort | uniq and you should get something like:
    GPU id : 0 (AMD RADV RENOIR):
    GPU id : 1 (AMD RADV BEIGE_GOBY):
    GPU id : 2 (llvmpipe (LLVM 11.0.1, 256 bits)):

    You can verify what is happening in your DXVK-enhanced game by setting the DXVK_HUD environment variable to "full" ("export DXVK_HUD=full") on the command line before running the commands below.
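
    If "full" is too busy, DXVK_HUD also accepts a comma-separated list of items; devinfo and fps are enough to confirm which GPU is rendering:
    export DXVK_HUD=devinfo,fps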

    According to the DXVK documentation, you only need to specify something that differentiates the GPUs, so to use the 6400 to play Batman: Arkham Knight, do this:
    DXVK_FILTER_DEVICE_NAME=BEIGE_GOBY wine BatmanAK.exe

    To use Cezanne, do this:
    DXVK_FILTER_DEVICE_NAME=RENOIR wine BatmanAK.exe

    Finally, to set a GPU for Steam/Proton, just set DXVK_FILTER_DEVICE_NAME before launching Steam:
    DXVK_FILTER_DEVICE_NAME=BEIGE_GOBY steam
    and all launched games that use DXVK will use the 6400.
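
    If you don't want to remember three variables, a tiny wrapper script works too. This is just a sketch (the "prime-run" name and the IDs are from my setup, so adjust them to match your own lspci/vkcube output):

    #!/bin/sh
    # prime-run: run the given command on the RX 6400 (BEIGE_GOBY)
    export DRI_PRIME=1                         # OpenGL on the discrete card
    export MESA_VK_DEVICE_SELECT=1002:743f     # Vulkan: vendor:device ID of the 6400
    export DXVK_FILTER_DEVICE_NAME=BEIGE_GOBY  # DXVK under Wine/Proton
    exec "$@"

    Then "./prime-run glxgears -info", "./prime-run steam" and so on all land on the 6400.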


    Last edited by gukin; 02 May 2022, 06:01 PM.



  • Mez'
    replied
    Originally posted by coder View Post
    The RX 6400 is supposed to be primarily an OEM variant of the RX 6500XT.


    Maybe, but it doesn't make any sense to have a dGPU that's any less powerful than this. I think its main market is for non-gaming use in systems that lack an iGPU.

    It can also serve as a first step up for those currently using an iGPU, but I'd say it's only very compelling if they have an older iGPU.
    I think this would fit pretty well in a multimedia box (live sports streams - the official kind - video playback and light gaming), even for more than just a first step. Especially with the likely low power consumption and noise (single fan).
    Plus, my case is a flat HTPC case (the flat micro-ATX type with just 12 cm of clearance for the graphics card; not much choice at this height, believe me) and this would fit the bill.
    One more thing to throw into the mix: paying more than 200€ for a video card is a no-go for me and my life priorities (only occasional time to spend on video gaming).



  • qarium
    replied
    Originally posted by tildearrow View Post

    What? Ray tracing?!

    Who makes the decisions for these cards?
    What's more interesting? Hardware-accelerated AV1 decode or 10fps ray traced game?
    I would buy an RX 6400 if it had AV1 decode...

    I don't know, maybe the raytracing feature is for ultra-low-end raytracing games like the Doom 1.0 RT mod?
    But on the other hand, even ultra-low-end games like Quake 2 RTX eat a lot of performance...

    The lowest-end AMD card with AV1 decode is the 6600.
    I think Intel will sell many 6400/6500-like cards with only a 64-bit memory bus just because they have AV1 decode...



  • otoomet
    replied
    Originally posted by Tomin View Post
    I'm curious, how much less is this card affected by that narrow bus than the RX 6500 XT? Kind of wondering if having it in a PCIe 3.0 setup leaves less performance on the table with this card.
    In gaming? Hardware Unboxed has a review where they test both PCIe 4.0 and 3.0.
    Briefly: It depends on the game, from almost nothing to very much.

    They test it with a 5950X though; not sure if it matters that much on other systems.



  • Tomin
    replied
    I'm curious, how much less is this card affected by that narrow bus than the RX 6500 XT? Kind of wondering if having it in a PCIe 3.0 setup leaves less performance on the table with this card.



  • gukin
    replied
    Originally posted by coder View Post

    It can also serve as a first step up for those currently using an iGPU, but I'd say it's only very compelling if they have an older iGPU.
    It's three times faster than my 5700g, and I must say that playing Shadow of the Tomb Raider at 80 FPS is quite a bit more pleasant than 25 FPS on high settings at 1080p.



  • otoomet
    replied
    Originally posted by Mattia_98 View Post
    ... As it is, it's only good as a display out, but at that price you might as well get a GT 710 unless you need HDMI 2.1
    The GT 710 can (perhaps) cope with a single 1080p screen; it is only good enough if you use very few graphical resources (desktops/windows/tabs). Even a GT 1030 may run out of its 2 GB of memory with a single 4K screen. So you want a card with at least 4 GB of RAM for 4K screens, and the GT 710 is definitely not there.

    Being able to play a game once in a while is a bonus.



  • Mattia_98
    replied
    If only it had VP8, VP9 and AV1 decode, it would have been a perfect HTPC card...
    Or if it had four mini DisplayPort outputs, it would have been a perfect office/work card. With that limited VRAM and those PCIe lanes it's not even good for gaming. As it is, it's only good as a display out, but at that price you might as well get a GT 710 unless you need HDMI 2.1.



  • Mike Frett
    replied
    I wonder if it could be offered in a fanless design, like the GT 1030? GPUs with fans don't like me; they always die within a year.



  • coder
    replied
    Originally posted by r1348 View Post
    What a piece of crap.
    It's cheap, though. Name a card with a modern feature set that's cheaper (i.e. in terms of street price, new).

    If you're just a desktop user, it makes plenty of sense to me. At my job, developers use machines with either iGPU graphics or a GTX 1050 Ti (for those machines without an iGPU or that have a 4k monitor). Those are totally fine, and this is way faster than either.

