Open-Source Radeon Performance Is Very Good For DiRT Rally On Linux


  • kamelie1706
    replied
Well, it seems it was as simple as:
    Code:
    sudo cpupower frequency-set -g performance
    Now in 2560x1080 Ultra I get:
    min 25
    avg 34
    max 44

Close to Windows performance! It seems the default Linux governor is not scaling up properly when it needs to ...
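For anyone wanting to check what their machine is doing before and after, here is a minimal sketch of reading and setting the cpufreq governor directly through sysfs. The `SYSFS` variable is my addition so the path can be overridden; the real location is /sys/devices/system/cpu, and writing there needs root.

```shell
#!/bin/sh
# Sketch: inspect and switch the cpufreq governor via sysfs.
# SYSFS is overridable for testing; the real tree is /sys/devices/system/cpu.
SYSFS="${SYSFS:-/sys/devices/system/cpu}"

show_governors() {
    # prints one line per distinct governor currently in use
    cat "$SYSFS"/cpu*/cpufreq/scaling_governor | sort -u
}

set_governor() {
    # $1: governor name, e.g. performance or powersave (needs root)
    for f in "$SYSFS"/cpu*/cpufreq/scaling_governor; do
        echo "$1" > "$f"
    done
}
```

The `sudo cpupower frequency-set -g performance` command from the post does the same thing through the cpupower tool, which is the more convenient route when it is installed.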




  • kamelie1706
    replied
Bought it because of the Linux support.

But it's unplayable on my favourite Linux platform.
    Code:
       PROCESSOR:          AMD Phenom II X6 1055T @ 2.80GHz (6 Cores)
        Core Count:       6
        Extensions:       SSE 4a
        Cache Size:       512 KB
        Microcode:        0x10000dc
        Scaling Driver:   acpi-cpufreq schedutil
    
      GRAPHICS:           MSI AMD Radeon RX 460 2048MB
        OpenGL:           4.5 Mesa 17.0.2 Gallium 0.4 (LLVM 3.9.1)
        Display Driver:   modesetting 1.19.3
        Monitor:          2963
        Screen:           2560x1080
    
      MOTHERBOARD:        ASUS M4A88TD-V EVO/USB3
        Memory:           12288MB
        Chipset:          AMD RS880
        Network:          Realtek RTL8111/8168/8411
    
      DISK:               2 x 500GB Western Digital WD5000AAKS-0 + 64GB KINGSTON SNV425S + 63GB SanDisk SDSSDP06 + 2000GB Expansion Desk
        File-System:      ext4
        Mount Options:    data=ordered relatime rw
        Disk Scheduler:   CFQ
    
      OPERATING SYSTEM:   Arch Linux
        Kernel:           4.10.6-1-ARCH (x86_64)
        Desktop:          KDE Frameworks 5
        Compiler:         GCC 6.3.1 20170306
So sadly it's a no-brainer to play it on Win 10 as far as I am concerned ... for now!
    Last edited by kamelie1706; 03 April 2017, 02:11 AM.



  • marek
    replied
Here are my results. Note that I have a much slower CPU than Michael: a Core i5 3570. The GPU is a Radeon Fury X.

    I was running the benchmark at 1920x1080, Ultra settings.

    Mesa 13.0 + LLVM 3.9: 69.6 FPS
    Mesa 17.1-git + LLVM 3.9: 69.6 FPS
    Mesa 17.1-git + LLVM 4.0: 70.4 FPS
    Mesa 17.1-git + LLVM 5.0svn: 71 FPS



  • haagch
    replied
I downloaded LLVM 3.9.1 from http://releases.llvm.org/download.html#3.9.1 (careful: the Debian binaries use the old C++ ABI, the Ubuntu ones the new one) and built Mesa at the 13.0.4 commit with it.
    OpenGL renderer string: Gallium 0.4 on AMD POLARIS10 (DRM 3.11.0 / 4.10.0-g8fc7b85a6845, LLVM 3.9.1)
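For anyone trying to reproduce a build like this: the usual trick is to put the unpacked LLVM release first in PATH so Mesa's build system picks up that release's `llvm-config` instead of the system one. A tiny sketch (the install path in the usage line is an example of mine, not from the post):

```shell
#!/bin/sh
# Sketch: make a specific unpacked LLVM release the one a build finds first,
# by prepending its bin directory to PATH.
use_llvm() {
    # $1: root of an unpacked LLVM release tarball
    export PATH="$1/bin:$PATH"
}

# usage (example path): use_llvm "$HOME/llvm-3.9.1" && llvm-config --version
```

Checking `llvm-config --version` before configuring Mesa is a quick way to confirm the right release will actually be used.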

But I can't see much of a performance difference compared to my current setup with the latest LLVM SVN and Mesa Git, around 80 fps on medium (I only tried actual gameplay, not the benchmark).
    OpenGL renderer string: Gallium 0.4 on AMD POLARIS10 (DRM 3.11.0 / 4.10.0-g8fc7b85a6845, LLVM 5.0.0)
    kernel is drm-next-4.12-wip

Has anyone tested this on Arch Linux or similar and seen the "regression" reversed? I wanted to bisect, but if I can't reproduce the good performance, there's not much luck...

edit: Okay, I think with echo high > /sys/class/drm/card0/device/power_dpm_force_performance_level I do see some performance difference. It's not that big though, more like 80 -> 100 fps. Still, only in-game, not the benchmark. Maybe I should run that.
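The sysfs knob used here can be wrapped in a small helper; on amdgpu/radeon the accepted values include auto, low and high, and writing it needs root. The `DRM` variable is my addition so the path can be overridden; the real path is /sys/class/drm/card0/device.

```shell
#!/bin/sh
# Sketch: force the amdgpu/radeon DPM performance level via sysfs.
# DRM is overridable for testing; the real path is /sys/class/drm/card0/device.
DRM="${DRM:-/sys/class/drm/card0/device}"

force_dpm_level() {
    # $1: one of auto, low, high (needs root on a real system)
    echo "$1" > "$DRM/power_dpm_force_performance_level"
}
```

`force_dpm_level auto` restores the driver's default clock management afterwards.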
    Last edited by haagch; 03 March 2017, 07:50 PM.



  • chithanh
    replied
What is maybe missing here is a Linux vs. Windows comparison.
This is especially interesting as the minimum hardware requirements differ between the Linux and the Windows versions.

E.g. while a Radeon 6870 should work fine on Windows, the game would probably not even start on Linux with the r600g driver.
Also, heise.de (link in German) tested a Bristol Ridge APU (A10-9700) and got 35 fps on Windows at 1080p with medium settings. I wonder if the Linux version would perform similarly.



  • LeJimster
    replied
    Originally posted by theriddick View Post

The RX 500 series comes out first, then a month later you could expect the RX VEGA to appear, which from the looks of it is going to have 3 flavors on top of the RX 580 performance level, still a bit confused exactly how they're going to name them...
    This. AFAIK the 500 series are Polaris+ and the Vega lineup will be called RX Vega much as we have the R9 Fury.



  • Nille_kungen
    replied
    Originally posted by theriddick View Post

The RX 500 series comes out first, then a month later you could expect the RX VEGA to appear, which from the looks of it is going to have 3 flavors on top of the RX 580 performance level, still a bit confused exactly how they're going to name them...
I'm not too sure about that.
If Vega is to be called RX Vega, then I think it would make more sense to call the rebranded cards RX Polaris, if there are any rebranded cards at all.
The YouTube link I posted makes me think that there might not be any RX 500 series at all.
It will be interesting to see how it turns out.



  • theriddick
    replied
    Originally posted by Nille_kungen View Post
    Will there be an RX 580 or RX VEGA?
The RX 500 series comes out first, then a month later you could expect the RX VEGA to appear, which from the looks of it is going to have 3 flavors on top of the RX 580 performance level, still a bit confused exactly how they're going to name them...



  • ItzExor
    replied
    Originally posted by haagch View Post
Mine does scale up; I think the load is just not high enough overall, and the two governors interact badly with each other or something. For that particular shader I mentioned above, if I set the CPU to performance then the GPU clock trends upwards a bit and I go from 20 to around 40 FPS. If I leave the CPU on powersave and only set the GPU to its max state, then the CPU trends up a bit as well and I'm at around 40 FPS again. If I enable both, they both sit at their max state all the time and I get around 60 FPS.

Not every game/load changes that much. Games with a higher load don't seem to change very much with these settings. Old games running on Wine with Gallium Nine get huge slowdowns if I leave everything on the defaults. In CS:GO my fps varies wildly without this, but it still averages pretty high, > 200 fps.
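Since the gains only really show up when both knobs are flipped together, here is a sketch combining the CPU governor and GPU DPM toggles into one switch. The `gaming_mode` helper and the overridable path variables are my additions; the real paths are /sys/devices/system/cpu and /sys/class/drm/card0/device, and writing either needs root.

```shell
#!/bin/sh
# Sketch: flip the CPU governor and the GPU DPM level together.
# CPU_SYSFS/GPU_SYSFS are overridable for testing; real paths are
# /sys/devices/system/cpu and /sys/class/drm/card0/device.
CPU_SYSFS="${CPU_SYSFS:-/sys/devices/system/cpu}"
GPU_SYSFS="${GPU_SYSFS:-/sys/class/drm/card0/device}"

gaming_mode() {
    # $1 = on  -> performance governor + high DPM level
    # $1 = off -> powersave governor + auto DPM level
    if [ "$1" = on ]; then gov=performance; dpm=high
    else gov=powersave; dpm=auto; fi
    for f in "$CPU_SYSFS"/cpu*/cpufreq/scaling_governor; do
        echo "$gov" > "$f"
    done
    echo "$dpm" > "$GPU_SYSFS/power_dpm_force_performance_level"
}
```

Running `gaming_mode off` after playing hands clock management back to the kernel and driver defaults.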



  • haagch
    replied
    Originally posted by ItzExor View Post
    I don't know if this is the same reason that the performance has regressed in this test, but I have found that lately I need to switch the CPU and GPU power profiles to performance to get the same performance that I did in the past on default settings.
    My bug report is maybe related: https://bugs.freedesktop.org/show_bug.cgi?id=99967

