
AMD RadeonSI Gallium3D Performance Has A Long Way To Go


  • madbiologist
    replied
    Originally posted by mmstick View Post
    HAHAHAHAHAHAHA no. Actually get a Radeon HD 7xxx and try for yourself. Performance sucks and this will not help much at all. The only things playable are 2D indie games or old games like HL1, and even then the framerate is low. If you use a 3.11 kernel and Mesa git, many games are currently broken.
    Setting the power_profile to high does seem to help a fair bit - see http://phoronix.com/forums/showthrea...160#post341160

    And yes, we know that rendering in Xonotic was broken a month ago. Remember that the 3.11 kernel is still a work in progress.



  • mmstick
    replied
    Originally posted by madbiologist View Post
    As others have said, RadeonSI performance can be greatly improved either by selecting profile-based power management and then setting /sys/class/drm/card0/device/power_profile to high, or by using a 3.11 kernel.
    HAHAHAHAHAHAHA no. Actually get a Radeon HD 7xxx and try for yourself. Performance sucks and this will not help much at all. The only things playable are 2D indie games or old games like HL1, and even then the framerate is low. If you use a 3.11 kernel and Mesa git, many games are currently broken.
    Last edited by mmstick; 29 July 2013, 05:51 PM.



  • madbiologist
    replied
    As others have said, RadeonSI performance can be greatly improved either by selecting profile-based power management and then setting /sys/class/drm/card0/device/power_profile to high, or by using a 3.11 kernel. Interestingly, when using a 3.11 kernel it seems it will not be necessary to enable the new "DPM" power management (by adding radeon.dpm=1 to the GRUB kernel boot options) to get good performance, although enabling DPM will of course reduce power consumption and heat output. I found the following patch in the 3.11 kernel:

    author Alex Deucher 2013-07-05 17:14:30 (GMT)
    committer Alex Deucher 2013-07-05 22:08:54 (GMT)
    commit c6cf7777a32da874fabec4fd1c2a579f0ba4e4dd
    tree 22a8b1f3b98714760a24b69f7d45d56c716dcfe0
    parent 338a95a95508537e23c82d59a2d87be6fde4b6ff

    drm/radeon: set default clocks for SI when DPM is disabled

    Fix patching of vddc values for SI and enable manually forcing clocks to default levels as per NI.

    This improves the out of the box performance with SI asics.

    Signed-off-by: Alex Deucher
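
    For anyone who wants to try both of the approaches above, here is a minimal sketch (run as root); the sysfs paths are the ones mentioned in this thread, while /etc/default/grub and update-grub are assumptions for a Debian/Ubuntu-style setup and may differ on other distributions:

    Code:
    # Profile-based power management: switch the method to "profile", then force "high"
    echo profile > /sys/class/drm/card0/device/power_method
    echo high > /sys/class/drm/card0/device/power_profile

    # Alternatively, enable the new DPM code at boot: add radeon.dpm=1 to
    # GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, then regenerate the
    # GRUB config and reboot (Debian/Ubuntu-style helper assumed):
    update-grub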



  • GreatEmerald
    replied
    Originally posted by ChrisXY View Post
    Schrödinger's cat. Come on, you have Unicode support, don't you?
    Indeed, and you've got to love that Compose button ☺

    As a side note, the Fedora codenames are pretty fun. I've used both Spherical Cow and Schrödinger's Cat quite extensively lately, since one of my courses was Physics.



  • mmstick
    replied
    The same can be said for Catalyst, which performs horribly for GCN.



  • smitty3268
    replied
    Originally posted by ChrisXY View Post
    Xonotic on ultra was bugged; most of the level did not render. But every other test rendered fine.
    Code:
    warning: failed to translate tgsi opcode DDX to LLVM
    If those DDX opcode warnings are the problem, then it might be solved pretty soon. I think I've seen patches on the mailing list implementing DDX and DDY.

    I also think that's one of the last things needed before GL3 support is done, although there might be a few others.



  • curaga
    replied
    "6450 beats the 7850"

    Go 6450!



  • ChrisXY
    replied
    Hm..... http://openbenchmarking.org/result/1...UT-1306287SO77
    drm-next-3.11-wip-5, llvm svn 185061, latest libdrm git, mesa git, xf86-video-ati-git, glamor git

    Maybe it was because of KWin's XRender compositing, or maybe it was because of PRIME.
    (KWin OpenGL compositing has the problem that it often shows only a black screen for 3D applications using PRIME until you disable and re-enable compositing with Alt+Shift+F12.)

    Xonotic on ultra was bugged; most of the level did not render. But every other test rendered fine.
    Code:
    warning: failed to translate tgsi opcode DDX to LLVM
    Failed to translate shader from TGSI to LLVM
    EE si_state.c:1951 si_shader_select - Failed to build shader variant (type=1) -22
    (the same three lines repeat 11 more times)



    DPM really needs automatic power-off of hybrid GPUs that are not in use. DPM definitely seems to work, as the fan turns off now and then, but after a little while it comes back on, and it did not turn off at all while the test was running.
    Last edited by ChrisXY; 28 June 2013, 04:28 PM.
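
    There is no automatic power-off for an idle hybrid GPU, but as a manual workaround the vgaswitcheroo debugfs interface can turn the unused card off; a minimal sketch (as root, with debugfs mounted - only power off a card that is currently inactive):

    Code:
    # Show which GPU is active and which is inactive
    cat /sys/kernel/debug/vgaswitcheroo/switch
    # Power off the currently inactive GPU (typically the discrete card under PRIME)
    echo OFF > /sys/kernel/debug/vgaswitcheroo/switch
    # Power it back on before the next PRIME-offloaded run
    echo ON > /sys/kernel/debug/vgaswitcheroo/switch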



  • Veerappan
    replied
    Originally posted by storm_st View Post
    Don't forget that 7xxx cards' BIOS tends to start at the lowest power profile; you need to change it to high, for example as root, before benchmarking:

    Code:
    echo "high" > /sys/class/drm/card0/device/power_profile
    I am sure the ~8x speed difference is because of that. My 7850 works almost perfectly here after that trick - 2D scrolling, gnome-shell transitions, everything becomes very smooth.
    You're right about the power profiles... I guess that VBIOS info was misleading.

    From /sys/kernel/debug/dri/0/radeon_pm_info (root-only) for my 7850:

    Code:
    default engine clock: 860000 kHz
    current engine clock: 149990 kHz
    default memory clock: 1200000 kHz
    current memory clock: 149990 kHz
    voltage: 1075 mV
    PCIE lanes: 16
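
    A quick way to check that the profile switch actually takes effect is to compare radeon_pm_info before and after forcing "high" - a sketch using the same paths quoted above (as root):

    Code:
    # Clocks at the default/low profile
    cat /sys/kernel/debug/dri/0/radeon_pm_info
    # Force the high profile and re-read; the current engine and memory clocks
    # should now match the "default" values reported above
    echo high > /sys/class/drm/card0/device/power_profile
    cat /sys/kernel/debug/dri/0/radeon_pm_info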



  • Laughing1
    replied
    Do one covering radeon.dpm=1

