Do one covering radeon.dpm=1.
AMD RadeonSI Gallium3D Performance Has A Long Way To Go
Originally posted by storm_st: don't forget that 7xxx cards' BIOS tends to start at the lowest power profile; you need to change it to high before benchmarking, for example as root:
Code:
echo "high" > /sys/class/drm/card0/device/power_profile
From /sys/kernel/debug/dri/0/radeon_pm_info (root-only) for my 7850:
Code:
default engine clock: 860000 kHz
current engine clock: 149990 kHz
default memory clock: 1200000 kHz
current memory clock: 149990 kHz
voltage: 1075 mV
PCIE lanes: 16
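Putting storm_st's advice together with the readout above, the whole sequence looks roughly like this. This is a minimal sketch: the card index, the debugfs path, and the availability of the pre-DPM profile interface all depend on your kernel and setup.
Code:
# Run as root. Switch to profile-based power management (the pre-DPM interface):
echo profile > /sys/class/drm/card0/device/power_method
# Force the highest clocks before benchmarking:
echo high > /sys/class/drm/card0/device/power_profile
# Verify that the engine/memory clocks actually went up:
cat /sys/kernel/debug/dri/0/radeon_pm_info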
Hm... http://openbenchmarking.org/result/1...UT-1306287SO77
drm-next-3.11-wip-5, LLVM SVN r185061, latest libdrm git, Mesa git, xf86-video-ati git, glamor git.
Maybe it was because of KWin's XRender compositing, or maybe it was because of PRIME.
(KWin's OpenGL compositing has the problem that it often shows only a black screen for 3D applications using PRIME until you disable and re-enable compositing with Alt+Shift+F12.)
Xonotic on Ultra was bugged; most of the level did not render. But every other test rendered fine.
Code:
warning: failed to translate tgsi opcode DDX to LLVM
Failed to translate shader from TGSI to LLVM
EE si_state.c:1951 si_shader_select - Failed to build shader variant (type=1) -22
(the above message repeated 12 times)
And this DPM really needs the automatic power-off of hybrid GPUs that are not in use. DPM definitely seems to work, as the fan turns off now and then, but after a little while it comes back on again. And while the test was running it did not turn off at all.
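For anyone trying to reproduce the PRIME behaviour, a quick way to check which GPU is actually rendering is Mesa's DRI_PRIME offload switch. This assumes a working PRIME setup and that glxinfo is installed (it ships in mesa-utils / mesa-demos on most distributions):
Code:
# Renderer used by default (usually the integrated GPU on a hybrid system):
glxinfo | grep "OpenGL renderer"
# Renderer with PRIME offloading to the discrete GPU:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"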
Originally posted by ChrisXY: Xonotic on Ultra was bugged; most of the level did not render. But every other test rendered fine.
Code:
warning: failed to translate tgsi opcode DDX to LLVM
I also think that's one of the last things needed before GL3 support is done, although there might be a few others.
Originally posted by ChrisXY: Schrödinger's cat. Come on, you have Unicode support, don't you?
As a side note, the Fedora codenames are pretty fun. I have used both Spherical Cow and Schrödinger's Cat quite extensively lately, since one of the courses I took was Physics.
As others have said, RadeonSI performance can be greatly improved either by selecting profile-based power management and then setting /sys/class/drm/card0/device/power_profile to high, or by using a 3.11 kernel. Interestingly, with a 3.11 kernel it seems it will not even be necessary to enable the new DPM power management (by adding radeon.dpm=1 to the GRUB kernel boot options) to get good performance, although enabling DPM will of course reduce power consumption and heat output. I found the following patch in the 3.11 kernel:
commit c6cf7777a32da874fabec4fd1c2a579f0ba4e4dd
tree 22a8b1f3b98714760a24b69f7d45d56c716dcfe0
parent 338a95a95508537e23c82d59a2d87be6fde4b6ff
author: Alex Deucher, 2013-07-05 17:14:30 (GMT)
committer: Alex Deucher, 2013-07-05 22:08:54 (GMT)

drm/radeon: set default clocks for SI when DPM is disabled

Fix patching of vddc values for SI and enable manually forcing clocks to default levels as per NI. This improves the out of the box performance with SI asics.

Signed-off-by: Alex Deucher
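For reference, enabling DPM as described above would look something like this on a GRUB 2 system. This is only a sketch: file paths and the config-regeneration command vary by distribution, and the existing contents of GRUB_CMDLINE_LINUX_DEFAULT shown here are an assumption.
Code:
# In /etc/default/grub, append radeon.dpm=1 to the kernel command line:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=1"
# Regenerate the GRUB config (Debian/Ubuntu; Fedora uses grub2-mkconfig instead):
sudo update-grub
# After rebooting, check that DPM initialized:
dmesg | grep -i dpm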
Originally posted by madbiologist: As others have said, RadeonSI performance can be greatly improved either by selecting profile-based power management and then setting /sys/class/drm/card0/device/power_profile to high, or by using a 3.11 kernel.
HAHAHAHAHAHAHA no. Actually get a Radeon HD 7xxx and try for yourself. Performance sucks and this will not help much at all. The only thing playable is 2D indie games or old games like HL1, and even then the framerate is low. If you use a 3.11 kernel and Mesa git, many games are currently broken.
Originally posted by mmstick: HAHAHAHAHAHAHA no. Actually get a Radeon HD 7xxx and try for yourself. Performance sucks and this will not help much at all. The only thing playable is 2D indie games or old games like HL1, and even then the framerate is low. If you use a 3.11 kernel and Mesa git, many games are currently broken.
And yes, we know that rendering in Xonotic was broken a month ago. Remember that the 3.11 kernel is still a work in progress.