Here's Why Radeon Graphics Are Faster On Linux 3.12
What my experiments show
Originally posted by s_j_newbury: Luke, while technically a graphically very simple GPU-accelerated game might well be non-CPU-limited at the extreme, I'm not sure what you're trying to measure. It could well be that you're hitting the GPU fill-rate limit, exceeding the internal bandwidth of the card, and at such high framerates I can't see how it's meaningful.
Until rather recently, Critter had issues with running rough on Mesa drivers, dropping frames and visibly stuttering unless the framerate never went under about 300 fps. This was because frames would be dropped in strings, and a high framerate kept down the time covered by an entire string of missing or duplicate frames. Both the Radeon and Nouveau drivers were affected, and I'm pretty sure Intel was too, though it's been a while since I've tried to play Critter on the netbook. Nouveau, due to its inability to reclock the GPU, had trouble getting a GTX450 over about 250 fps at 1080p, at which point the stuttering was very noticeable. A few months ago, another Nouveau test found the roughness in Critter greatly reduced, and the highest framerates in Scorched3d up around 30 fps from around 18. About all I use Nvidia hardware for these days is to see how Nouveau is doing from time to time.
I thought, and still think, it is NUTS that a game whose progenitors ran on video arcade machines 30 years ago would have issues running on ANY driver on ANY graphics card produced since whatever version of OpenGL it was written for came out, but it does, or at least did, due to the dropped-frame roughness issue. I would have expected the CPU and GPU to sit at stone-cold idle playing a Galaxian-style game, but due to how it was written this is far from the case. In fact, nothing else I have (game or otherwise) will generate more heat in my graphics cards, as Scorched3d and 0ad are both so CPU-limited that the graphics card is of reduced importance, it seems. Because of that, I use both Critter and Scorched3d to benchmark graphics drivers in my machines with games I actually play. Critter is more repeatable and works the smaller HD5570 as hot as it does an Nvidia GT520 on its blob; Scorched3d is much more graphically demanding but dislikes the sb backend that seems to help a bit in 0ad. 0ad is so CPU-bound that even 3- or 4-way SLI on Nvidia's blob would probably show little or no benefit, while finding a way to run the AI in one instance of the game connected to the player's game as a multiplayer session would possibly double framerates when a few hundred characters are on the board, and do so even on the ATI HD5570.
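The "strings of missing frames" effect can be made concrete: given per-frame timestamps, look for intervals far longer than the target and count consecutive misses. A hypothetical sketch (the function name, target rate, and slack factor are illustrative, not from any of these games):

```python
def dropped_frame_runs(timestamps, target_fps=60.0, slack=1.5):
    """Lengths of consecutive runs of over-budget frames (likely drops)."""
    target = 1.0 / target_fps
    runs, current = [], 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > slack * target:  # this frame blew its time budget
            current += 1
        elif current:                    # a run of long frames just ended
            runs.append(current)
            current = 0
    if current:
        runs.append(current)
    return runs
```

A given number of dropped frames covers five times less wall-clock time at 300 fps than at 60 fps, which is one way a very high framerate can mask the stutter described above.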
Sorry this is not using a test suite written for games I do not have and cannot easily download due to bandwidth issues; I just try to add what information I do get to the total body of information that is out there. I'm using some less common hardware, so I figure the results might help somebody.
-
Default governor is performance.
Code:
choice
	prompt "Default CPUFreq governor"

	default CPU_FREQ_DEFAULT_GOV_USERSPACE if ARM_SA1100_CPUFREQ || ARM_SA1110_CPUFREQ
	default CPU_FREQ_DEFAULT_GOV_PERFORMANCE

	help
	  This option sets which CPUFreq governor shall be loaded at
	  startup. If in doubt, select 'performance'.
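Whatever the compiled-in default, the active governor can be inspected and changed at runtime through the standard cpufreq sysfs files. A minimal sketch; the `base` parameter is only there so the paths can be redirected for testing:

```python
from pathlib import Path

SYSFS = Path("/sys/devices/system/cpu")

def read_governor(cpu: int = 0, base: Path = SYSFS) -> str:
    """Return the active cpufreq governor for one CPU."""
    return (base / f"cpu{cpu}" / "cpufreq" / "scaling_governor").read_text().strip()

def set_governor(governor: str, base: Path = SYSFS) -> None:
    """Write a new governor for every CPU; needs root on a real system."""
    for node in sorted(base.glob("cpu[0-9]*/cpufreq/scaling_governor")):
        node.write_text(governor + "\n")
```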
Last edited by JS987; 16 October 2013, 01:56 PM.
-
Originally posted by marek: It's not a default setting in the kernel. It's a setting forced by a startup script in Ubuntu, and you can't turn it off (unless you delete the script manually).
-
Originally posted by gururise: Agreed! Everyone I know uses the default "ondemand" governor. While I'd like to use the 'performance' governor, the power consumption prevents me from doing so. Almost every user is going to be using 'ondemand'. A good user experience demands that things just work with the default settings. If that's not the case, then someone needs to change the defaults!
-
Originally posted by GreatEmerald: There was a bug in kernel 3.10 where the intel_pstate driver used the turbo frequency as the base performance frequency, which heated things up more than it should have. That issue was solved in 3.11, but Ubuntu hasn't re-enabled the driver just yet.
-
Originally posted by GreatEmerald: And it will make your FPS dip to 30 if 60 can't be sustained, instead of just 59... No, you should have VSync on only for games you know will never dip below 60 (or whatever your refresh rate may be).
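The 60-to-30 dip is the classic double-buffered VSync quantization: a frame that misses a refresh deadline waits for the next one, so the displayed rate snaps to refresh/n for whole n. A simplified model, ignoring triple buffering and adaptive VSync:

```python
import math

def vsync_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """Displayed frame rate under double-buffered VSync (simplified model)."""
    # Each frame occupies a whole number of refresh intervals, rounded up;
    # the small epsilon guards against float noise when the ratio is exact.
    intervals = math.ceil(refresh_hz / render_fps - 1e-9)
    return refresh_hz / intervals
```

Rendering at 59 fps thus displays at 30, exactly the dip described above, while anything at or above 60 stays pinned at 60.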
-
Originally posted by Michael: Then it should be fixed, or the default setting changed within the kernel.
-
Originally posted by Ericg: On any OTHER distro (not *buntu-based)? Yuuuuuuup. But Canonical hasn't gotten their heads out of their asses and switched yet. Someone said it was because of bugs in the driver, but I'm finding it hard to believe that THEY are hitting bugs that no one else has.
-
Originally posted by ChrisXY: So the benchmarks that showed the improvements were with Intel CPUs: http://www.phoronix.com/scan.php?pag...12_major&num=2
How come they even use ondemand? Has cpufreq not been superseded by pstate for several kernel versions now?
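Which stack a machine is actually on can be read from sysfs: the scaling_driver file reports intel_pstate (which replaces the ondemand/performance governor set with its own policies) or the older acpi-cpufreq. A sketch assuming the standard sysfs layout; the `base` parameter exists only for testing:

```python
from pathlib import Path

def scaling_driver(cpu: int = 0,
                   base: Path = Path("/sys/devices/system/cpu")) -> str:
    """Name of the frequency-scaling driver owning one CPU,
    e.g. 'intel_pstate' or 'acpi-cpufreq'."""
    return (base / f"cpu{cpu}" / "cpufreq" / "scaling_driver").read_text().strip()
```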