Feral's GameMode 1.1 Released For Optimizing Linux Gaming Performance
-
I always have to manually set my AMDGPU to run at max clocks through sysfs, or else I get stuttering in games. I also set my CPU governor to performance, and kill compton so the compositor can't force vsync. I have a keybind to do all this.
A big problem with Linux gaming is that a lot of new people wouldn't know to do any of this; they'd be running games with schedutil or intel_pstate's powersave governor, with a compositor running, and dynamic power management on their GPU.
Last edited by czz0; 12 May 2018, 06:42 PM.
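The manual routine described above can be scripted. Here's a minimal sketch, assuming a cpufreq sysfs layout, a single AMD GPU at card0, and compton as the compositor (all of which vary by system, and the sysfs writes need root):

```shell
#!/bin/sh
# Toggle "game mode" by hand: pin the CPU governor, force AMDGPU to its
# highest clocks, and stop the compositor. Paths are typical but not
# universal; check your own sysfs before relying on them.

set_governor() {
    for f in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
        [ -w "$f" ] && echo "$1" > "$f"
    done
    return 0
}

set_amdgpu_level() {
    # Accepts auto, low, high (and manual on newer kernels).
    f=/sys/class/drm/card0/device/power_dpm_force_performance_level
    [ -w "$f" ] && echo "$1" > "$f"
    return 0
}

case "${1:-on}" in
    on)
        set_governor performance
        set_amdgpu_level high
        pkill compton 2>/dev/null || true  # no compositor, no forced vsync
        ;;
    off)
        set_governor schedutil
        set_amdgpu_level auto
        compton -b 2>/dev/null || true     # restart the compositor, if present
        ;;
esac
```

Bound to a key (say, as a hypothetical `gamemode-toggle on` / `gamemode-toggle off`), this reproduces the routine above; GameMode's point is to do the same switch automatically around a game's lifetime.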
-
Originally posted by Veto:
mprime is in no way representative of how a game works. Neither is it representative of typical interactive desktop use. Powersave makes a big difference on power consumption for typical desktop use on a laptop.
-
Originally posted by geearf:
Do you think a game will stress a CPU more than mprime? If so I am happy to retest with a CPU heavy game, but I doubt that will be any different.
-
Originally posted by Venemo:
Why not just use the tuned daemon for this? It has been around for several years and has the same features and a lot more.
Could tuned be expanded with a D-Bus endpoint and changed to switch profiles dynamically when a game requests it? Of course, but #1, would that require more or less code than GameMode, and #2, would Red Hat accept such patches? The answer to both might well be yes, I don't know; I just wanted to say that it's not as simple as "use tuned instead".
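For what it's worth, tuned does already expose a D-Bus interface (the com.redhat.tuned bus name and switch_profile method below come from tuned's documented D-Bus API, which tuned-adm itself uses; exact names may differ across versions), so a launcher could in principle flip profiles around a game, as in this sketch:

```shell
# Hypothetical profile flip over tuned's D-Bus API. The profile names
# (latency-performance, balanced) depend on what your install ships.
tuned_profile() {
    busctl call com.redhat.tuned /Tuned com.redhat.tuned.control \
        switch_profile s "$1"
}

# Illustrative Steam-launch-option style wrapper:
#   tuned_profile latency-performance
#   %command%                      # the game itself
#   tuned_profile balanced
```

That covers the mechanism; the open questions about code size and upstream acceptance from the post above remain.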
-
Originally posted by F.Ultra:
Something like mprime will put max strain on the CPU 100% of the time, while a game will switch between roughly 0% and 100% quickly (depending on how fast your GPU is and how GPU/CPU intensive the game is). So it's not that a game will stress the CPU more than mprime; it's that a game will sometimes (often, depending on the game and/or system) put less stress on the CPU.
I have a desktop test (Chrome open with ~100 tabs, various services running in the background, etc.): average 2% load, varying between 0 and 5% or so, showing an average 1 W difference between powersave and performance.
That should be somewhat similar to your lowest points in game.
Then I have mprime option 3, not the craziest one on the CPU, which should be somewhat similar to your highest points in game, showing hardly any difference.
So far it shows pretty much no difference at close to min and max desktop consumption.
I can try mid desktop consumption, but I'd bet that if min and max are equivalent, the average would be as well. I'm happy to lose the bet, though.
I wouldn't mind running a game benchmark and measuring power then, but since my meters are not oscilloscopes, just simple meters displaying a single number, it'd be quite hard for me to get a good average when consumption varies a lot. I guess I could film it with a camera, take a reading every 0.x seconds, and plot that, but that's fairly annoying to do, especially if I expect no difference.
Of course, I wouldn't make that bet for cpufreq where I expect powersave to behave quite differently than with pstate.
Changes in perf_bias might matter more than change of governor but I don't understand that very well, so not sure what to try.
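For anyone wanting to try the perf_bias knob mentioned above: it's Intel's Energy/Performance Bias hint, exposed in sysfs as energy_perf_bias (0 = max performance, 6 = "normal", 15 = max powersave; reading is unprivileged, writing needs root). A minimal sketch, assuming that sysfs interface:

```shell
# Read and set the Energy/Performance Bias hint on every CPU.
show_epb() {
    for f in /sys/devices/system/cpu/cpu*/power/energy_perf_bias; do
        [ -r "$f" ] && printf '%s: %s\n' "$f" "$(cat "$f")"
    done
    return 0
}

set_epb() {
    for f in /sys/devices/system/cpu/cpu*/power/energy_perf_bias; do
        [ -w "$f" ] && echo "$1" > "$f"
    done
    return 0
}
```

The x86_energy_perf_policy tool from the kernel source tree does the same thing with named policies instead of raw numbers.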
-
So I've just run a few more tests changing perf_bias; here's the summary:
perf_bias: 0
diff idle: none
diff mprime 3: 2 W
perf_bias: 6 (the default for my CPU or distro? not sure)
diff idle: none
diff mprime 3: 2 W
diff Dolphin, Soul Calibur 2: none to 5 W (a lot of variation, so hard to say accurately...)
perf_bias: 15
diff idle: none
diff mprime 3: 2 to 4 W
(In this case idle was only Plasma and whatever systemd services run in the background, with no browser or other heavy app open, unlike my previous test with desktop stuff open.)
With differences so small, staying with performance seems fine, at least on a desktop similar to mine using a recent pstate driver.
-
Originally posted by geearf:
Ok I understand that, but what's the point?
I have a desktop test (Chrome open with ~100 tabs, various services running in the background, etc.): average 2% load, varying between 0 and 5% or so, showing an average 1 W difference between powersave and performance.
That should be somewhat similar to your lowest points in game.
Then I have mprime option 3, not the craziest one on the CPU, which should be somewhat similar to your highest points in game, showing hardly any difference.
So far it shows pretty much no difference at close to min and max desktop consumption.
I can try mid desktop consumption, but I'd bet that if min and max are equivalent, the average would be as well. I'm happy to lose the bet, though.
I wouldn't mind running a game benchmark and measuring power then, but since my meters are not oscilloscopes, just simple meters displaying a single number, it'd be quite hard for me to get a good average when consumption varies a lot. I guess I could film it with a camera, take a reading every 0.x seconds, and plot that, but that's fairly annoying to do, especially if I expect no difference.
Of course, I wouldn't make that bet for cpufreq where I expect powersave to behave quite differently than with pstate.
Changes in perf_bias might matter more than change of governor but I don't understand that very well, so not sure what to try.
-
Originally posted by F.Ultra:
The results that you see on your system might not be equal to those of others. E.g. for you and your particular setup, yes, this seems to be a complete waste of time.
All my answers always state that; I'm not sure what your point is.
Even then, I'd guess that few people actually measured this before starting to say that it matters (well, on a laptop it's obviously easy to count hours).
To be honest, I used to think it mattered till someone challenged me about it on reddit; I figured why not measure it, since I have the tools, and then realized how useless all this was. (Of course not for a farm: saving 2 W per CPU when you have thousands if not millions is a different thing...)
Last edited by geearf; 14 May 2018, 01:43 PM.
-
Originally posted by geearf:
Well yes that's obvious, that's why I wrote "at least with a similar configuration to mine" and "at least on a desktop similar to mine".
All my answers always state that; I'm not sure what your point is.
Even then I'd guess that few people actually measured this before starting to say that it matters (well on a laptop it's obviously easy to count hours).
To be honest, I used to think it mattered till someone challenged me about it on reddit; I figured why not measure it, since I have the tools, and then realized how useless all this was. (Of course not for a farm: saving 2 W per CPU when you have thousands if not millions is a different thing...)
Sorry about the "might not be equal to that of others"; I hadn't followed your posts back to your first one.