RadeonSI OpenGL Compute Shader Patches Published
Originally posted by Ancurio
Robustness is used exclusively for running WebGL, it's irrelevant for anything gaming or desktop related.
Robust buffer access can be enabled by creating a context with robust access enabled through the window-system binding APIs. When enabled, any command unable to generate a GL error as described above, such as buffer object accesses from the active program, will not read or modify memory outside of the data store of the buffer object and will not result in GL interruption or termination.
Originally posted by CrystalGamma
Can you sum up what it actually does? I'm guessing process isolation has always been there, considering the security implications. So what does implementing this extension actually guarantee?
It replaces undefined behaviour with defined behaviour: out-of-bounds reads return 0, and out-of-bounds writes are discarded.
Well, 50€...
I first wondered whether to buy from https://www.bundlestars.com/en/bundl...-mordor-bundle, but that would no doubt count as a Windows purchase...
There is currently a sale for the same price here: https://www.indiegala.com/store/prod...-edition/51209, and they advertise Linux support... So if Feral doesn't get the money, there's not much more I can do about it.
Downloading now...
So, Shadow of Mordor. Simply starting it with OpenGL 4.2 works, but then it looks like this:
I guess it's still this: https://bugs.freedesktop.org/show_bug.cgi?id=92059
Running it with MESA_GL_VERSION_OVERRIDE=4.3fc MESA_GLSL_VERSION_OVERRIDE=430fc produces a more complete image. Here is some gameplay:
I don't know whether everything is rendering or whether, e.g., rain is still missing; I've only played a few minutes.
To be honest, with that kind of performance (i7-3632QM, HD 7970M) it's not really playable.
And a bit off topic: Intel's driver is STILL very buggy with PRIME. Here is an example of fun corruption in Spotify after playing Shadow of Mordor with PRIME:
https://www.youtube.com/watch?v=sIhqLRp5VF8.
I had to record it with a camera, because the corruption wasn't visible on a screencast. People keep saying Intel has the best Linux driver, yet bugs like this keep showing up again and again...
Originally posted by zanny
Nobody has GL_ARB_robust_buffer_access_behavior yet, and it seems like it might be a PITA to implement rigorous bounds checking on buffers like that throughout the driver stack. RadeonSI might only need four extensions to reach 4.3, but there is always a reason they aren't already done - they are usually the hardest, or at least most annoying, ones to implement.
Originally posted by leonmaxx
Try "thread apply all bt" instead; this could be a separate rendering thread or a corrupted stack.
Anyway, I experimented a bit, and as far as I can see, leaving CFLAGS at the default (i.e. not setting the variable at all) makes it segfault after the initial game loading screen, but setting CFLAGS to -ggdb makes it work - even when the driver libraries are stripped. That will make this difficult to debug. Oh well, at least it shouldn't slow down Mesa significantly...
Originally posted by atomsymbol
For the sake of completeness: what's the in-game graphics quality setting?
Relatively high settings; I can try the lowest settings later. But I did notice that GPU usage is relatively high - that, at least, is good.
Some games, like Saints Row IV, run with very bad performance while only using 15-30% CPU, which shows there are huge bottlenecks somewhere.