With R600g Now Supporting OpenGL 4.1, See How The Open-Source Performance Compares To AMD Catalyst
Originally posted by airlied View Post
Apple does something tricky with fp64 but I'm not 100% sure what.
https://developer.apple.com/opengl/capabilities/
- ARB_gpu_shader_fp64 functionality is implied by OpenGL 4.0, but not exported on renderers marked by "~"
- ARB_vertex_attrib_64bit functionality is implied by OpenGL 4.1, but not exported on renderers marked by "~"
Seriously though, at least it's a precedent that game developers might have already seen.
Whether the shaders fail to compile or they produce garbage, actually *using* the extension is not going to end well either way, but I suppose one could argue that an untrue implication is less bad than an outright lie. Personally I have trouble with that, because I see more damage done by untrue implications than by lies (untrue implications appear to be the foundation of our modern political system), but if it turns out to be a generally accepted way of communicating the situation, it might be worth following for now. Good catch!
Originally posted by bridgman View Post
Are you guys thinking about something like a build option to force-fake fp64 on hardware that doesn't have native support? If you're actually asking the developers to lie about support then obviously you haven't been following the latest Dilbert story arc:
http://dilbert.com/strip/2015-12-14 (then follow it forward)
Last edited by dungeon; 17 December 2015, 04:56 PM.
Apple does something tricky with fp64 but I'm not 100% sure what.
https://developer.apple.com/opengl/capabilities/
- ARB_gpu_shader_fp64 functionality is implied by OpenGL 4.0, but not exported on renderers marked by "~"
- ARB_vertex_attrib_64bit functionality is implied by OpenGL 4.1, but not exported on renderers marked by "~"
So it seems they advertise GL 4.1 but leave those extensions out of the extension string, as a sign that they don't actually support them.
I've no idea what happens if you run fp64 code on this driver, whether it fails to compile the shaders or just produces garbage.
Dave.
Are you guys thinking about something like a build option to force-fake fp64 on hardware that doesn't have native support? If you're actually asking the developers to lie about support then obviously you haven't been following the latest Dilbert story arc:
http://dilbert.com/strip/2015-12-14 (then follow it forward)
I think the main issue here is that the developers would rather see time going into a proper solution (emulating fp64) than a throw-away solution (announcing fp64 even if you can't support it). The override mechanism is a good solution for temporary gaps, and AFAICS that's what this is.
EDIT - there seems to be something funny with the bar graphs - the only options seem to be no bars, 2 bars, all-but-2 bars... and I guess maybe all-bars?
Last edited by bridgman; 17 December 2015, 03:57 PM.
But the open-source drivers don't even have s3tc or texture-float enabled by default.
In both of those cases users need to do something about it themselves, either by compiling with the option enabled or by having the package maintainer turn it on. A package maintainer could also patch Mesa to fake-advertise fp64 for all r600 hardware, if his users are that lazy.
Originally posted by eydee View Post
Or -- as had been noted in just about every R600g thread -- it should simply return fake data/null like Intel and Nvidia cards do (some of them). While the chance of people wanting to run Metro 2033 on these cards is pretty high, people wanting to calculate Julia/Mandel/etc on these GPUs using GL shaders for fun is pretty rare. At this moment this is the only known usage scenario of fp64. Point is, you can't expect Average Joe to run Steam games with Mesa environment variables. Linux is no longer for the super tech-savvy geeks only.
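For context, the kind of Mesa environment-variable workaround being referred to looks roughly like this. `MESA_GL_VERSION_OVERRIDE` and `MESA_EXTENSION_OVERRIDE` are real Mesa variables; the game command itself is a placeholder, and whether a given title then runs correctly on faked fp64 is exactly the open question in this thread:

```shell
# Force Mesa to report a higher GL version and an extension the
# hardware doesn't actually implement ("your-game" is a placeholder).
export MESA_GL_VERSION_OVERRIDE=4.1
export MESA_EXTENSION_OVERRIDE=GL_ARB_gpu_shader_fp64
./your-game
```

This is precisely the per-user step being argued about: it works for people comfortable with a terminal, but it is not something an average Steam user will ever set.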
Originally posted by eydee View Post
Or -- as had been noted in just about every R600g thread -- it should simply return fake data/null like Intel and Nvidia cards do (some of them).