Former AMD Developer: OpenGL Is Broken


  • V10lator
    replied
    Originally posted by Nille View Post
Taxi, that's not the point alone. The problem is also that you have to do the whole compiling process at the first start. With bytecode you would only have to compile for your hardware, saving time.
But you can't compile for each and every piece of hardware (GPU) out there. If that were possible we wouldn't need OpenGL or DirectX; we would simply program the GPU directly.

    Originally posted by mdias View Post
    glProgramBinary? really!? have you even read about it? The binaries are hardware dependent! Good luck shipping your binary shaders with your games/apps.
You completely misunderstood: the game should not ship with precompiled binaries, but after it has compiled a shader for the first time (first run on the end-user's hardware) it should save it to disk and then re-use it (AFAIK games like Serious Sam 3: BFE are doing that).
A driver's job is to drive the hardware and expose its functionality, not implement API-specific work-arounds for its mindless design!
Good that we agree here. I was not talking about API-specific workarounds, I was talking about workarounds for programs using the API in a bad way. Now please don't try to tell me Windows GPU drivers don't have workarounds for almost every (AAA) DirectX game(/engine) out there...
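    A minimal sketch of the save side of such an on-disk cache, assuming OpenGL 4.1 or ARB_get_program_binary and an already-linked program object; the cache path and bare-bones error handling are illustrative only:
    Code:
    /* Sketch only: persist the driver-specific program binary after the
     * first link so later runs can skip compilation. */
    #include <GL/glew.h>   /* any loader exposing glGetProgramBinary works */
    #include <stdio.h>
    #include <stdlib.h>

    static void save_program_binary(GLuint program, const char *path)
    {
        /* For reliable retrieval, set GL_PROGRAM_BINARY_RETRIEVABLE_HINT
         * with glProgramParameteri() before linking the program. */
        GLint length = 0;
        glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);
        if (length <= 0)
            return;                        /* driver exposes no binary */

        void *blob = malloc(length);
        GLenum format = 0;                 /* driver/GPU-specific format token */
        glGetProgramBinary(program, length, NULL, &format, blob);

        FILE *f = fopen(path, "wb");
        if (f) {
            fwrite(&format, sizeof(format), 1, f);
            fwrite(blob, 1, length, f);
            fclose(f);
        }
        free(blob);
    }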



  • ChrisXY
    replied
    Originally posted by mdias View Post
Because: 1. Bigger redistribution files.
A little bit of shader source code? Modern graphically intensive games are easily 15+ gigabytes...

    Originally posted by mdias View Post
2. Added software complexity.
I don't imagine it would take more than one isolated if-block with a write to and a read from the config file, plus a loop (see the sketch at the end of this post)... Maybe add a progress bar? Is this really significant compared to everything else a modern game contains?

    Originally posted by mdias View Post
3. Exposed shader source (important for some people).
More exposed than now? You can already print the shaders as they are fed to the compiler:
    Code:
    MESA_GLSL=dump
    Originally posted by mdias View Post
4. We can do better.
Yes, with a proper vendor-independent redistributable binary format, but until then...
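    A rough sketch of that first-run compile loop; the version-string fingerprint, the path, and the compile_and_cache()/update_progress_bar() helpers are hypothetical stand-ins, not real API:
    Code:
    /* Hypothetical sketch: recompile every shader on first run or whenever
     * the driver changes, using the GL version string as a coarse
     * fingerprint of the installed driver. */
    #include <GL/glew.h>   /* any loader exposing glGetString works */
    #include <stdio.h>
    #include <string.h>

    extern int  num_shaders;                          /* hypothetical */
    extern void compile_and_cache(int shader_index);  /* hypothetical */
    extern void update_progress_bar(int done, int total);

    void warm_shader_cache(void)
    {
        const char *driver = (const char *)glGetString(GL_VERSION);
        char cached[256] = "";

        FILE *f = fopen("cache/driver_id.txt", "r");  /* illustrative path */
        if (f) {
            if (!fgets(cached, sizeof(cached), f))
                cached[0] = '\0';
            fclose(f);
        }

        if (strcmp(driver, cached) != 0) {   /* first run or new driver */
            for (int i = 0; i < num_shaders; ++i) {
                compile_and_cache(i);
                update_progress_bar(i + 1, num_shaders);
            }
            f = fopen("cache/driver_id.txt", "w");
            if (f) { fputs(driver, f); fclose(f); }
        }
    }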



  • mdias
    replied
    Originally posted by ChrisXY View Post
    Why not simply compile all the shaders on first run, every time the driver version changes?
    Because:
    1. Bigger redistribution files.
    2. Added software complexity.
    3. Exposed shader source (important for some people).
    4. We can do better.



  • Ancurio
    replied
    Originally posted by ChrisXY View Post
Why not simply compile all the shaders on first run, every time the driver version changes? I mean, compiling shaders doesn't actually take that long, does it? Especially if, as we have read, the shaders get distributed already "compiled" to a simple subset of GLSL.
    Still, loading times of up to 1 minute, even if only for the first run, can be incredibly frustrating for users (especially if this never happens on Windows): https://github.com/ValveSoftware/Dota-2/issues/661



  • ChrisXY
    replied
Why not simply compile all the shaders on first run, every time the driver version changes? I mean, compiling shaders doesn't actually take that long, does it? Especially if, as we have read, the shaders get distributed already "compiled" to a simple subset of GLSL.



  • mdias
    replied
    Originally posted by TAXI View Post
That's what shader caches are for. AFAIK Nvidia has an in-RAM and an on-disk cache; no idea about fglrx, but Mesa has an in-RAM cache and work has started on an on-disk cache (see for example http://patchwork.freedesktop.org/patch/18964/ ).
In other words: shaders being compiled multiple times is the program's fault: it could request the compiled shader after the first time and re-use it ( http://www.opengl.org/wiki/GLAPI/glProgramBinary ). Anyway, drivers are able to implement a work-around (a cache) for programs doing it wrong. So this is not OpenGL's fault.
    Wtf are you smoking? I want some of that too!

A work-around is a work-around, not a solution. glProgramBinary? Really!? Have you even read about it? The binaries are hardware-dependent! Good luck shipping your binary shaders with your games/apps.
Are you also going to blame the stupid developers for asking for separate shaders? Perhaps Khronos was drunk to oblige, since there are work-arounds...

A driver's job is to drive the hardware and expose its functionality, not implement API-specific work-arounds for its mindless design! OpenGL is currently still a fail in many ways, even if it has come a long, long way...
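    For context, this hardware dependence is exactly why the binaries only work as a local cache: the load side has to validate the result and fall back to compiling from source whenever the driver or GPU changed. A minimal sketch, where compile_from_source() is a hypothetical stand-in:
    Code:
    /* Sketch: a cached binary is only valid on the exact driver/GPU that
     * produced it, so check GL_LINK_STATUS and rebuild on any mismatch. */
    #include <GL/glew.h>

    extern GLuint compile_from_source(void);  /* hypothetical fallback */

    GLuint load_cached_program(GLenum format, const void *blob, GLsizei size)
    {
        GLuint program = glCreateProgram();
        glProgramBinary(program, format, blob, size);

        GLint ok = GL_FALSE;
        glGetProgramiv(program, GL_LINK_STATUS, &ok);
        if (!ok) {
            /* Stale or foreign binary (driver update, different GPU). */
            glDeleteProgram(program);
            program = compile_from_source();
        }
        return program;
    }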



  • ChrisXY
    replied
This problem shows up especially with games in Wine, and it sounds like it is the shader cache.

For example, if you play Mass Effect 2 with the d3dstream patches it works with really good performance, but after loading a level the first look around suffers from heavy stuttering, and if it is a big room you basically can't play for 2-3 seconds. I have read that this is because the engine doesn't preload the graphics resources; on Direct3D/Windows this isn't much of a problem, but translated to OpenGL it apparently is...



  • Aleve Sicofante
    replied
    Originally posted by Ancurio View Post
rofl, the lengths some delusional people will go to in order to defend flaws in an API that they have probably never coded anything serious in themselves. In case you were unaware, both Rich Geldreich and Josh Barczak have worked on real-world AAA codebases using OpenGL; I don't think they're in the process of "learning" the API.

    Also, GLuint is not "different", it's just plain horrible.
I'm not defending anything. I used GL well before it was called OpenGL, but that's not the point. Of course I don't know these two guys, and I've only said what this looks like to me from the outside. The fact that they had a number of replies showing they don't quite understand how OpenGL works shows me it doesn't matter how many "AAA codebases" they have worked on.



  • Aleve Sicofante
    replied
    Originally posted by BlackStar View Post
    No, he is developing for OpenGL. Where the hell did you get he is developing for Linux?
Oh, he's developing Unity3D in a vacuum, not for Linux, Windows, and OS X users to actually use, but just for the fun of it. I'm such a fool for not understanding that...



  • johnc
    replied
That is an annoying feature, because the first time you play a game you get all kinds of jitter from the driver building the GL cache. It's not the best experience, and the first time through should be the optimal experience.

