Gallium3D Gets New Geometry Shader Support

  • BlackStar
    Senior Member
    replied
    Originally posted by Eosie View Post
    Are you sure? I haven't said anything like that. There is a difference between "must" and "will". Thank god I didn't say drivers "must" support OpenGL 3.2. (and I do contribute code to Gallium, which is why I am concerned about it)
    You most certainly did say "must" in your post:
    It was decided by certain developers that all Gallium drivers must support geometry shaders. Of the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software
    And I most certainly didn't say anything about OpenGL 3.2 in my reply.

    I simply cannot see how a software fallback for geometry shaders could be a bad thing. As far as I can tell, this code can be shared between all drivers, and the effort, non-trivial as it might be, will certainly help the OpenGL stack move forward as a whole (more so than, say, implementing geometry shaders for R600+).
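
    To make that concrete, here is a rough, purely illustrative sketch of the dispatch I have in mind - every function name below is made up for illustration, this is not actual Gallium code:

        /* Illustrative C sketch, not the real Gallium API. */
        if (driver_has_hw_geometry_shaders(screen)) {
            /* Drivers with hardware GS run it on the GPU. */
            hw_run_geometry_shader(ctx, gs, prims);
        } else {
            /* The 8 drivers without hardware GS share one path:
               execute the GS on the CPU against the post-vertex-shader
               output, then hand the expanded primitives back to the
               driver as plain vertex data. One implementation,
               reusable by every driver. */
            sw_run_geometry_shader(gs, prims, &expanded);
            driver_draw_expanded(ctx, &expanded);
        }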

    Do you have a link for the developer discussion on this topic?
    Last edited by BlackStar; 29 December 2009, 08:21 AM. Reason: More context in the quote


  • marek
    X.Org Developer
    replied
    Originally posted by BlackStar View Post
    older cards will get geometry shaders emulated in software
    Are you sure? I haven't said anything like that. There is a difference between "must" and "will". Thank god I didn't say drivers "must" support OpenGL 3.2. (and I do contribute code to Gallium, which is why I am concerned about it)

    Originally posted by BlackStar View Post
    And why do you think this is a problem?
    Already answered:
    I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.
    ~ Marek


  • BlackStar
    Senior Member
    replied
    Originally posted by Eosie View Post
    This is incorrect. I got a full OpenGL Catalyst driver from Windows Update on a mobile Radeon.
    This has never happened on any of my Nvidia, ATI and Intel systems. Additionally, I haven't been able to find any credible source that supports this. On the other hand, problems from missing ICDs are very common:

    - a recent example on opengl.org

    - a discussion on XBMC

    etc etc

    Originally posted by XBMC discussion
    Yeah, Microsoft does not and has never distributed drivers with OpenGL ICDs.

    Originally posted by Eosie View Post
    No, it won't, and the original topic is a perfect example. It was decided by certain developers that all Gallium drivers must support geometry shaders. Of the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software. I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.

    /rant
    And why do you think this is a problem? Instead of no geometry shaders, older cards will get geometry shaders emulated in software. This is an improvement - it will allow older cards to run software that they otherwise couldn't.


  • marek
    X.Org Developer
    replied
    Originally posted by BlackStar View Post
    you still need to install ICD drivers from the IHV's homepage (Windows Update won't install OpenGL ICDs).
    This is incorrect. I got a full OpenGL Catalyst driver from Windows Update on a mobile Radeon.


    Originally posted by drag View Post
    Hopefully the Linux graphics situation will improve with Gallium.
    No, it won't, and the original topic is a perfect example. It was decided by certain developers that all Gallium drivers must support geometry shaders. Of the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software. I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.

    /rant


  • benmoran
    Senior Member
    replied
    Yeah, Gallium does look like the bee's knees. Can't wait to see it start to take over.


  • drag
    Senior Member
    replied
    Hopefully the Linux graphics situation will improve with Gallium.

    With Gallium you have a generic state tracker that will behave the same regardless of what hardware you're using. As long as the driver supports Gallium, then as far as the application is concerned it'll be compatible.

    Now, for good performance this may not hold, but ideally we should only end up seeing two major OpenGL implementations:

    1. Proprietary Nvidia

    2. Open source Nvidia, ATI, and Intel.

    (and 3. you're stupid for buying VIA graphics or anything else and expecting it to work well).

    This is not as good as the situation with DirectX and Windows, but it is much, much, much better than what we have to deal with now.
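
    Something like this, conceptually (a made-up sketch of the split, not the real Gallium headers):

        /* Hypothetical sketch; illustrative names only. */
        struct shader_ir;                  /* device-independent shader IR */

        struct pipe_driver {               /* what each hardware driver provides */
            void (*bind_shader)(void *hw, const struct shader_ir *ir);
            void (*draw)(void *hw, unsigned start, unsigned count);
        };

        /* The shared state tracker issues the same calls no matter the
           hardware, so every Gallium driver inherits the same behaviour. */
        void st_draw(struct pipe_driver *drv, void *hw,
                     const struct shader_ir *ir, unsigned count)
        {
            drv->bind_shader(hw, ir);
            drv->draw(hw, 0, count);
        }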


  • BlackStar
    Senior Member
    replied
    Originally posted by Svartalf
    As for ease of use...heh... If you think coding for D3D is easy, I've got this bridge in Brooklyn to sell ya... Cheap price, even...
    Easy? Hell no, graphics programming ain't easy in any sense of the word.

    However, this doesn't change the fact that OpenGL is more painful than D3D by at least an order of magnitude:

    (a) bind-to-edit makes it very difficult to create performant, generic middleware in OpenGL (think e.g. XNA). You have to either restore state before and after every middleware call (killing performance), thrash state with every middleware call and reset it afterwards (killing performance), or tightly couple the middleware library with its consumer (losing generality). See the sketch after this list.

    (b) not only do you have to work against the API (issue a), but you also have to work against the drivers. Nvidia is generally fine. AMD is generally fine, too (they have many bugs, but most are well documented with known workarounds). Intel is bad, really bad. Chrome and the rest are simply non-existent (better to install software Mesa and run with that).
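
    Here is the pattern (a) forces on you - a minimal sketch; the middleware entry point is hypothetical, the GL calls are real:

        #include <GL/gl.h>

        /* Hypothetical middleware entry point. Because GL edits whatever
           object is currently bound, we must save and restore the caller's
           binding around our own work. */
        void middleware_update_texture(GLuint tex, const void *pixels,
                                       GLsizei w, GLsizei h)
        {
            GLint prev;
            glGetIntegerv(GL_TEXTURE_BINDING_2D, &prev); /* glGet = pipeline stall */

            glBindTexture(GL_TEXTURE_2D, tex);           /* bind just to edit */
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                            GL_RGBA, GL_UNSIGNED_BYTE, pixels);

            glBindTexture(GL_TEXTURE_2D, (GLuint)prev);  /* restore caller state */
        }

    D3D10 sidesteps this entirely: you pass the object to the edit call, so middleware never has to touch (or guess at) global state.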

    Anecdote: I was asked to make a relatively simple, 10Kloc, GL3.x application run on GL2.x + Intel. I finished the 3.x -> 2.x transition in two days, with the result running fine on older Nvidia & AMD cards. I spent the remaining two weeks working around Intel driver bugs, bogus shader errors, strange driver limitations (a DX10-capable card with a 2K texture size limit? Why?), blue screens and missing features (no FBO blit?) before admitting defeat and simply dropping Intel from the supported list.

    (I would post a screenshot of the final result, but I don't have access to the dev machine right now. Intel's drivers somehow managed to render the correct geometry, but filled it with multicolored TV snow. Were the shaders reading from invalid / uninitialized samplers? That's what it looked like - yet the same code ran fine on Nvidia / AMD.)

    Personally, I wouldn't mind lower GL versions if the drivers were stable. I can work with that. What I can't work with are random driver bugs on perfectly valid code: if the driver reports glCompressedTexImage2D as supported, I expect this call to fill the texture with data or fail with an OpenGL error - getting back random data or a bluescreen is simply unacceptable.
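
    For the record, the defensive pattern I mean looks like this (the S3TC format assumes EXT_texture_compression_s3tc is advertised; width, height, imageSize, data come from the loaded file, and the fallback path is hypothetical):

        glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                               GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                               width, height, 0, imageSize, data);

        if (glGetError() != GL_NO_ERROR) {
            /* The failure mode the spec sanctions: an error I can handle,
               e.g. by falling back to an uncompressed upload. */
        }
        /* else: the spec says the texture now holds my data. On a broken
           driver it may hold garbage instead, and no query will tell me. */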

    Originally posted by Svartalf View Post
    And it's not wholly what you're saying. Microsoft didn't make it easy to choose OpenGL at the time they came out with DX8 and beyond. At that point, it was, while painful to use, much more credible due to feature set and the "easy" choice on Windows.
    I'm not sure I can parse this correctly. When you say "it was ...", are you referring to OpenGL or D3D? (Myself, I first touched DX with version 9. I was an OpenGL-only guy before that, but DX9, for all its drawbacks, was much more stable at the time. DX9 offered FBOs and the FBOs worked. OpenGL offered FBOs, but the Moon, Sun, Jupiter and Carmack had to align before they'd work. ATi's OpenGL drivers didn't help, either - they are much better now.)
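
    (For anyone who didn't suffer through it, "working FBOs" meant getting past this EXT_framebuffer_object dance - a textbook-correct setup that period drivers could still reject as unsupported, or accept and then render garbage:)

        GLuint fbo, color;

        glGenTextures(1, &color);
        glBindTexture(GL_TEXTURE_2D, color);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

        glGenFramebuffersEXT(1, &fbo);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                  GL_TEXTURE_2D, color, 0);

        /* The driver may reject any attachment combination it dislikes... */
        if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
                != GL_FRAMEBUFFER_COMPLETE_EXT) {
            /* ...and on the GL drivers of the DX9 era, it frequently did. */
        }
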
    Last edited by BlackStar; 26 December 2009, 12:31 PM. Reason: sp


  • BlackStar
    Senior Member
    replied
    Originally posted by V!NCENT View Post
    Mac OS X is becoming increasingly popular (Linux too, but not as significant as Mac OS X)... OpenGL is the only choice here. With Microsoft losing crazy marketshare to Apple... I don't know. The days of Windows are numbered... DirectX will not survive, but how long that will last is as good as anyone's guess...
    Yes, but Mac OS X still doesn't support OpenGL 3.x even one year after its release. Linux doesn't really support OpenGL 2.x reliably six years after its release. With DirectX having the Xbox and pretty much every single non-indie game on its side, I just can't see DirectX being overturned by OpenGL any time soon.

    Were IHVs to get their acts together and provide OpenGL drivers you could count on (GL2.1 with GLSL 1.3 would be a very good baseline), and were Mac OS X and Linux to grab a 20-25% market share (so it would make sense for game devs to overcome the inertia of the win32/DX combo), I could see things starting to change. The synergy between desktop OpenGL, OpenGL|ES, WebGL and OpenCL would be too great to ignore then.

    However, something tells me we'd see Larrabee sound the death knell of D3D/OGL before that kind of thing came to pass.


  • Svartalf
    Linux Game Publishing
    replied
    Originally posted by BlackStar View Post
    Very true.

    The failure of the ARB to produce OpenGL 2.0 in time also played a significant role (if the ARB had followed 3Dlabs' vision for GL2.0 back then, the graphics programming world might be a completely different place nowadays). Not to mention the numerous fcuk ups from the ARB/Khronos between GL2.1 and GL3.2: the 1+ year delay and consequent letdown that was 3.0, the inability to push essential extensions into core (like EXT_texture_filter_anisotropic), the inability to cater to developer needs (not being able to create a common binary shader format after 6+ years is simply ridiculous).

    So now we are in a position where GL3.2 (barely) reaches feature parity with DX10, a three-year-old standard, when DX11 is already available. Even worse, Intel is somewhere between GL1.4 and 2.0 in support; Apple is at 2.1; Linux is somewhere around ~1.3/~1.4 by default. This means you cannot even use the improved 3.x API without jumping through vendor-specific hoops.

    No wonder developers have flocked to D3D and XNA. As things stand, OpenGL doesn't really stand a chance of becoming mainstream again.

    /rant
    I'd have to say those would be my gripes on the matter... in the large. You don't need the extra stuff for much of what you're doing coding-wise, but you've got to be much more careful with your coding, if you don't have those API edges, to get the same or similar results.

    And it's not wholly what you're saying. Microsoft didn't make it easy to choose OpenGL at the time they came out with DX8 and beyond. At that point, it was, while painful to use, much more credible due to feature set and the "easy" choice on Windows.


  • Svartalf
    Linux Game Publishing
    replied
    Originally posted by BlackStar View Post
    Well, it is, if we want to be honest with ourselves. OpenGL 3.2 manages to close the gap somewhat, but it's still far behind D3D10 as far as API design, ease of use and stability are concerned. (Yes, it offers more functionality in general, but (a) it's still missing binary shaders, (b) 50% of the market is using Intel IGPs, which are synonymous with "bad OpenGL support", and (c) our OSS drivers don't even support GL2.1, much less 3.2.)
    As for ease of use...heh... If you think coding for D3D is easy, I've got this bridge in Brooklyn to sell ya... Cheap price, even...

    The main reasons D3D was adopted are as follows:

    1) Microsoft implemented what the studios were asking for, which may or may not be a good thing.

    2) The ARB was slow to implement functionality that was needed by game devs and more modern 3D applications. They were more mired in the past with their CAD/scientific rendering heritage, which is why the 3.0 move was disappointing. The CAD vendors don't want to overhaul their codebases and, by and large, haven't moved past immediate-mode rendering (contrast the two styles in the snippet after this list).

    3) It's on Windows, and with 1 and 2 it became more of a moot point: target the predominant platform with its "native" rendering API first. If you can target OpenGL after the fact to get Mac OS and other platforms, that's a win; if not, you'll do fine all the same.
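
    To illustrate point 2, the two styles side by side (client-state path shown for brevity; assume vbo and vertex_count already exist):

        /* The CAD-heritage style: immediate mode, one call per
           attribute per vertex, resubmitted every frame. */
        glBegin(GL_TRIANGLES);
        glColor3f(1, 0, 0); glVertex3f( 0,  1, 0);
        glColor3f(0, 1, 0); glVertex3f(-1, -1, 0);
        glColor3f(0, 0, 1); glVertex3f( 1, -1, 0);
        glEnd();

        /* The buffer-object style game developers wanted: the data
           lives in a VBO and one call draws the whole batch. */
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, (void *)0);
        glDrawArrays(GL_TRIANGLES, 0, vertex_count);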

