Gallium3D Gets New Geometry Shader Support

  • Xheyther (Junior Member)
    replied
    Ho! Ha? Nostradamus!

    To me, what you write sounds a bit pessimistic and unrealistic toward Microsoft (note that I'm not defending them here, just reminding you of the reality).


  • V!NCENT (Senior Member)
    replied
    Originally posted by BlackStar View Post
    As things stand, OpenGL doesn't really stand a chance to become mainstream again. /rant
    Mac OS X is becoming increasingly popular (Linux too, but not as significantly as Mac OS X), and OpenGL is the only choice there. With Microsoft losing serious marketshare to Apple... I don't know. The days of Windows are numbered; DirectX will not survive that, but how long it will take is anyone's guess...


  • BlackStar (Senior Member)
    replied
    Originally posted by drag View Post
    The reason DirectX gained popularity in the first place for games is because each vendor shipped a different OpenGL stack.

    This meant that developers had to troubleshoot and support no less than 3 different OpenGL implementations, each with its own quirks, limitations, and extensions.

    With DirectX they only had to support _one_ and that was Microsoft's. It does not really matter which API is better; they do just about the same thing. What matters is that they can always depend on DirectX working the same and being available.
    Very true.

    The failure of the ARB to produce OpenGL 2.0 in time also played a significant role (if the ARB had followed 3Dlabs' vision for GL2.0 back then, the graphics programming world might be a completely different place nowadays). Not to mention the numerous screw-ups from the ARB/Khronos between GL2.1 and GL3.2: the 1+ year delay and consequent letdown that was 3.0, the inability to promote essential extensions (like EXT_texture_filter_anisotropic) to core, and the inability to cater to developer needs (not being able to create a common binary shader format after 6+ years is simply ridiculous).

    So now we are in the position where GL3.2 (barely) reaches feature parity with DX10, a 3-year-old standard, when DX11 is already available. Even worse, Intel is somewhere between GL1.4 and 2.0 in support; Apple is at 2.1; Linux is somewhere around ~1.3/~1.4 by default. This means you cannot even use the improved 3.x API without jumping through vendor-specific hoops.

    No wonder why developers have flocked to D3D and XNA. As things stand, OpenGL doesn't really stand a chance to become mainstream again.

    /rant


  • drag (Senior Member)
    replied
    Originally posted by dl.zerocool
    Well, this is partially true. Even if 50% of the market is using Intel IGPs, I don't really care about them; if they have bad OpenGL support, that's their problem to solve, not OpenGL's. Plus, if people are stupid enough to use their ultra-cheap solution, why should we care about them?
    50%? Bah.

    Try 80%.

    Intel's IGPs are the most popular video chipsets by a long shot; they dwarf Nvidia's market.


    Originally posted by yotambien View Post
    I guess it makes sense. But wouldn't a more straightforward explanation be that in the past there were no OpenGL drivers shipped by default with Windows XP? I think you only got them once you downloaded the drivers from the manufacturer site, and probably many people that would potentially use Google Earth would not know anything about this. Nowadays Vista and W7 have OpenGL drivers included by default, so this would not be an issue with these systems.

    The reason DirectX gained popularity in the first place for games is because each vendor shipped a different OpenGL stack.

    This meant that developers had to troubleshoot and support no less than 3 different OpenGL implementations, each with its own quirks, limitations, and extensions.

    With DirectX they only had to support _one_ and that was Microsoft's. It does not really matter which API is better; they do just about the same thing. What matters is that they can always depend on DirectX working the same and being available.


  • BlackStar (Senior Member)
    replied
    Originally posted by yotambien View Post
    I guess it makes sense. But wouldn't a more straightforward explanation be that in the past there were no OpenGL drivers shipped by default with Windows XP? I think you only got them once you downloaded the drivers from the manufacturer site, and probably many people that would potentially use Google Earth would not know anything about this. Nowadays Vista and W7 have OpenGL drivers included by default, so this would not be an issue with these systems.
    Could be. Then again, OpenGL seems to be the default setting, so maybe it's just a plain old driver issue as usual (it is OpenGL we are speaking about, after all).

    Vista/Win7 emulate OpenGL 1.4 via D3D by default. Better than XP (OpenGL 1.1 without hardware acceleration), but still far from good. To get real OpenGL support, you still need to install ICD drivers from the IHV's homepage (Windows Update won't install OpenGL ICDs).


  • yotambien (Senior Member)
    replied
    Originally posted by BlackStar View Post
    It is safe to assume that Google Earth is forced to fall back to D3D on Windows due to OpenGL driver issues. It is also safe to assume that Google would have preferred to use OpenGL only, if that was feasible (due to #1, #2 and #3). Unfortunately, it isn't.
    I guess it makes sense. But wouldn't a more straightforward explanation be that in the past there were no OpenGL drivers shipped by default with Windows XP? I think you only got them once you downloaded the drivers from the manufacturer site, and probably many people that would potentially use Google Earth would not know anything about this. Nowadays Vista and W7 have OpenGL drivers included by default, so this would not be an issue with these systems.


  • BlackStar (Senior Member)
    replied
    Originally posted by yotambien View Post
    These Christmas liqueurs are probably killing my neurons. I can't follow your argument anymore. Why is it that Google Earth ships with both D3D and OpenGL?
    My liquor-induced thought process goes like this:

    Fact #1: Google Earth is a cross-platform application => it cannot be D3D only.

    Fact #2: OpenGL is a cross-platform graphics API => Google Earth could have been built on OpenGL only.

    Fact #3: a single codepath is better than two codepaths from a time and cost perspective => there must be some reason why Google Earth supports both OpenGL and D3D.

    It all starts to make sense when you check out Google Earth's configuration files(*). These files contain overrides for specific video chips and explanations for those overrides (e.g. chip x crashes when using mipmaps under OpenGL. Use D3D instead). The interesting thing here is that pretty much every single Intel chip (50% of the market) is switched to the D3D codepath in Google Earth, which means Intel's OpenGL drivers must be rather bad.

    It is safe to assume that Google Earth is forced to fall back to D3D on Windows due to OpenGL driver issues. It is also safe to assume that Google would have preferred to use OpenGL only, if that was feasible (due to #1, #2 and #3). Unfortunately, it isn't.

    (*) Last time I checked was on version 3, but I somehow doubt anything has changed since then.
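    The actual format of those configuration files isn't shown here, but the mechanism they describe amounts to a per-chipset lookup table. A purely hypothetical sketch (the chip names, reasons, and structure below are illustrative, not Google Earth's real data):

```python
# Hypothetical per-chipset override table: maps a substring of the
# GL_RENDERER string to a forced renderer and the reason for the override.
OVERRIDES = {
    "Intel 945": ("d3d", "crashes when using mipmaps under OpenGL"),
    "Intel 965": ("d3d", "incorrect rendering in OpenGL mode"),
}

def pick_renderer(gl_renderer_string: str, default: str = "opengl") -> str:
    """Return the renderer to use for a given video chip."""
    for chip, (renderer, _reason) in OVERRIDES.items():
        if chip in gl_renderer_string:
            return renderer  # known-bad OpenGL driver: force the override
    return default

print(pick_renderer("Intel 945GM"))   # d3d
print(pick_renderer("GeForce 8800"))  # opengl
```

    With pretty much every Intel entry pointing at "d3d", the table itself becomes the evidence for how bad those OpenGL drivers are.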
    Last edited by BlackStar; 25 December 2009, 12:53 PM. Reason: syntax


  • yotambien (Senior Member)
    replied
    Originally posted by BlackStar View Post
    Because they are 50% of the market and Direct3D works on that 50%. Why do you think Google Earth ships with both D3D and OpenGL support at the same time?
    These Christmas liqueurs are probably killing my neurons. I can't follow your argument anymore. Why is it that Google Earth ships with both D3D and OpenGL?


  • BlackStar (Senior Member)
    replied
    Originally posted by Remco View Post
    I can't really compare, as I have only learned to use OpenGL, but I find it very simple to use. From what I remember of D3D tutorials, you have to muck about with the Windows API, where everything has funny ALL_CAPS names. It's not pretty.
    It is the same with OpenGL, unless you use a library that takes care of the details for you (like SDL). The setup process for OpenGL and D3D is very similar: you need a window and a device context. You then query the available pixel formats, change the pixel format of the window and finally instantiate an OpenGL/D3D context on that window/DC. (The process is analogous on X, where you have a display connection, screen, window and visuals, and on most other platform APIs.)

    The differences start after you pass the setup stage.

    ...binary shaders...
    What's that for?
    Load times, mainly. With OpenGL, you need to compile/link shaders every time you start your program. Modern games can use hundreds or thousands of shaders, which translates into very long load times.

    Many of our OSS drivers don't even support 3D at all. (and they will never support Direct3D)
    Well, Linux is used by about ~1-2% of the computer-literate population. If 3D doesn't work for 10 or 20% of that number... you can see where I'm going. Numbers on the order of 0.1-0.4% can be safely ignored; those people wouldn't be interested in my applications anyway.

    The drivers that matter (at the moment) are the proprietary NVIDIA and AMD drivers. That's what the scientific community uses for Linux graphics.
    Indeed, if you are targeting the scientific community, you are more or less free to dictate your hardware requirements. If you are going after the home market, you cannot ignore Intel (as much as I hate to admit it). World of Goo, for example, would have lost more than half of its sales if it hadn't been able to run on Intel (and I'm sure the developers hated every minute spent working around compatibility issues).

    Surprisingly enough, Intel's open source drivers are decent: they support GLSL, VBOs and FBOs, which is enough to work with (bugs can be worked around; missing features cannot). Their closed-source drivers are where the problem lies: they haven't updated the GMA950 drivers since 2008 (that's what 99% of netbooks come with), their Poulsbo chips don't even support OpenGL, and their newer chips (HD4500 and the like) often fail to run perfectly valid OpenGL code.

    Check the forums at opengl.org to see how many problems Intel is causing OpenGL developers.

    Originally posted by dl.zerocool
    Well, this is partially true. Even if 50% of the market is using Intel IGPs, I don't really care about them; if they have bad OpenGL support, that's their problem to solve, not OpenGL's. Plus, if people are stupid enough to use their ultra-cheap solution, why should we care about them?
    Because they are 50% of the market and Direct3D works on that 50%. Why do you think Google Earth ships with both D3D and OpenGL support at the same time?

    To tell the truth, what I would like to see in the future of OpenGL is a merger and standardization of all 3D features, input management and sound management, like in DirectX.
    That will not happen as part of OpenGL. It's not its raison d'être.

    OpenGL is equivalent to Direct3D. They are graphics APIs.
    OpenAL is equivalent to DirectSound. They are sound APIs.
    OpenCL is equivalent to DirectCompute. They are compute APIs.

    Input and windowing are handled by the underlying OS, not by DirectX or OpenGL. (Yes, DX used to provide DirectInput, but it has been deprecated in favor of Win32 raw input. DX also provides XInput, but that is mostly meant for Xbox interop.)


  • Remco (Senior Member)
    replied
    Originally posted by BlackStar View Post
    Well, it is, if we want to be honest with ourselves. OpenGL 3.2 manages to close the gap somewhat, but it's still far behind D3D10 as far as API design, ease of use and stability are concerned.
    I can't really compare, as I have only learned to use OpenGL, but I find it very simple to use. From what I remember of D3D tutorials, you have to muck about with the Windows API, where everything has funny ALL_CAPS names. It's not pretty.

    (Yes, it offers more functionality in general, but (a) it's still missing binary shaders
    What's that for?
    and (b) 50% of the market is using Intel IGPs, which are synonymous with "bad OpenGL support" and (c) our OSS drivers don't even support GL2.1, much less 3.2).
    Many of our OSS drivers don't even support 3D at all (and they will never support Direct3D).

    The drivers that matter (at the moment) are the proprietary NVIDIA and AMD drivers. That's what the scientific community uses for Linux graphics.

