Why would d3d11x on Linux or OS X help Microsoft in the least? The strength of OpenGL (actually, OpenGL ES) in the marketplace is all about the handsets, which are all running completely proprietary driver stacks in any case. If you're using Linux on a mobile phone, chances are you're using the PowerVR SGX SDK, which is a proprietary EGL/OpenGL-ES/OpenVG stack.
In terms of higher-end rendering, the field is pretty much split: DirectX trounces OpenGL on the desktop and in gaming (we on Linux are nowhere close to the awesomeness that is DirectX 10 + Direct2D + DirectWrite), while OpenGL 1.x remains dominant in CAD.
The major gaming platforms that aren't using DirectX -- namely the PlayStation, Wii, and DS -- don't use OpenGL, either. Yes, Sony offers an OpenGL-ES-ish API, but nobody actually uses it, preferring the proprietary Sony API instead. The big rendering farms for movie studios and the like also don't use OpenGL; they use highly specialized rendering frameworks that touch the GPU only by way of OpenCL/DirectCompute/CUDA.
This is one of the reasons the OpenGL 3 "Longs Peak" API revamp was so disappointing. The CAD vendors who pressured Khronos into keeping backward compatibility are not the big consumers of the newer, higher-end features of OpenGL or DirectX. Frankly, games are by far the biggest users of those features, and every game developer I've worked with or talked to prefers DirectX -- as in they actually like the API better after using both, not just because it's all they've used. It's not that DirectX does anything OpenGL can't (although DX11 does have multithreaded rendering features that OpenGL lacks), or that DirectX is appreciably faster (though note that the OpenGL API does impose a small amount of unavoidable overhead that the DX API does not). It's just that the DirectX API is flat-out easier to use and easier to understand, and the shader language is apparently much better (I'm not a shader expert myself, but I've had so many game devs bitch to high heaven to me about how awful GLSL is compared to HLSL or Cg).
The idea of mimicking DX11 verbatim on Linux is not super interesting to me. The neat thing about DirectX is how little there is to the API. Every last bit of it is _already_ abstracted away in any moderately intelligently written game engine or application. Essentially, the API is just a way to create buffer objects, set a rendering context, and push vertex data to the GPU. The rest is high-level support routines like the shader compiler or texture and font loaders. The point is, you don't need DirectX on Linux so much as you need something that isn't OpenGL on Linux: an API that just does what it needs to, gets rid of the easy-to-get-wrong integer-handle design, gets rid of the useless editable object state, gets rid of the extraneous state-machine management, has a shader language actually designed for GPU programming rather than a cheap clone of C with no additional thought or design, and ships standard utility libraries focused on _high performance_ vector/matrix math, image format loading, and very basic font rendering. (And in a pinch you can lose the utility library, as it would only be used by simpler apps, not anything serious.)
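To make the integer-handle complaint concrete, here's a minimal sketch in C of what an opaque, immutable buffer object could look like. Every name here (`gpu_buffer`, `gpu_buffer_create`, and so on) is invented for illustration -- this is not any real API, just the shape of one: contents are fixed at creation, so there's no editable object state for a driver to revalidate, and a stale pointer fails loudly under a sanitizer instead of silently aliasing a recycled integer name the way a dangling GL object name can.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch only: an opaque buffer object whose contents are
 * immutable after creation. All identifiers are invented, not from any
 * real graphics API. */

typedef struct gpu_buffer {
    size_t size;          /* byte length of the stored data        */
    unsigned char *data;  /* private copy of the caller's contents */
} gpu_buffer;

/* Create a buffer by copying `size` bytes from `data`. There is no
 * "sub-data" update call by design: immutable objects mean the driver
 * never has to track or revalidate mutations behind the app's back. */
gpu_buffer *gpu_buffer_create(const void *data, size_t size)
{
    gpu_buffer *buf = malloc(sizeof *buf);
    if (!buf)
        return NULL;
    buf->data = malloc(size);
    if (!buf->data) {
        free(buf);
        return NULL;
    }
    memcpy(buf->data, data, size);
    buf->size = size;
    return buf;
}

/* Destroy a buffer; safe to call with NULL. */
void gpu_buffer_destroy(gpu_buffer *buf)
{
    if (buf) {
        free(buf->data);
        free(buf);
    }
}
```

The design point is that the handle *is* the object: there's no global name table to look up, no chance of binding the wrong integer, and ownership is as obvious as any other pointer.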
The cool thing is that Gallium3D now makes this really easy! By far the hardest part is the shader compiler, but even for that I'd be half-tempted to just clone Cg's language given how popular it is. The rest of it is almost nothing more than exposing a sanitized, stable version of Gallium's own API. Assuming Gallium can use DirectX or OpenGL as a backend (which, if it can't now, is certainly possible) rather than direct hardware/software rendering, such a new API could even be deployed to existing OSes that aren't running Gallium natively, and implemented that way it should have no more overhead than any other wrapper (like Cg over DirectX) has today.