DirectX 10/11 Coming Atop Gallium3D


  • haplo602
    replied
    Originally posted by BlackStar View Post
    Current D3D drivers are easily an order of magnitude more stable than OpenGL, not least because they use a common HLSL compiler instead of 5 different ones.

    In fact, today I was debugging a GLSL shader that's interpreted differently by all three major vendors. With a tiny tweak, I can cause an access violation on ATI, a black screen on Nvidia and multicolored sparkly rendering on Intel (with current drivers). Three mutually incompatible interpretations of the same code - beat that!

    I dread to think what will happen if I add Mesa and OS X drivers into the mix...

    The solution is Mesa... once it is used on all Linux platforms and kept up to date with the latest OpenGL version, you'll only need one code path :-)

    scary but true ...

  • airlied
    replied
    Originally posted by md1032 View Post
    This is silly. New applications should use OpenGL, and Wine can use the DX interoperability features of OpenGL 3.2 (vertex_array_bgra, provoking_vertex, etc.). This is just wasted effort that would be better spent on OpenGL 3 support.

    Only if you have no clue what GL 3 and 3.2 are. They are mainly GL implementations of DX10/11 bits.

    And since VMware needs to write a DX10/11 state tracker anyway for their OS drivers, I guess it's not a wasted effort for them.

    Dave.

  • md1032
    replied
    This is silly. New applications should use OpenGL, and Wine can use the DX interoperability features of OpenGL 3.2 (vertex_array_bgra, provoking_vertex, etc.). This is just wasted effort that would be better spent on OpenGL 3 support.

  • airlied
    replied
    no source for DX state trackers

    VMware won't be releasing any source for DX* state trackers.

    Zack was talking about the Gallium core getting support for DX10/11 *features* so they could build a DX10/11 state tracker on top of them.

  • Remco
    replied
    Originally posted by BlackStar View Post
    Current D3D drivers are easily an order of magnitude more stable than OpenGL, not least because they use a common HLSL compiler instead of 5 different ones.

    In fact, today I was debugging a GLSL shader that's interpreted differently by all three major vendors. With a tiny tweak, I can cause an access violation on ATI, a black screen on Nvidia and multicolored sparkly rendering on Intel (with current drivers). Three mutually incompatible interpretations of the same code - beat that!

    I dread to think what will happen if I add Mesa and OS X drivers into the mix...

    The nice thing about Mesa is that it will be used for all open source drivers. If it renders a particular shader instruction as a multicolored sparkly screen, at least it will do so across the board.

  • BlackStar
    replied
    Originally posted by Remco View Post
    I'm pretty sure driver developers are perfectly able to load the Direct3D state trackers with bugs too.

    Current D3D drivers are easily an order of magnitude more stable than OpenGL, not least because they use a common HLSL compiler instead of 5 different ones.

    In fact, today I was debugging a GLSL shader that's interpreted differently by all three major vendors. With a tiny tweak, I can cause an access violation on ATI, a black screen on Nvidia and multicolored sparkly rendering on Intel (with current drivers). Three mutually incompatible interpretations of the same code - beat that!

    I dread to think what will happen if I add Mesa and OS X drivers into the mix...

  • Remco
    replied
    Originally posted by BlackStar View Post
    OpenGL drivers are so buggy it's not even funny anymore. Yes, that includes Nvidia, too.

    I'm pretty sure driver developers are perfectly able to load the Direct3D state trackers with bugs too.

  • deanjo
    replied
    Originally posted by droidhacker View Post
    2) Waste resources in bad areas that could be used in more universally useful areas, like GPU-independent video decode acceleration.

    DXVA2 on Linux... that's one hell of an idea!

  • Craig73
    replied
    Originally posted by BlackStar View Post
    quad buffer stereo.

    Ha - the first thing that came to mind was that you meant 4 layers of buffering for stereo sound on Linux... that would be a unique Linux feature that explains the lag in Skype with PulseAudio :-)

  • Kalessian
    replied
    Originally posted by RealNC View Post
    I just have to laugh at the responses here. Who do you think you people are? Seriously. Your community (as in "Linux community", whatever that is to you; to me it's the money behind Novell and Red Hat coupled with a few lunatics) is irrelevant. When are you going to grasp that?

    Who is this guy? I've never heard of him, so nothing in his post matters to me.

    Really, though, only the rich deserve to express their opinions.
