Gallium3D To Enter Mainline Mesa Code


  • #16
    Right. Intel's API won't have anything to do with graphics (at least not primarily). http://techresearch.intel.com/articl...Scale/1514.htm



    • #17
      Originally posted by chaos386
      That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.
      Uhm... about programming it directly... Wasn't Larrabee supposed to have something like 100 cores? I'm curious what percentage of game programmers have enough knowledge of multi-core systems to split a ray tracing algorithm optimally across all of those cores.
      Well yeah, that's why we'll be stuck with DirectX 11 in the future too, and Larrabee won't be the big thing that solves the whole graphics API mess.
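
      Even the "easy" version of that split, where each core just grabs a tile of the image from a shared queue, already looks something like the sketch below (pthreads, with a trace_pixel() stub standing in for the actual ray tracer; the names and numbers are made up for illustration). The genuinely hard parts, shared acceleration structures, secondary rays crossing tiles, load imbalance, aren't even in there yet.

      #include <pthread.h>
      #include <stdint.h>

      #define WIDTH   1920
      #define HEIGHT  1080
      #define TILE    64          /* pixels per tile edge */
      #define NCORES  100         /* "something like 100 cores" */

      static uint32_t framebuffer[WIDTH * HEIGHT];

      /* Stand-in for the real ray tracer: shade one pixel. */
      static uint32_t trace_pixel(int x, int y)
      {
          return (uint32_t)(x ^ y);   /* placeholder */
      }

      /* Shared work queue: index of the next unclaimed tile. */
      static int next_tile = 0;
      static pthread_mutex_t tile_lock = PTHREAD_MUTEX_INITIALIZER;

      static void *worker(void *arg)
      {
          (void)arg;
          int tiles_x = (WIDTH + TILE - 1) / TILE;
          int tiles_y = (HEIGHT + TILE - 1) / TILE;
          int total   = tiles_x * tiles_y;

          for (;;) {
              /* Grab the next tile nobody has claimed yet. */
              pthread_mutex_lock(&tile_lock);
              int t = next_tile++;
              pthread_mutex_unlock(&tile_lock);
              if (t >= total)
                  break;

              int tx = (t % tiles_x) * TILE;
              int ty = (t / tiles_x) * TILE;
              for (int y = ty; y < ty + TILE && y < HEIGHT; y++)
                  for (int x = tx; x < tx + TILE && x < WIDTH; x++)
                      framebuffer[y * WIDTH + x] = trace_pixel(x, y);
          }
          return NULL;
      }

      int main(void)
      {
          pthread_t threads[NCORES];
          for (int i = 0; i < NCORES; i++)
              pthread_create(&threads[i], NULL, worker, NULL);
          for (int i = 0; i < NCORES; i++)
              pthread_join(threads[i], NULL);
          return 0;
      }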



      • #18
        According to an earlier post on Phoronix, Gallium3D already has a Direct3D state tracker internally at Tungsten Graphics; it just isn't public yet...

        If Gallium3D implements Direct3D, then Wine could simply pass Direct3D calls straight to Gallium3D, which might let the open-source drivers offer better Windows gaming support than Nvidia's.

        Also, Gallium gives any Gallium driver a generic XvMC state tracker essentially for free, and VA-API will be supported soon as well.

        I'm wondering if Gallium will support VDPAU too (to some extent, at least...).
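
        That's the whole appeal of the state tracker design, as far as I understand it: the API-specific code (OpenGL, a Direct3D tracker fed by Wine, XvMC, VA-API...) is written once against a common driver-facing interface, and every Gallium driver picks it up. Very roughly, and with made-up types and names rather than the real Gallium headers:

        /* Drastically simplified, hypothetical stand-in for the driver-facing
         * side of Gallium3D.  Each hardware driver implements this once. */
        struct pipe_like_context {
            void (*set_shader)(struct pipe_like_context *ctx, const void *bytecode);
            void (*set_vertex_buffer)(struct pipe_like_context *ctx,
                                      const void *verts, unsigned count);
            void (*draw)(struct pipe_like_context *ctx, unsigned num_vertices);
        };

        /* A state tracker is API-specific but driver-agnostic: it translates
         * calls from one API into the common interface above. */
        struct d3d_like_device {
            struct pipe_like_context *pipe;   /* whichever driver we were given */
        };

        /* Hypothetical Direct3D-style DrawPrimitive, written purely in terms
         * of the common interface, so every driver gets it for free. */
        static void d3d_like_draw_primitive(struct d3d_like_device *dev,
                                            const void *verts, unsigned count)
        {
            dev->pipe->set_vertex_buffer(dev->pipe, verts, count);
            dev->pipe->draw(dev->pipe, count);
        }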



        • #19
          Originally posted by chaos386
          That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.
          Considering that D3D and OpenGL do nothing to actually describe the CSG-style ray-traced rendering they're plugging (even in the latest Game Developer Magazine they're doing it...), they're going to have to come up with something that handles that description. Bolting extensions onto D3D and OpenGL doesn't make sense, because that would make it quite a bit more painful to do. As for directly programming it... heh... I doubt everyone's going to be driving it all themselves. There'll be some API or toolchain (à la IXP Microengine C for their former network processor chips...), or both, that people will use to make things go.
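
          To make the description problem concrete: a CSG ray tracer combines solids with boolean set operations on the [t_enter, t_exit] intervals a ray spends inside each solid, and nothing in the D3D/GL draw-call model expresses that. A toy sketch, assuming convex solids that give one interval per ray (the names are my own, not from any real API):

          /* Interval a ray spends inside one convex solid. */
          struct interval { double t_enter, t_exit; int hit; };

          /* A intersect B: the ray is inside both solids at once. */
          static struct interval csg_intersection(struct interval a, struct interval b)
          {
              struct interval r;
              r.t_enter = a.t_enter > b.t_enter ? a.t_enter : b.t_enter;
              r.t_exit  = a.t_exit  < b.t_exit  ? a.t_exit  : b.t_exit;
              r.hit     = a.hit && b.hit && r.t_enter < r.t_exit;
              return r;
          }

          /* A minus B, simplified: only handles B biting into the front of A. */
          static struct interval csg_difference(struct interval a, struct interval b)
          {
              struct interval r = a;
              if (a.hit && b.hit && b.t_enter <= a.t_enter && b.t_exit > a.t_enter) {
                  if (b.t_exit >= a.t_exit)
                      r.hit = 0;                 /* B swallows A along this ray */
                  else
                      r.t_enter = b.t_exit;      /* entry pushed back to B's exit */
              }
              return r;
          }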

