Gallium3D To Enter Mainline Mesa Code

  • Svartalf
    replied
    Originally posted by chaos386
    That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.
    Considering that D3D and OpenGL do nothing to actually describe CSG-style ray-traced rendering, which is the thing they're plugging (they're even doing it in the latest Game Developer Magazine...), they're going to have to come up with something that handles that description. Bolting extensions onto D3D and OpenGL doesn't make sense because it would make the job quite a bit more painful. As for programming it directly... heh... I doubt everyone is going to be driving it all themselves. There'll be some API or toolchain (à la IXP Microengine C for their former network engine chip...), or both, that people will use to make things go.

  • some-guy
    replied
    According to an earlier post by Phoronix, Gallium3D has a Direct3D state tracker internally at Tungsten Graphics; it isn't public yet...

    If Gallium3D implements Direct3D, then Wine could simply pass Direct3D calls straight to G3D (sketched below), which might allow the OSS drivers to offer better Windows gaming support than Nvidia's.

    Also, having Gallium means a generic XvMC state tracker works instantly on any Gallium driver, and VA-API will be supported soon as well.

    I'm wondering if Gallium will support VDPAU (to some extent, at least...)
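    Purely as an illustration of that pass-through idea, here is a minimal C++ sketch. None of the names below (D3DStateTracker, GalliumBlendState, the wine_d3d_* functions) are real Wine or Gallium3D interfaces; the real D3D state tracker isn't public, so this only contrasts translating every call to OpenGL with handing D3D-level state to a native state tracker.

    ```cpp
    // Hypothetical sketch only: invented names, not real Wine/Gallium3D code.
    #include <cstdint>
    #include <cstdio>

    // Assumed API-level state object a state tracker would hand to the pipe driver.
    struct GalliumBlendState {
        bool     blend_enable;
        uint32_t src_factor;
        uint32_t dst_factor;
    };

    // Hypothetical native D3D state tracker sitting on top of a Gallium driver.
    class D3DStateTracker {
    public:
        void set_blend_state(const GalliumBlendState& bs) {
            std::printf("pipe driver: blend=%d factors=%u/%u\n",
                        bs.blend_enable, bs.src_factor, bs.dst_factor);
        }
        void draw_indexed(uint32_t index_count) {
            std::printf("pipe driver: draw %u indices\n", index_count);
        }
    };

    // Today's path, roughly: Wine rewrites each D3D call as OpenGL calls.
    void wine_d3d_via_opengl(uint32_t index_count) {
        // glBlendFunc(...); glDrawElements(...);  // per-call translation layer
        std::printf("translated to OpenGL: draw %u indices\n", index_count);
    }

    // Proposed path: forward the same D3D-level state to the state tracker.
    void wine_d3d_via_gallium(D3DStateTracker& st, uint32_t index_count) {
        st.set_blend_state({true, 5 /*SRC_ALPHA*/, 6 /*INV_SRC_ALPHA*/});
        st.draw_indexed(index_count);   // no OpenGL round trip in between
    }

    int main() {
        D3DStateTracker st;
        wine_d3d_via_opengl(36);
        wine_d3d_via_gallium(st, 36);
        return 0;
    }
    ```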

  • NeoBrain
    replied
    Originally posted by chaos386
    That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.
    Uhm... about programming it directly... Wasn't Larrabee supposed to have something like 100 cores? I'm curious what percentage of game programmers have enough knowledge of multi-core systems to split the ray-tracing algorithm optimally across all those cores.
    Well, yeah, that's why we'll be stuck with DirectX 11 in the future too, and Larrabee won't be the big thing that solves the whole graphics API mess.
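    As a side note on that point: the split itself is mostly mechanical, since primary rays are independent; the hard part is doing it optimally (load balancing, cache behaviour, ray coherence). Below is a minimal C++ sketch of the naive row-interleaved split, with trace_pixel() as a stand-in for a real ray tracer, just to show what "splitting across cores" means at its simplest.

    ```cpp
    // Naive data-parallel split of a ray-traced frame across hardware threads.
    #include <algorithm>
    #include <cstdio>
    #include <thread>
    #include <vector>

    static float trace_pixel(int x, int y) {
        // Stand-in for "shoot a ray, find the nearest hit, shade it".
        return static_cast<float>(x ^ y) * 0.001f;
    }

    int main() {
        const int width = 640, height = 480;
        std::vector<float> image(static_cast<size_t>(width) * height);

        const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
        std::vector<std::thread> workers;

        for (unsigned c = 0; c < cores; ++c) {
            workers.emplace_back([&image, c, cores, width, height] {
                // Interleave rows across workers; rays are independent,
                // so the per-pixel writes need no locking.
                for (int y = static_cast<int>(c); y < height; y += static_cast<int>(cores))
                    for (int x = 0; x < width; ++x)
                        image[static_cast<size_t>(y) * width + x] = trace_pixel(x, y);
            });
        }
        for (auto& w : workers) w.join();

        std::printf("traced %zu pixels on %u threads\n", image.size(), cores);
        return 0;
    }
    ```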

  • Regenwald
    replied
    Right. Intel's API won't have anything to do with graphics (at least not primarily). http://techresearch.intel.com/articl...Scale/1514.htm

  • chaos386
    replied
    Originally posted by deanjo
    Rumors have it Intel is going to try to push its own API

    http://www.fudzilla.com/index.php?op...11361&Itemid=1
    That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.

  • rohcQaH
    replied
    Originally posted by ethana2
    I'm talking about implementing a Direct3D API that WINE can just pass D3D to, instead of having the WINE folks try to reimplement it in OpenGL, so they can focus on the actual win32 API.
    The problem with implementing DirectX is figuring out what DX is supposed to do. It's poorly documented, and the internals are unknown to anyone but a few black-voodoo priests at Microsoft. A black-box implementation will always be flawed, whether it's built on top of OGL or G3D.

    There are also versions of Wine for Mac, *BSD and possibly others. OGL is available on all of those; G3D isn't. Even if there were a DX implementation on top of G3D on Linux, the DX->OGL wrapper in Wine would still have to be maintained. Adding G3D support does not simplify maintenance; it adds additional code.


    Still, having a DX API on top of G3D might be good for the few companies trying to port their games, but most of them use engines with an OGL backend anyway (UT/id). And there might be some performance improvements in Wine.
    Compared to the work a DX implementation on G3D would take, it's hardly worth it.

  • NeoBrain
    replied
    Originally posted by mirza
    Of course we will use some API, but drawing a scene with ray tracing is completely different from current rendering. Currently, there is a lot of work the CPU must do to paint an image that "feels" realistic. That's why we really need C/C++ right now. With ray tracing, you just set up a scene: no object simplification (removing points) is needed, no tricks for shadows, no calculating which parts of the scene you can or can't see (to reduce scene size for faster rendering), etc. You can just send a massive scene to the GPU and use the CPU only to rearrange objects in it. That would greatly simplify the code for the graphical part of a game. AI and physics must still be done on the CPU, but Java (or .NET in the case of MS) can handle that, I am pretty sure. The benefits are obvious (debugging a multithreaded app is one clear example).
    Yeah, that's why I was pointing to the release of DirectX 11 (or 11.1?), which will support ray tracing (or which, at least, is said to support it).

    However, I doubt that game developers will really accept Intel's own API (if it really is creating one) unless Intel does something to maintain backwards compatibility. On the other hand, you can't really implement a backwards-compatible ray tracer on current systems, as that would simply screw up the whole API ;D

    Look at how the company behind PhysX (now NVIDIA, of course) handled its new API: it provided a general set of functions that made use of PhysX technology when available, but also provided a fallback mechanism with "conservative" methods. Thus quite a few developers adopted it, as there was no risk of losing support for older hardware.
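    For what it's worth, that "accelerate if available, fall back otherwise" pattern is easy to picture in code. The sketch below is purely illustrative; AcceleratorProbe and the simulate_* functions are made-up names, not the real PhysX (or any Intel) API.

    ```cpp
    // Capability check with a conservative fallback path (illustrative only).
    #include <cstdio>

    struct AcceleratorProbe {                 // hypothetical capability query
        static bool hardware_available() { return false; }  // pretend probe result
    };

    static void simulate_step_accelerated(float dt) {
        std::printf("accelerated step, dt=%.3f\n", dt);
    }

    static void simulate_step_conservative(float dt) {
        // Conservative CPU path: slower, but runs on older hardware.
        std::printf("fallback step, dt=%.3f\n", dt);
    }

    // Callers use one entry point, so older hardware keeps working either way.
    static void simulate_step(float dt) {
        if (AcceleratorProbe::hardware_available())
            simulate_step_accelerated(dt);
        else
            simulate_step_conservative(dt);
    }

    int main() {
        for (int frame = 0; frame < 3; ++frame)
            simulate_step(1.0f / 60.0f);
        return 0;
    }
    ```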

  • deanjo
    replied
    Rumors have it Intel is going to try to push its own API:

    http://www.fudzilla.com/index.php?op...11361&Itemid=1

  • ethana2
    replied
    Originally posted by chelobaka
    There is no reason to implement D3D in Gallium3D considering the amount of work needed. WINE does its job well enough. Most of the apps already created for D3D will never be ported to Linux, even if we have a D3D-compatible API, because the devs are stuck on the M$ platform.
    I'm talking about implementing a Direct3D API that WINE can just pass D3D to, instead of having the WINE folks try to reimplement it in OpenGL, so they can focus on the actual win32 API. What we have now is NOT okay: it's buggy as heck, and there's a performance hit from translation and poor optimization that Gallium should eliminate. I'm thinking it'd be a ton easier to code it with Gallium than it would be to do it in GLSL.

  • rbmorse
    replied
    Originally posted by chelobaka
    You can blame the Khronos Group for its poor OpenGL 3.0 spec, which leaves free software far behind Windows in the gaming world.
    Look at the membership of Khronos. You won't find many game or FOSS proponents on that list.
