Point taken, thanks for the educational session. I probably got on the wrong track when you started talking about VMware instead of Tungsten Graphics. The most sensible approach, by what you said, seems to me to be that VMware would also write one or more new Windows display drivers to be used inside a virtual machine; those would in turn pass instructions to the Gallium3D DDI state tracker, so they'd get away easy without having to write that Direct3D API themselves at all (unless they can get by with some existing driver in Windows). So yeah, probably nothing native in sight.
The most sensible approach, by what you said, seems to me to be that VMware would also write one or more new Windows display drivers to be used inside a virtual machine; those would in turn pass instructions to the Gallium3D DDI state tracker, so they'd get away easy without having to write that Direct3D API themselves at all.
I'm pretty sure that's where they're going, yeah. The glue between the VM and the host system would be some Gallium API derivative, and on the host side the Gallium state/TGSI is converted back to Direct3D or OpenGL. I'm not sure how that last part would go, but it's probably doable. Easier than the D3D-on-OGL stuff that Wine has to do, anyway.
I worked on WineD3D for a while a couple of years ago and have chatted in IRC with the other developers recently about Gallium3D.
The general consensus was that we should focus on OpenGL as a D3D backend for now, since it will always need to be supported anyway. We would definitely look at some kind of Wine state tracker, given time, when:
a) it is a little further developed, and
b) we have the man-power to do it.
Since I'd worked on wined3d a little in the past, I expressed my interest in working on this; however, I am also very busy right now and can't possibly fit it in given my other priorities. It's likely that by the time I become available to work on it, someone else will already have begun the work.
I would also caution that while it would theoretically provide a speed boost, that boost would only come with time, purely because for now, at least while using the features of D3D directly supported by OpenGL, we are still really only calling:
Game -> WineD3D -> OpenGL (into the driver)
Game -> D3D -> Driver Interface
The amount of indirection is similar (with the exception of those vendors that provide their own D3D interface).
When going to Gallium, we theoretically have:
Game -> GalliumWineD3D -> Driver
Which would still be roughly three levels, although the benefit is that the latest D3D features could appear in Wine more quickly than they do now, since we would go through Mesa/Gallium rather than OpenGL for that middle layer.
I would also caution that while it would theoretically provide a speed boost, that boost would only come with time.
I wasn't purely talking about speed, though benefits might come in that area too. Unless the WineHQ project's own documentation is out of date, the DirectX->OpenGL mapping needs to waste card constants. Then again, if the choices are "runs slowly" and "doesn't run at all", it might be a bit tricky to decide which one you want.
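To illustrate the constant-wasting issue (this is a hypothetical GLSL sketch of the problem the WineHQ docs describe, not actual WineD3D source; the names `vs_c` and `pos_fixup` are made up for the example):

```glsl
// Hypothetical sketch: the D3D vertex shader constant registers c0..c255
// get mapped onto one big GL uniform array...
uniform vec4 vs_c[256];

// ...but the translation layer also needs internal uniforms of its own,
// e.g. to patch up coordinate-system differences between D3D and GL.
// Every such helper uniform is one less constant slot left for the game,
// which matters on hardware whose uniform budget barely covers the
// D3D register file in the first place.
uniform vec4 pos_fixup; // hypothetical internal position-correction constant

void main()
{
    // A game shader that assumed it had the full register file now shares it.
    gl_Position = vs_c[0] * pos_fixup;
}
```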
Hmm, I also realized they probably want to do that same virtualization thing on Windows too, and that's why they had Windows in the graph. As in, you run native Gallium3D on Windows, and Gallium3D calls from within Linux in VMware's virtualization solution get passed to the native one, which in turn lets the system's 3D acceleration stuff handle the rest. I'd say the Tungsten Graphics guys are pretty damn smart; this means much simpler 3D virtualization across all the platforms Gallium3D gets ported to.
I'm not really into graphics programming, but I have a strong interest in FOSS and all the cool Gallium3D stuff emerging right now.
Despite the appealing "one framework to rule 'em all" idea behind Gallium3D, there is one thing that makes me wonder:
There are people who predict the near end of traditional graphics cards with shaders and so on, one of them being Tim Sweeney in "The End of the GPU Roadmap". According to him, graphics will soon be rendered 100% in software again, on normal general-purpose CPUs.
I guess the upcoming Larrabee GPU follows this direction he describes.
But how does Gallium3D fit into the picture? Does Gallium3D have a place in pure software rendering?
It does have softpipe, which lets you do the rendering on the CPU, but I doubt Tim Sweeney's predictions will come true, at least in a sensible time period. Probably gonna happen like with IPv6, as in it's always coming in the next ten years (as it has been for decades).
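For anyone who wants to try the CPU path: on a Mesa/Gallium stack you can typically force the software rasterizer per-process with an environment variable (whether these variables are honoured depends on how your Mesa was built):

```shell
# Ask Mesa's Gallium loader to use the reference software rasterizer
# ("softpipe") for just this one program, then print which renderer
# actually got picked.
GALLIUM_DRIVER=softpipe glxinfo | grep "OpenGL renderer"

# Alternatively, make all GL apps in this shell fall back to software
# rendering regardless of available hardware drivers.
export LIBGL_ALWAYS_SOFTWARE=1
```

If the override took effect, the renderer string reported by glxinfo should name the software rasterizer instead of your GPU.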
There's certainly going to be more convergence between GPU and CPU. Though even Intel, with Larrabee, admit that they need dedicated texture-filtering hardware, and a big part of the performance in AMD and NVIDIA GPUs comes from dedicated Z-buffer optimizations. So it would be pretty natural to have a Larrabee pipe driver for Gallium3D alongside the other hardware drivers.