Phoronix: Intel & The Shortcomings Of Gallium3D
While Gallium3D is viewed as the future of the 3D Linux graphics driver architecture, it's been in development for a long time and still doesn't have a solid following. In particular, Intel is still missing from the Gallium3D party...
Well, firstly I have to say that reading through the thread really helps in understanding the Gallium3D and classic Mesa DRI driver architectures. However, just as Keith Whitwell indicated, it is regrettable that Gallium3D doesn't yet have a well-working showcase driver, which discourages the community from putting more effort into it.
Despite all of this, I think the debate sounds like a "which came first, the chicken or the egg?" question, since no one will invest focused upfront effort in a Gallium driver. Nouveau might be one, but in my opinion it's not comparable with i915g or i965g, given that it lacks the necessary official documentation from Nvidia. Maybe we should learn something from the development cycle of KMS, GEM, and UXA: even though they were considered premature and buggy at the very first stage, forcing them into real-world use made them what they are today. So starting work on a solid i915g or i965g might be a better choice than continuing to waste time on duplicated code in the already bloated graphics stack.
Just my two cents as a newbie; please correct me if I've made a mistake.
Heh, like I said in another thread: this is going nowhere fast (at least not fast enough to ATTEMPT to maintain parity with graphics development).
This is once again a case of Intel vs. everybody else. Certainly I can see their point of view, but the fact that Gallium has not produced anything other than vapour in terms of working implementations means that:
1) Everybody is stabbing in the dark; Nouveau and Radeon have no choice, as they don't have the manpower to do everything in Mesa.
2) Intel might have the manpower to do it but does not want to invest the time.
Thus I would say that Gallium will either never arrive or never achieve any sort of feature parity with a closed driver (not considering performance parity, of course; hell, not even considering "good enough" performance).
I would say that this is an enjoyable little experiment. Somebody expecting something production-ready is really expecting too much. I'm not saying it is impossible, just improbable in a reasonable time frame (i.e. not 5 years down the line).
The way I see it:
Intel: lots of money, simple and less powerful gfx to program for => drivers for browsing internet, good enough but nothing spectacular
AMD/ATI: less money than Intel, more complex and more powerful gfx to program for => drivers for gaming, good enough performance and improving
NVidia: much less money than Intel, more complex and more powerful gfx to program for => drivers for gaming, very good performance
So for now Intel is good enough for most users but it might not be this way for long. It used to be that browsing the internet was a simple operation to show some text and some static images. Now we have live streaming of movies and soon enough will be having 3D games with WebGL. So Intel needs to step up or be left behind.
Originally Posted by leeyee
They won't get left behind.
Originally Posted by karl
Why? If you compare the number of people using Intel IGPs vs. everything else, the difference is staggering. You're looking at 70-80% of people using Intel graphics. The remaining ~20% is divided up between Nvidia, ATI, and VIA.
If web designers don't write their applications to run on the hardware people are using then they are excluding the vast majority of potential customers.
What Intel is doing makes sense. The ATI and Nouveau folks have proprietary drivers backing them up... they can more easily push new technologies, and users are not left in the lurch. There is low expectation of stability and functionality right now, so it makes sense to write the drivers to be as modern as possible.
Meanwhile Intel is responsible for shipping working hardware to customers. Stuff that comes with Linux pre-installed, and it needs to 'just work'. Telling customers that they should go out and compile a custom libdrm and whatnot is just completely out of the picture. By using KMS + UXA + MesaGL, the Intel folks have managed to get much of the benefit of using Gallium at a fraction of the development cost.
If ATI OSS folks had to ship out working drivers _today_ they would not be taking the approach they are. They would be doing what Intel is doing. But the ATI OSS effort is still looking forward into the future and not required to have working stuff yesterday.
Plus, since Intel's hardware is less capable than ATI's or Nvidia's, the crippling effect of Linux/Xorg's current asinine driver model is not affecting them as much. Once the ATI OSS folks start shipping working Gallium-based drivers and have the framework and the architectural design down pretty well, I fully expect Intel will follow.
I am hoping that most Linux users will understand this and purchase ATI hardware for its performance and groundbreaking (for Linux) OSS graphics technology, and depend on Intel to produce mobile devices that 'just work' and get terrific power savings and reliability.
I'd normally just say, "Go read my post in the thread," but it's not up there yet, so I'll just reproduce it here (don't worry, it's short):
1) I wrote most of a Gallium driver. By myself. It took OVER 9000 lines of code, but it happened. I'd say that an interface that permits one mediocre coder armed with docs to craft a working, simple driver in a couple of months (effectively three man-months, by my estimate) is a pretty good interface.
2) I worked by myself. Except for occasional patches from the
community (Marek, Joakim, Nicolai) and lately from Dave, the initial
bringup was something I had to do by myself, without assistance.
So what I'm seeing here is a chicken-and-egg problem where Gallium has
no drivers because nobody wants to write drivers for it because its
interface is unproven because it has no drivers... Now that we're
actually having real drivers for real hardware reaching production
quality, I think we can break this cycle and get people to start
contributing to Gallium, or at least bump down to the next level of
reasons why they won't write Gallium code. :3
Not that I'm saying the excuses are bad or wrong, but in the end r300g is 14.7 kloc and r300c is 26.9 kloc (and yes, that's not counting the shared shader compiler code), so the goal of "bring up drivers in less time, with less code" appears to have been achieved. We are almost reaching r300c performance levels, and beating it handily in certain benchmarks, so it is possible to write good new drivers on this codebase.
Yo man, I respect you, and I'mma let you finish, but r300-gallium is totally a working driver, and it spins glxgears faster than the old r300 driver. Just wanted to give that little shout-out.
Originally Posted by _txf_
Hope MostAwesomeDude gets a r600+ card soon... :P
I have an r600 (HD 3650) as well as an r800 (HD 5670). If you're asking about r600 in Gallium, Jerome and I talked about that, and we agreed that he should continue working on his r600g stack, which should be ready to share with the world soon.
Originally Posted by blindfrog
I get your point, but it's somewhat flawed. Indeed, the world does not revolve around Linux (at least not for the time being). There are Direct2D- and DXVA-capable drivers available for Intel/ATI/Nvidia at the moment, and considering that the Windows ecosystem constitutes the vast majority of the desktop market, I'm not sure any web developer would hesitate to develop something new, shiny, glossy, and fun. Hell, the lack of performant cards didn't stop Crytek in their endeavours (and spawned a joke or two).
Originally Posted by drag
Unless... you have a card with no vertex shaders
Originally Posted by MostAwesomeDude
All kidding aside, you're doing a great job, keep it up.
And don't forget about people with RS690 in the meantime