The Ohloh site includes rough estimates, based primarily on code size, of how long each project would take to rewrite from scratch. The estimate for Mesa is 291 developer-years:
Not something you knock together over the weekend
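For context, Ohloh's figures come from a COCOMO-style cost model driven by line counts. A rough sanity check (my own sketch, not Ohloh's actual code) using the basic COCOMO organic-mode formula lands in the same ballpark for a code base of roughly a million lines:

```python
def cocomo_organic_effort_years(sloc):
    """Basic COCOMO, organic mode: effort (person-months) = 2.4 * KLOC^1.05.

    Returns the estimate converted to person-years.
    """
    kloc = sloc / 1000.0
    person_months = 2.4 * kloc ** 1.05
    return person_months / 12.0

# ~1,000,000 lines (roughly Mesa's size) -> a bit over 280 person-years,
# close to Ohloh's 291:
print(cocomo_organic_effort_years(1_000_000))

# ~20,000 lines (a small "classic" HW driver) -> around 4-5 person-years:
print(cocomo_organic_effort_years(20_000))
```

The exact multipliers Ohloh uses may differ, but the point stands: effort estimates of this kind scale slightly super-linearly with code size.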
Mesa Receives Some OpenGL 3 Love
-
One thing that's not obvious is just how big and complicated the hardware *independent* part of an OpenGL driver is. Mesa has almost a million lines of hardware-independent code, and proprietary OpenGL drivers are *much* larger.
The hardware-dependent part is tiny by comparison - maybe ~20,000 lines for an older GPU and ~50,000 lines for a newer GPU ("classic" mesa HW drivers in both cases), with generally smaller numbers for a Gallium3D driver.
Even a 20,000 LOC driver is a non-trivial development effort, of course - maybe 6-7 developer-years for "finished" code, or 2-3 developer-years to get to the level of current "production-ish" open source drivers.
Extrapolate that to a code base the size of Mesa, and you see why Mesa keeps evolving rather than being "replaced". That said, Mesa has been evolving for 17 years now and I imagine that most of the code has been completely replaced at least once during that time.
Leave a comment:
-
Originally posted by bridgman View Post
I should mention that the emulated GPU on VMWare clients probably has the *best* Gallium3D support right now, but I imagine you want to run on a real GPU, not the figment of some developers' imagination.
Leave a comment:
-
Originally posted by whizse View Post
Not using Mesa as a state tracker does seem to be a long term goal:
http://wiki.x.org/wiki/SummerOfCodeIdeas
This would keep the "GL to hardware layer" portions of Mesa (ie the state tracker) and use Gallium3D as the only hardware layer, so software rendering would use Gallium3D and softpipe/llvmpipe instead of the "classic" software renderer.
In other words, it would get rid of everything in Mesa that *wasn't* part of the state tracker.
Leave a comment:
-
I should mention that the emulated GPU on VMWare clients probably has the *best* Gallium3D support right now, but I imagine you want to run on a real GPU, not the figment of some developers' imagination.
Leave a comment:
-
Not using Mesa as a state tracker does seem to be a long term goal:
Add a pure Gallium state tracker for OpenGL 3.0. Right now, Gallium uses Mesa as its state tracker. However, since the Mesa source code also implements software rendering, as well as old-style DRI drivers, this results in a lot of cruft and in particular holds back the addition of new OpenGL features (as those features must be supported in the whole mesa first). The performance of Gallium also suffers, since the state tracker does a lot of things that are later on duplicated in Gallium. This project involves duplicating the current Mesa state tracker and removing all the legacy bits. Then the student will add the necessary state tracker functionality for supporting OpenGL 3.0.
Leave a comment:
-
Originally posted by FireBurn View Post
But is replacing it a long term goal?
KMS replaced UMS once it became stable in the DDX driver (yes, not as much code) and I'm guessing the UMS code will be ripped out of the kernel as soon as Linus lets the developers - well, for Intel anyway.
Originally posted by FireBurn View Post
I'm kind of disappointed that the Nouveau folk changed their minds about using Gallium for the fixed pipeline cards. It would have been nice if eventually all cards were supported natively under Gallium
Originally posted by FireBurn View Post
I'd really like to test Gallium and learn how to add to it. Both on the desktop and my PS3 (cell driver). Do you know a good place to start?
It might be a bit early to test the xorg state tracker (a ddx that uses Gallium3D for acceleration rather than GPU-specific code).
Originally posted by FireBurn View Post
Also what's the Python statetracker? Also what's the difference between llvmpipe and gallivm? (The v isn't a typo)
AFAIK llvmpipe is a software renderer which uses LLVM to translate shader programs into optimized CPU code, while gallivm does the same thing but generates GPU shader code and is part of a hardware-accelerated driver. Stephane Marchesin's slides from FOSDEM 2009 mention gallivm:
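To make the "translate shader programs into optimized code" idea concrete, here's a deliberately toy Python analogue (my own illustration, nothing like the real thing - llvmpipe emits LLVM IR and JIT-compiles it to machine code): the shader expression is compiled once up front, and only the compiled form runs per fragment, instead of re-parsing the shader for every pixel.

```python
def compile_shader(expr):
    """'Compile' a fragment-shader-like expression over inputs r, g, b.

    Compilation happens once here; the returned function is what runs
    per fragment. (Toy analogue: real drivers compile to machine code.)
    """
    code = compile(expr, "<shader>", "eval")

    def run(r, g, b):
        # Empty __builtins__ keeps the toy "shader" to pure arithmetic.
        return eval(code, {"__builtins__": {}}, {"r": r, "g": g, "b": b})

    return run

# A trivial "shader" that converts a colour to luminance (hypothetical
# example expression, standard Rec. 601 weights):
luma = compile_shader("0.299 * r + 0.587 * g + 0.114 * b")
print(luma(1.0, 1.0, 1.0))  # white -> luminance of (approximately) 1.0
```

The split between "compile once" and "run many times" is the essential shape; the LLVM-based versions just target real instruction sets instead of Python bytecode.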
Leave a comment:
-
If someone here is using Arch Linux - I have a great PKGBUILD script for mesa-git, which compiles the r300g driver and installs it. Switching from the classic driver to gallium (and vice versa) whenever you wish is very convenient: a simple terminal command does the trick.
Leave a comment: