Mixed open/closed source driver discussion continued here : http://www.phoronix.com/forums/showthread.php?t=7647
Intel document discussion, carry on...
It doesn't help that even IF the driver works well enough in 3D and doesn't bog down your system, the 2D performance is seriously subpar and things like FBOs are simply not implemented in the new codebase. Seriously. Everyone's trying to get off the PBuffers wagon (which is a dead mess to work with and horribly non-portable...) and onto the FBO wagon to do programmatically generated texture surfaces for rendering. FBOs make render-to-texture part of the OpenGL API itself: easy to use, less resource-hungry, and no windowing environment required.
I could live with and even maybe tolerate that if the pace of opening up were moving faster. Because there'd be a concrete, definite time that we'd see being able to honestly use the hardware.
Wanted to comment on the Project Larrabee mention. Unless something has radically changed, Intel isn't using that project to get into competition with AMD and Nvidia. According to a report posted by ArsTechnica here: http://arstechnica.com/news.ars/post...-larrabee.html, the Larrabee project isn't exactly shaping up to be a killer GPU to compete with the shader-model-based GPUs.
While Intel might indeed be aiming for the high performance GPU market... keep this article I wrote back in 2006 in mind : http://zerias.blogspot.com/2006/12/f...-or-intel.html
Well, the Larrabee will be a huge gift for video encoding/decoding.
"Intel isn't using that project to get into competition with AMD and Nvidia."

Of course they are. Those are Intel's main competitors in most everything they do.
It's just that it's confusing that Intel is not aiming for peak DirectX 10/11 gaming performance. Which is fine with me, that's not something I particularly care about.
The GPU is not just for accelerating games anymore. It's a co-processor that can be used to augment the overall performance of your machine. What runs on top of them is software like anything else... and Intel seems to be aiming at making their GPU easy to use for lots of different tasks in addition to gaming.
"According to Mesa's site, they're advertising 2.1 support where it's available and since the bulk of support isn't in the driver layer but in the API layer..."

Ya.. OpenGL does not work like DirectX. If Mesa supports 2.1, then all the drivers that are based on that version support 2.1 also, more or less.
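To see what version the Mesa/driver stack actually advertises on a given box, you can grep the output of glxinfo. A minimal sketch of that check in Python, assuming glxinfo's usual "OpenGL version string:" line (the sample text below is illustrative, not captured from real X3100 hardware):

```python
import re

def gl_version(glxinfo_output):
    """Parse the advertised GL version out of glxinfo output.

    Returns a (major, minor) tuple, or None if no version line is found.
    """
    m = re.search(r"OpenGL version string: (\d+)\.(\d+)", glxinfo_output)
    return (int(m.group(1)), int(m.group(2))) if m else None

# Illustrative sample of what a Mesa-based Intel driver of that era
# might report (exact Mesa version here is an assumption):
sample = "OpenGL version string: 2.1 Mesa 7.0.1"
print(gl_version(sample))            # (2, 1)
print(gl_version(sample) >= (2, 1))  # True: the API layer advertises 2.1
```

The point stands either way: the version string tells you what the API layer exposes, not how much of it the hardware actually accelerates.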
What matters is how much of the API the graphics card accelerates and how well it can do it.
The GMA X3000 and GMA X3100 are the most advanced IGPs that Intel is offering at this time.
It supports pixel and vertex shader model 3.0. It can do anisotropic filtering up to 16x. It has a theoretical fill rate of 1067 megapixels/s and 2133 megatexels/s at 667 MHz.
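A quick back-of-the-envelope check on those figures, under the usual assumption that theoretical fill rate is just core clock times pixels (or texels) handled per clock:

```python
# Quoted figures from the post above.
core_clock_mhz = 667     # GMA X3000/X3100 core clock
pixel_fill = 1067        # megapixels/s
texel_fill = 2133        # megatexels/s

# Implied per-clock throughput (assumption: fill rate = clock * per-clock rate).
pixels_per_clock = pixel_fill / core_clock_mhz
texels_per_clock = texel_fill / core_clock_mhz

print(round(pixels_per_clock, 1))  # ~1.6 pixels per clock
print(round(texels_per_clock, 1))  # ~3.2 texels per clock
```

Note the texel rate is exactly double the pixel rate, which is consistent with the quoted numbers being two views of the same hardware budget rather than independent measurements.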
This puts it roughly on par with NV40 (Geforce 6) in terms of hardware features.
Does this mean that it will perform on par with Geforce 6 stuff? Nope. 'fraid not. Gaming performance of the GMA X3100 on Linux right now is best described as 'lousy'.
If you're an open source purist (I am a dirty purist) then restricting yourself to open source drivers means Intel will outperform Nvidia every time. But if you're going to use Nvidia's proprietary drivers, then Nvidia will provide a night-and-day performance increase over Intel's offerings for any remotely modern Nvidia card.
If your gaming requirements are light (the Intel GMA X3100 can drive 'Return to Castle Wolfenstein' comfortably), you only want a 3D desktop, or you're aiming for the best power management features (for a laptop) and that sort of thing, then I'd recommend getting a laptop with an Intel IGP.
Otherwise if you want gaming performance then a low-end Nvidia card will serve you much better.
In the end, I think having the tech data, including how to drive those shaders through their opcodes, will end up being a boon. Right now, I'm seriously considering getting a G35 chipset motherboard to play with and see what all I can tweak. If I had AMD's stuff right now (hint...hint...) I would already be DOING it with machines in hand since I've got several R300/R400/R500 boards in hand. Heh... It's probably better, short term, though, as I've got too damn many irons in the fire already...
I'm going to get a mobo with G35 soon too. It would've been G33, but the model I want has been out of stock for months. So I'll just wait for the improved X3500.
Hmm. Didn't Intel claim the X3500 will achieve double the 3DMark score of the X3100?
Does the document release include the spec for the Clear Video hardware, so the FOSS community can make use of hardware video acceleration?
Does this mean that the drivers for the 965 IGP will improve? I ask because I have a friend with that mobo, and a lot of games crash or can't run (for example, Regnum Online), while with his previous mobo (also an Intel, but it was an 850 I think) they were working fine.
I was really disappointed in Intel. :P