Intel Releases 965/G35 IGP Documents


  • #11
    Originally posted by val-gaav:
    IMHO AMD should drop fglrx development for some time and focus on releasing docs and code. The last fglrx driver already seems quite nice, so people shouldn't complain if they know that AMD is focusing on the docs and open drivers for the time being...
    Yeah. The unrest in the AMD user camp (raises hand... got parts, I do, I do...) comes, I think, from the lack of genuinely solid results on the proprietary front, along with the glacial pace of the documentation and the honestly open stuff.

    It doesn't help that even IF the driver works well enough in 3D and doesn't bog down your system, the 2D performance is seriously subpar, and things like FBOs are simply not implemented in the new codebase. Seriously. Everyone's trying to get off the pbuffers wagon (a dead mess to work with, and horribly non-portable...) and onto the FBO wagon for programmatically generated texture surfaces for rendering. FBOs make render-to-texture part of the OpenGL API itself: really easy to use, less resource-hungry, and with no need for a windowing environment to do it.

    I could live with, and maybe even tolerate, that if the pace of opening up were faster, because then there'd be a concrete, definite point at which we'd be able to honestly use the hardware.
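
    Since FBOs keep coming up: below is a minimal sketch of the render-to-texture setup being argued for, using the EXT_framebuffer_object entry points from that era. It assumes a current GL context with the extension exposed; make_render_target is just an illustrative helper name, and error handling is trimmed.

    /* Build a texture plus an FBO that renders into it; returns the FBO
       (0 on failure). Unlike a pbuffer, no extra GL context or
       window-system surface is needed. */
    GLuint make_render_target(GLuint *tex_out, int w, int h)
    {
        GLuint tex, fbo;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

        glGenFramebuffersEXT(1, &fbo);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                  GL_TEXTURE_2D, tex, 0);

        /* A driver may reject a format combination, so always check. */
        if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) !=
                GL_FRAMEBUFFER_COMPLETE_EXT)
            return 0;

        *tex_out = tex;
        return fbo;  /* draw with this bound; output lands in the texture */
    }

    Compare that with a pbuffer, where you'd be negotiating window-system attributes and context sharing just to end up with the same texture.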



    • #12
      Mixed open/closed source driver discussion continued here: http://www.phoronix.com/forums/showthread.php?t=7647

      Intel document discussion, carry on...



      • #13
        Wanted to comment on the Project Larrabee mention. Unless something has radically changed, Intel isn't using that project to get into competition with AMD and Nvidia. According to a report posted by Ars Technica here: http://arstechnica.com/news.ars/post...-larrabee.html, the Larrabee project isn't exactly shaping up to be a killer GPU to compete with the shader-model-based GPUs.

        While Intel might indeed be aiming for the high-performance GPU market... keep this article I wrote back in 2006 in mind: http://zerias.blogspot.com/2006/12/f...-or-intel.html



        • #14
          Well, the Larrabee will be a huge gift for video encoding/decoding.



          • #15
            Intel isn't using that project to get into competition with AMD and Nvidia.
            Of course they are. Those are Intel's main competitors in almost everything they do.

            It's just confusing that Intel is not aiming for peak DirectX 10/11 gaming performance. Which is fine with me; that's not something I particularly care about.

            The GPU is not just for accelerating games anymore. It's a co-processor that can be used to augment the overall performance of your machine. What runs on top of them is software like anything else... and Intel seems to be aiming at making their GPU easy to use for lots of different tasks in addition to gaming.


            According to Mesa's site, they're advertising 2.1 support where it's available and since the bulk of support isn't in the driver layer but in the API layer...
            Ya... OpenGL does not work like DirectX. If Mesa supports 2.1, then all the drivers based on that version of Mesa support 2.1 as well, more or less.

            What matters is how much of the API the graphics card accelerates and how well it can do it.
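
            A quick way to see what you're actually getting is to ask the context itself; this sketch assumes a current GL context (glxinfo prints the same strings). GL_VERSION largely reflects the Mesa/API level, while GL_RENDERER is what tells you whether you're on the hardware driver or the software fallback:

            #include <stdio.h>
            #include <GL/gl.h>

            /* Call with a GL context current. */
            void report_gl_stack(void)
            {
                printf("GL_VENDOR:   %s\n", (const char *) glGetString(GL_VENDOR));
                printf("GL_RENDERER: %s\n", (const char *) glGetString(GL_RENDERER));
                printf("GL_VERSION:  %s\n", (const char *) glGetString(GL_VERSION));
            }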

            The GMA X3000 and GMA X3100 are the most advanced IGPs that Intel offers at this time.
            They support pixel and vertex Shader Model 3.0 and anisotropic filtering up to 16x, with a theoretical fill rate of 1067 megapixels/s and 2133 megatexels/s at 667 MHz.
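
            (Working backwards from those numbers, my arithmetic rather than an Intel figure: 1067 Mpixels/s / 667 MHz comes to about 1.6 pixels per clock, and 2133 Mtexels/s / 667 MHz to about 3.2 texels per clock.)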

            This puts it roughly on par with NV40 (GeForce 6) in terms of hardware features.

            Does this mean that it will perform on par with GeForce 6 stuff? Nope. 'Fraid not. Gaming performance of the GMA X3100 on Linux right now is best described as 'lousy'.

            If you're an open source purist (I am a dirty purist), that means that if you restrict yourself to open source drivers, Intel will outperform Nvidia every time. But if you're going to use Nvidia's proprietary drivers, then Nvidia will provide a night-and-day performance increase over Intel's offerings for any remotely modern Nvidia card.

            If your gaming requirements are light (the Intel GMA X3100 can drive 'Return to Castle Wolfenstein' comfortably), you only want a 3D desktop, or you're aiming for the best power management features (for a laptop) and that sort of thing, then I'd recommend getting a laptop with an Intel IGP.

            Otherwise if you want gaming performance then a low-end Nvidia card will serve you much better.



            • #16
              Originally posted by drag:
              This puts it roughly on par with NV40 (GeForce 6) in terms of hardware features.

              Does this mean that it will perform on par with GeForce 6 stuff? Nope. 'Fraid not. Gaming performance of the GMA X3100 on Linux right now is best described as 'lousy'.
              Heh... On paper, they're nice parts. The biggest problems with their real use in gaming are partly that we haven't had a lot of stream processor development under our belts to know how to really optimize the shader operations needed for 3D rendering, coupled with the hamstring of UMA operation. (Even WITH good drivers, the X3000/X3100 have still been lackluster, though vastly better than the past offerings. The X3500 is the first "real" contender in the IGP space from Intel, and we're still going to be fighting the uphill battle of stream processor execution optimization for a while yet.)

              In the end, I think having the tech data, including how to drive those shaders through their opcodes, will end up being a boon. Right now, I'm seriously considering getting a G35 chipset motherboard to play with and see what all I can tweak. If I had AMD's stuff right now (hint... hint...) I would already be DOING it, since I've got several R300/R400/R500 boards in hand. Heh... It's probably better in the short term, though, as I've got too damn many irons in the fire already...



              • #17
                I'm going to get a mobo with a G35 soon too. It would've been a G33, but the model I want has been out of stock for months. So I'll just go for the improved X3500.

                Hmm. Didn't Intel claim the X3500 would achieve double the 3DMark score of the X3100?



                • #18
                  Does the document release include the spec for the Clear Video hardware, so the FOSS community can make use of hardware video acceleration?



                  • #19
                    Originally posted by curaga:
                    I'm going to get a mobo with a G35 soon too. It would've been a G33, but the model I want has been out of stock for months. So I'll just go for the improved X3500.

                    Hmm. Didn't Intel claim the X3500 would achieve double the 3DMark score of the X3100?
                    Yes. It's their first relatively credible IGP part; the X3000 and X3100 fell a bit short of that. I wish I had a better budget for things, because I could get one right now if it weren't for all the other things I need to be buying, including getting my roof fixed.



                    • #20
                      Does this mean that the drivers for the 965 IGP will improve? I have a friend with that mobo, and a lot of games crash or can't run (for example, Regnum Online), while with his previous mobo (also an Intel, an 850 I think) they worked fine.
                      I was really disappointed in Intel :P

