
Intel Linux Graphics Shine With Fedora 12


  • #21
    Originally posted by Kano View Post
    @AdamW

    How to use your repo with rawhide?

    Just add it like any other repo, with a file in /etc/yum.repos.d/. Follow the format of the other files in there, change the name, and use the appropriate path for my repo. Something like:

    [video-experimental]
    name=Experimental video for Fedora Rawhide
    baseurl=http://www.happyassassin.net/video-experimental/rawhide/x86_64
    enabled=1
    metadata_expire=7d
    gpgcheck=0

    ought to do it. Remember to change the arch if appropriate for your system.
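
    For completeness, here's a minimal sketch of the whole thing from a shell (assuming a 64-bit Rawhide install and sudo access; the file name video-experimental.repo is just an example):

    # Create the repo file (contents as above; x86_64 assumed).
    printf '%s\n' \
        '[video-experimental]' \
        'name=Experimental video for Fedora Rawhide' \
        'baseurl=http://www.happyassassin.net/video-experimental/rawhide/x86_64' \
        'enabled=1' \
        'metadata_expire=7d' \
        'gpgcheck=0' | sudo tee /etc/yum.repos.d/video-experimental.repo > /dev/null
    # Refresh metadata and pull in the experimental packages.
    sudo yum clean metadata && sudo yum update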


  • #22
    I second the call to see the performance of Ubuntu 9.10 included here... is there a benefit (video-wise) to running a distribution with newer packages?


  • #23
    Now we are waiting on the new stack to run on a real 3D accelerator, not this Intel 3D joke. If Radeons can run as well as this, it will be one of its own kind. Sigh...


  • #24
    Originally posted by FunkyRider View Post
    Now we are waiting on the new stack to run on a real 3D accelerator, not this Intel 3D joke. If Radeons can run as well as this, it will be one of its own kind. Sigh...

    Intel is real 3D acceleration. The tests should have included Windows XP as a control group; that would add a baseline to compare against.

    Consider adding UT2004 as a litmus test.


  • #25
    I guess you could argue about what "real" means here; Intel really doesn't target gamers.

    Anyway, comparisons with Windows would be welcome.


  • #26
    Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've looked at PC games at any point in the last 10 years, you know there is no place for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Function-wise, Intel's graphics can't even render an acceptable frame in modern games. I don't know if there is any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it was either just too easy or too hard.

    “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” – David Kirk, NVIDIA


  • #27
    Originally posted by FunkyRider View Post
    Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, ...I don't know if there is any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it was either just too easy or too hard.

    I don't think anyone was disputing that Intel accelerators haven't served the needs of gamers. I would guess that 80% of the broader market has not cared about games or 3D to any significant degree. So while Intel chips have not served the needs of gamers, that does not mean Intel is out of touch with the broader market.

    It's also not hard to argue that 3D serves a useful need on the general desktop, that there are innovative UI designs that can be accelerated by a GPU, and that the GPU's role in general-purpose computing is becoming very significant (offloading video decoding from the CPU, compositing desktops, accelerating photo editing, etc.). And Intel can be seen bringing stronger 3D offerings to its product lineup.

    Either way, Intel is also very engaged in the open source community, with open source drivers, the Moblin platform, etc., so finding a way to describe this other than "joke" would soften your tone and show more respect for a significant contributor to the community.


  • #28
    Either way, Intel is also very engaged in the open source community, with open source drivers, the Moblin platform, etc., so finding a way to describe this other than "joke" would soften your tone and show more respect for a significant contributor to the community.

    One word: Poulsbo.

    Not to mention shitty, non-conformant OpenGL drivers across the board. Check out the OpenGL forums if you wish to see how developers feel about the nightmare of supporting Intel hardware.

    You may not like it, but Intel is single-handedly holding back the adoption of OpenGL. It's one of the leading reasons why Direct3D is the only real option for consumer graphics.

    So you think *that* is good for the community? Heh, good one - tell us more!


  • #29
    Originally posted by FunkyRider View Post
    Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've looked at PC games at any point in the last 10 years, you know there is no place for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Function-wise, Intel's graphics can't even render an acceptable frame in modern games. I don't know if there is any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it was either just too easy or too hard.

    “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” – David Kirk, NVIDIA

    Please think before you post.

    Intel graphics cards are not aimed at "gamers" - Intel doesn't give a shit about those people. Integrated graphics are extremely useful for business people and mom/dad types who only want decent 2D and Google Earth. And guess what: they are the most lucrative market, since they represent 80% of graphics card sales.

    Myself, I just bought a new laptop with the Intel X4500 because of 3 things:
    a) I don't play games
    b) Integrated graphics gives 25%+ more battery life
    c) Intel has excellent, working open source drivers, and having been burned by ATI and their crappy fglrx, I really appreciate that

    Sorry if I'm aggressive, but I'm tired of seeing people complain about integrated graphics performance in games - come on, do you really expect them to play games? Just don't buy them for that.

    BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
    Last edited by caramerdo; 25 November 2009, 04:54 PM.


  • #30
    Originally posted by caramerdo View Post
    BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...

    According to the 2.1 spec (pdf), "OpenGL 2.1 implementations must support at least revision 1.20 of the OpenGL shading language." (page 351, J.1). The latest drivers I could find only support GLSL 1.10 - and badly at that.
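
    You can check what a driver actually advertises yourself; a minimal sketch, assuming the mesa-utils package (which provides glxinfo) is installed:

    # Print the GL and GLSL versions the driver reports.
    glxinfo | grep -iE "opengl version|shading language"
    # A conformant 2.1 driver must report a shading language version of
    # 1.20 or higher; a "1.10" line here confirms the problem.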

    Personally, I spent the better part of last week rewriting an application to work on an X4500 chip: I downgraded to GLSL 1.10, disabled FBO blits, floating-point attachments, and MRT (which translates to no HDR, bloom, shadows, or antialiasing), and reduced texture resolution, and finally... the driver produced an utterly broken picture. Imagine memory corruption, similar to TV snow, overlaid on top of the actual rendering.

    That was on Vista, by the way. On Ubuntu, the driver simply refused to render any geometry touched by a vertex shader. In the end I simply gave up: ATI and Nvidia cards consume the application just fine, but Intel ones simply do not offer any meaningful OpenGL support. Maybe if you limited yourself to 15-year-old, GL 1.1-level features the drivers might work correctly - but that's simply not an option.

    It's sad to see the largest IHV (50%+ market share) produce such garbage. Their competitors are shipping OpenGL 3.2 while Intel is still struggling with 2.1 - and yes, this *does* drive developers away from OpenGL and into Direct3D (and Microsoft).

    Edit: Search for Intel on opengl.org to see how depressing the situation really is.
                      Last edited by BlackStar; 25 November 2009, 08:42 PM.
