Intel Linux Graphics Shine With Fedora 12


  • #16
    Originally posted by Kano View Post
    @AdamW

    Did you enable the 965 driver in the libva package?
    In the package in my personal repository, yeah. I have a review request in for Fedora proper; if that's ever accepted, it'll have to be without the 965 driver, as it implements patented stuff.



    • #17
      Originally posted by AdamW View Post
      We'll try. For now, the workaround of disabling KMS should work for most cases. If your graphics work okay with KMS disabled, it's alright to run that way.
      The problem is that disabling KMS breaks suspend on my graphics card on 2.6.31. It does suspend, but it doesn't resume. I even tried --quirk-vbestate-restore, and that works once, but after initiating a second suspend the screen is corrupted and a hard reboot is required.

      So for now, I'm using KMS for my everyday work and non-KMS when I need to give an OOo Impress presentation.
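
      For anyone else in this spot, a minimal sketch of both workarounds (nomodeset disables KMS for all drivers, i915.modeset=0 for Intel only; adjust for your setup):

      # Disable KMS for a single boot: at the GRUB menu, edit the kernel line
      # and append either:
      #   nomodeset          (all drivers)
      #   i915.modeset=0     (Intel only)

      # To make it permanent, add the same option to the kernel line in
      # /boot/grub/grub.conf (GRUB legacy on Fedora 12).

      # The resume quirk mentioned above, via pm-utils:
      su -c 'pm-suspend --quirk-vbestate-restore'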



      • #18
        To first recap some important pieces of information for Fedora 10, Fedora 11, and Fedora 12: the package versions found in each release.
        Fedora 10 shipped with the Linux 2.6.27 kernel, GNOME 2.24.1, X Server 1.5.3, xf86-video-intel 2.5.0, and Mesa 7.3-devel.
        Fedora 11 provided the Linux 2.6.29 kernel, GNOME 2.26.1, X Server 1.6.2 RC1, xf86-video-intel 2.7.0, and Mesa 7.5-devel.
        The brand new Fedora 12 release is using the Linux 2.6.31 kernel, GNOME 2.28.1, X Server 1.7.1, xf86-video-intel 2.9.1, and Mesa 7.7-devel.
        All three releases were left in their stock configurations after installation, and the x86_64 version was used in each case.
        When doing such benchmarks, you should run an update first.

        F11 ships the 2.6.30 kernel now.
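
        To confirm what you're actually testing, a quick sketch using the standard Fedora package names (run after updating):

        # Update, then record the exact versions that were benchmarked:
        su -c 'yum update'
        rpm -q kernel xorg-x11-server-Xorg xorg-x11-drv-intel mesa-dri-drivers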



        • #19
          dodoent: ah, I see. That does suck, then :/ Sorry about that. All I can say is to watch the bug report; Dave and Jerome will get to it.



          • #20
            @AdamW

            How do I use your repo with Rawhide?



            • #21
              Originally posted by Kano View Post
              @AdamW

              How do I use your repo with Rawhide?
              Just add it like any other repo, with a file in /etc/yum.repos.d/. Follow the format of the other files in there: change the name and use the appropriate path for my repo. Something like:

              [video-experimental]
              name=Experimental video for Fedora Rawhide
              baseurl=http://www.happyassassin.net/video-experimental/rawhide/x86_64
              enabled=1
              metadata_expire=7d
              gpgcheck=0

              That ought to do it. Remember to change the arch if appropriate for your system.
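
              For example (the filename below is arbitrary; yum only looks at the .repo suffix):

              # Save the snippet above as /etc/yum.repos.d/video-experimental.repo, then:
              su -c 'yum clean metadata'
              su -c 'yum update'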



              • #22
                I second the call to see the performance of Ubuntu 9.10 included here... is there a benefit (video-wise) to running a distribution with newer packages?



                • #23
                  Now we are waiting on the new stack to run on a real 3D accelerator, not this Intel 3D joke. If Radeons can run as well as this, it will be in a class of its own. Sigh...



                  • #24
                    Originally posted by FunkyRider View Post
                    Now we are waiting on the new stack to run on a real 3D accelerator, not this Intel 3D joke. If Radeons can run as well as this, it will be in a class of its own. Sigh...
                    Intel is real 3D acceleration. The tests should have included Windows XP as a control group; that would give a baseline to compare against.

                    Consider adding UT2004 as a litmus test.



                    • #25
                      I guess you could argue about what "real" means here; Intel really doesn't target gamers.

                      Anyway, comparisons with Windows would be welcome.



                      • #26
                        Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've seen any PC games in the last 10 or so years, you know there is no place for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Functionality-wise, Intel's graphics can't even produce an acceptable frame in modern games. I don't know if there is any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it's either just too easy or too hard.

                        “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” — David Kirk, NVIDIA



                        • #27
                          Originally posted by FunkyRider View Post
                          Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, ...I don't know if there is any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it's either just too easy or too hard.
                          I don't think anyone was disputing that Intel accelerators have not served the needs of gamers. I would guess that 80% of the broader market has not cared about games or 3D to any significant degree. So while Intel chips have not served the needs of gamers, that does not mean Intel is out of touch with the broader market.

                          It's also not hard to argue that 3D serves a useful need on the general desktop: there are innovative UI designs that can be accelerated by a GPU, and the GPU's role in general-purpose computing is becoming very significant (offloading video decoding from the CPU, compositing desktops, accelerating photo editing, etc.). And Intel can be seen bringing stronger 3D offerings to its product lineup.

                          Either way, Intel is also very engaged in the open source community, with open source drivers, the Moblin platform, etc., so finding a way to describe this other than "joke", in order to soften your tone, would show more respect to a significant contributor to the community.



                          • #28
                            Either way, Intel is also very engaged in the open source community, with open source drivers, the Moblin platform, etc., so finding a way to describe this other than "joke", in order to soften your tone, would show more respect to a significant contributor to the community.
                            One word: Poulsbo.

                            Not to mention shitty, non-conformant OpenGL drivers across the board. Check out the OpenGL forums if you wish to see how developers feel about the nightmare of supporting Intel hardware.

                            You may not like it, but Intel is single-handedly holding back the adoption of OpenGL. It's one of the leading reasons why Direct3D is the only real option for consumer graphics.

                            So you think *that* is good for the community? Heh, good one - tell us more!



                            • #29
                              Originally posted by FunkyRider View Post
                              Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've seen any PC games in the last 10 or so years, you know there is no place for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Functionality-wise, Intel's graphics can't even produce an acceptable frame in modern games. I don't know if there is any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it's either just too easy or too hard.

                              “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” — David Kirk, NVIDIA
                              Please think before you post.

                              Intel graphics cards are not aimed at "gamers" - they don't give a shit about those people. Integrated graphics are extremely useful for business people and mom/dad types who only want decent 2D and Google Earth. And guess what: they are the most lucrative market, since they represent 80% of graphics card sales.

                              Myself, I just bought a new laptop with the Intel X4500 because of 3 things:
                              a) I don't play games
                              b) Integrated graphics gives 25%+ more battery life
                              c) Intel has excellent, working open source drivers, and having been burned by ATI and their crappy fglrx, I really appreciate that

                              Sorry if I'm aggressive, but I'm tired of seeing people complaining about integrated graphics performance in games - come on, do you really expect them to play games? Just don't buy them for that.


                              BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
                              Last edited by caramerdo; 11-25-2009, 03:54 PM.



                              • #30
                                Originally posted by caramerdo View Post
                                BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
                                According to the 2.1 spec (PDF), "OpenGL 2.1 implementations must support at least revision 1.20 of the OpenGL shading language" (page 351, J.1). The latest drivers I could find only support GLSL 1.10 - and badly at that.
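
                                An easy way to check what a driver actually reports (glxinfo ships in Fedora's glx-utils package; exact output wording can vary by Mesa version):

                                # Query the GL and GLSL versions the driver advertises:
                                glxinfo | grep -i "opengl version"
                                glxinfo | grep -i "shading language"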

                                Personally, I spent the better part of last week rewriting an application to work on an X4500 chip: downgraded to GLSL 1.10; disabled FBO blits, floating-point attachments, and MRT (which translates to no HDR, bloom, shadows, or antialiasing); reduced texture resolution; and finally... the driver produced an utterly broken picture. Imagine memory corruption, similar to TV snow, overlaid on top of the actual rendering.

                                That was on Vista, by the way. On Ubuntu, the driver simply refused to render any geometry touched by a vertex shader. In the end I gave up: ATI and Nvidia cards run the application just fine, but Intel ones do not offer any meaningful OpenGL support. Maybe if you limited yourself to 15-year-old GL 1.1-level features the drivers would work correctly - but that's simply not an option.

                                It's sad to see the largest IHV (50%+ market share) produce such garbage. Their competitors are shipping OpenGL 3.2 while Intel is still struggling with 2.1 - and yes, this *does* drive developers away from OpenGL and into Direct3D (and Microsoft).

                                Edit: Search for Intel on opengl.org to see how depressing the situation really is.
                                Last edited by BlackStar; 11-25-2009, 07:42 PM.

