Intel Graphics Regressions In Ubuntu 9.04?


  • #21
    @SyXbiT: Ubuntu 9.04 is far from being in bugfix-only mode. The feature freeze is on February 16th and the beta freeze is on March 20th. Source: https://wiki.ubuntu.com/JauntyReleaseSchedule.

    So I quite agree with jeffro-tull. I don't see the point of an article that only shows the differences between a stable distro and an alpha one. It would have been great to explain why there was such a difference, though, like explaining that Ubuntu decided to ship this driver knowing it would only be fully operational once the right kernel, Mesa, and X server versions landed.

    Once all these packages (which depend on each other) are in the alpha, maybe it'd be interesting to see the performance improvements.

    And, like jeffro-tull said, 9.04 is still an alpha and must have a whole bunch of debugging symbols in each binary. It would have been nice to mention that in the article, I think.



    • #22
      Originally posted by Creak View Post
      And, like jeffro-tull said, 9.04 is still an alpha and must have a whole bunch of debugging symbols in each binary. It would have been nice to mention that in the article, I think.
      If it's anything like Debian, the debugging symbols are stripped and shipped in separate -dbg packages and only loaded once you fire up gdb.
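
      Something like this, for example (the -dbg package names here are just illustrative; they vary by release):

      sudo apt-get install xserver-xorg-core-dbg libgl1-mesa-dri-dbg
      gdb /usr/bin/Xorg /path/to/core    # gdb picks up the detached symbols from /usr/lib/debug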



      • #23
        To echo what several others have already said: yes, do not make final judgments on jaunty so soon.

        It's sort of like going to a restaurant with a glass wall where you can see into the kitchen, and then reporting about all the undercooked food you saw.

        In fact, we just merged this new -intel driver, along with new mesa, xserver, and libdrm last week. Especially with mesa and xserver, those are still -rc versions and likely to get upgraded. I also hope to see a .2 or newer -intel we can upgrade to.

        See my blog for more info:



        • #24
          I'm not a huge Ubuntu guy, so I don't know how they go about doing it. Even with my "preferred" distros, I don't jump to the new version until usually just before the Release Candidate phase. At this point in the 9.04 life-cycle, I feel it'd be foolish not to have debugging symbols in there by default (since it's still very early and there is a very high likelihood that things will break), but, like I said, I'm not much of an Ubuntu user, so I don't know.

          I still stand by what I said the first time Phoronix felt like comparing 8.10 to 9.04 - don't! Too many variables! If you're that concerned with what's happening with their graphics stack, then pull the source code and compile it on an otherwise stable distribution, as sketched below. 9.04 is far enough along that (most likely) not a single dpkg is shared with 8.10, but nowhere near far enough along that they're in the "fine polish and wax" stage.
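
          For instance, to try just the driver itself (the git URL is the freedesktop.org one as I remember it; the build is the usual autotools dance, adjust the prefix to taste):

          git clone git://anongit.freedesktop.org/xorg/driver/xf86-video-intel
          cd xf86-video-intel
          ./autogen.sh --prefix=/usr
          make && sudo make install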



          • #25
            At this point in the 9.04 life-cycle, I feel it'd be foolish not to have debugging symbols in there by default (since it's still very early and there is a very high likelihood that things will break), but, like I said, I'm not much of an Ubuntu user, so I don't know.
            Like whizse says, the -dbg symbols are shipped separately. We also have a tool called apport that catches crashes and uploads the core files (privately), which the retracer can combine with those symbols to produce readable stack traces.
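
            If apport doesn't seem to catch anything, it may simply be switched off; as I recall the switch lives in /etc/default/apport:

            # /etc/default/apport
            enabled=1

            and apport-retrace can then turn an uploaded core into a readable stack trace.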



            • #26
              I've been running Mesa 7.3, the intel driver 2.5.99.901-1 from Rawhide, and a 2.6.29-rc2 kernel from Fedora's Koji build system on my laptop with GMA X3100 graphics. This is in an otherwise stock Fedora 10 system.

              After moving over from Debian to F10 on my laptop, I was very disappointed with the state of the Intel graphics drivers in F10. They really dropped the ball. I had application compatibility issues with things like Blender, and Nexuiz would just plain crash the machine. I didn't experience that with Debian.

              Now, performance was very good, according to things like glxgears, which scored higher than it ever did in Debian. However, the crashing and application incompatibility were infuriating.

              Upgrading to the newer drivers and kernel I described above actually gave me a much more stable OpenGL experience. I lost the ability to suspend reliably, which is really a shame, but other than that it was a very good thing for that laptop. Now I can run Blender and play games without the lock-ups.

              The downside is that 2D performance did seem to suffer. Enabling UXA seemed to make Firefox much happier, though. Hard to say, but it seems to have gotten most of its "snap" back.

              The thing that sucks is that Adobe Flash video playback performance took a nosedive. I can't play Hulu.com at high res any more, or at full screen, things like that. I know that if I used Gnash or Swfdec or whatever it would probably perform fine, but those aren't compatible with the majority of sites out there.

              For YouTube and such I can use the clive utility to download the videos and then play them with mplayer, which is much nicer, but that still doesn't work with lots of sites I like to visit.
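
              Basic usage is roughly this (the URL is just a placeholder):

              clive "http://www.youtube.com/watch?v=..."    # downloads the video file into the current directory
              mplayer <downloaded file>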


              I still don't have DRI2 going yet. I can't figure out what the issue is. It loads up DRI2 and the modules, and I believe I have it enabled correctly in xorg.conf. It seems that the Rawhide builds have it disabled in the X server.
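
              One quick sanity check, assuming the standard log location, is whether the server actually enabled the extension:

              grep -i dri2 /var/log/Xorg.0.log
              xdpyinfo | grep -i DRI2    # the extension shows up in this list if it's enabled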


              ------------------


              Given the large amount of architectural change being made to the Linux kernel, Mesa, and Xorg to modernize open source 3D graphics, I am fairly happy with it so far.

              It's all a huge change, and when it stabilizes I am not going to expect any great performance improvements... actually, I expect some, hopefully small, performance regressions in the first stable releases. Complexity and code churn tend to be the enemy of performance. But what I do expect is much better application and game compatibility, as well as stability and well-performing 3D desktops on GMA X3000 and newer (and hopefully acceptable performance for GMA 950 hardware).

              I figure once people realize they can depend on OSS 3D graphics drivers to work, and that they are stable and offer good power characteristics in mobile devices, you'll see a much more solid push for performance.



              • #27
                Flash 10 32-bit uses OpenGL for fullscreen, but I think it checks for the Nvidia or ATI vendor string. 64-bit always uses software rendering, so fullscreen Flash performance is really bad.

                Hint: Force OpenGL vsync to avoid tearing with Flash 10.
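
                With the open-source Mesa drivers one way to do that is the vblank_mode option in ~/.drirc (this is the generic Mesa knob, nothing Flash-specific; the driconf tool can write it for you):

                <driconf>
                    <device>
                        <application name="Default">
                            <option name="vblank_mode" value="3" />    <!-- 3 = always synchronize to vblank -->
                        </application>
                    </device>
                </driconf>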



                • #28
                  BTW, the PTS comparison URL is missing from the article. I would like to run a G31 test, lenny vs. jaunty 64-bit.



                  • #29
                    Thanks for that great article.
                    Benchmarks like this shed some light on 2D performance too, which is often overlooked.
                    I like the fact that end-users have evidence that it's not just their imagination that their desktop got slower and slower.

                    Hopefully they'll have it fixed soon, and can work on fixing other pending stuff and open bug reports.



                    • #30
                      Originally posted by Kano View Post
                      Flash 10 32-bit uses OpenGL for fullscreen, but I think it checks for the Nvidia or ATI vendor string. 64-bit always uses software rendering, so fullscreen Flash performance is really bad.

                      Hint: Force OpenGL vsync to avoid tearing with Flash 10.

                      Ya, thanks for the tip. I got rid of some 'optimizations' in my xorg.conf and that perked Flash support up quite a bit. I still couldn't play Hulu.com 480p at full screen without occasional hiccups, though.

                      But your post reminded me of an old thing I read a while ago and forgot about: Adobe Flash looks for the 'SGI' string, and if it finds it, that kills GPU acceleration.

                      So the goal, I think, is to avoid the normally buggy and incomplete OSS OpenGL drivers, since those report SGI strings. The proprietary ATI and Nvidia drivers don't, I think.

                      Use 'glxinfo' to see what your drivers are advertising support for.
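
                      For example, on a Mesa driver you'll see something like this (the second line is what Flash apparently keys on):

                      glxinfo | grep -i vendor
                      # client glx vendor string: SGI
                      # OpenGL vendor string: Tungsten Graphics, Inc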

                      But if you make a /etc/adobe/mms.cfg file and put:
                      OverrideGPUValidation=true


                      in there, then that forces GPU acceleration. There is an Adobe README PDF that explains all of this.
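
                      From a shell that's just:

                      sudo mkdir -p /etc/adobe
                      echo "OverrideGPUValidation=true" | sudo tee /etc/adobe/mms.cfg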

                      With that trick enabled it seems to be much, much smoother, and X doesn't use very much CPU compared to what it did before. Of course, Flash itself still uses a lot of CPU during playback, but it's something my laptop can handle.

