
  • #31
    All of my dev machines are 64 bit now but I keep a couple 32 bit VMs around for compatibility. Those get run at just about every commit. I don't do PowerPC or SPARC builds more than once a week, but I get all the bases covered before proposing anything for release.


    • #32
      Originally posted by deanjo View Post
      Unfortunately, some developers are satisfied with code that is "good enough" and don't take pride in their work.
      Sadly, that is all too true, and it's very pervasive within certain circles.


      • #33
        Originally posted by airlied View Post
        The problem was warnings on a platform the code wasn't developed on.
        I may be wrong, but looking at the warnings and not at the code, they seem to be related to the code itself and not to the platform...


        • #34
          Don't even need to see the code for that...

          I don't know if Linus says so based on an actual inspection of the code delivered, but it really is so bad that you don't even need to go down to the code to see it.

          Intel's Linux driver performance has been decreasing from Ubuntu 7.10 (and distros of the same age) to now, even before the GEM phase.

          With Ubuntu 7.04's Intel drivers you could easily get 1500 fps in glxgears; on Ubuntu 7.10 you got less than 1500 fps, 8.04 lowered the mark to 1100 fps, 8.10 to 500 fps, and the current JAUNTY alphas to 300 fps!!

          There are also rendering errors on top of the performance loss.

          I thought that the main goals of graphics driver software were stability AND performance. For the time being, both Intel's closed-source Windows drivers and Nvidia's (Windows & Linux) KICK the open-source Intel drivers' ASS.

          That's a fact that has held true since Ubuntu 7.10, that is, October 2007, quite a long time if you ask me.


          • #35
            It's not clear what's causing the performance problems in Phoronix's tests. Ubuntu has been known to package broken versions of software in its betas (and sometimes even in the final version of the distro). It's also not clear what the default settings were.

            That said, while GEM for Intel ships with the 2.6.28 kernel, the drivers have yet to take advantage of it extensively (the only use so far is UXA, which is still experimental and not enabled by default). As you can see from the Phoronix UXA test results, in most cases UXA gives a substantial boost in performance.
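
            For anyone who wants to try it, UXA can usually be enabled by hand with an xorg.conf snippet along these lines (the identifier is just an example, and the exact option spelling depends on your xf86-video-intel version, so treat this as a sketch rather than gospel):

            ```
            Section "Device"
                Identifier "Intel Graphics"
                Driver     "intel"
                # Switch from the default EXA acceleration to the experimental UXA path
                Option     "AccelMethod" "uxa"
            EndSection
            ```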

            Naturally, things are going to be a bit unstable while the new technology is introduced. The people in charge of releases and distros are doing their best to keep things from breaking in mainline versions, but not every possible problem can be discovered or weeded out by the devs prior to release. For the first time in a number of years, the entire graphics stack is being redesigned, and there are likely to be some hiccups. However, the new design will ultimately suit modern hardware and software requirements much better, improving both performance and feature set.


            • #36
              Here's a question: Is Gallium3D going to make GEM obsolete?


              • #37
                Originally posted by wswartzendruber View Post
                Here's a question: Is Gallium3D going to make GEM obsolete?
                No. Gallium3D is a replacement for Mesa (the 3D framework). GEM is a memory manager. So they are orthogonal.


                • #38
                  Yep. Strictly speaking, Gallium3D is a replacement for the hardware driver subsystem of Mesa rather than for Mesa itself, and other acceleration APIs may call Gallium3D directly, but as Mithrandir said, it is definitely orthogonal to GEM.

                  GEM and TTM are the ones which kinda compete -- in theory GEM is making TTM obsolete, although in practice GEM may not be sufficient for discrete GPUs with separate video memory, so the current direction for ATI parts has been to build a GEM API on top of the current TTM implementation.
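
                  As a rough illustration of what "GEM is a memory manager" means in practice: user space asks the kernel for buffer objects through DRM ioctls on the card's device node. The sketch below round-trips an i915 GEM buffer (create, then immediately close). The struct layouts and ioctl numbers mirror the kernel's drm.h and i915_drm.h headers from the 2.6.28 era; the device path and the helper's name are illustrative, and it simply returns 0 when no usable Intel DRM device is present.

                  ```c
                  #include <fcntl.h>
                  #include <stdint.h>
                  #include <sys/ioctl.h>
                  #include <unistd.h>

                  /* Mirrors struct drm_i915_gem_create from i915_drm.h */
                  struct i915_gem_create {
                      uint64_t size;    /* requested buffer size in bytes */
                      uint32_t handle;  /* filled in by the kernel on success */
                      uint32_t pad;
                  };

                  /* Mirrors struct drm_gem_close from drm.h */
                  struct gem_close {
                      uint32_t handle;
                      uint32_t pad;
                  };

                  /* DRM ioctls use the 'd' magic number. GEM_CLOSE is core ioctl
                   * 0x09; I915_GEM_CREATE is driver ioctl 0x1b counted from
                   * DRM_COMMAND_BASE (0x40), hence 0x5b. */
                  #define IOCTL_I915_GEM_CREATE _IOWR('d', 0x5b, struct i915_gem_create)
                  #define IOCTL_GEM_CLOSE       _IOW('d', 0x09, struct gem_close)

                  /* Create and immediately free one GEM buffer object.
                   * Returns the (nonzero) handle on success, 0 if the device is
                   * missing or the ioctl fails. */
                  uint32_t gem_roundtrip(const char *dev_path, uint64_t size)
                  {
                      int fd = open(dev_path, O_RDWR);
                      if (fd < 0)
                          return 0;  /* no DRM device at this path */

                      struct i915_gem_create create = { .size = size };
                      if (ioctl(fd, IOCTL_I915_GEM_CREATE, &create) != 0) {
                          close(fd);
                          return 0;  /* not an i915 device, or the call failed */
                      }

                      struct gem_close cls = { .handle = create.handle };
                      ioctl(fd, IOCTL_GEM_CLOSE, &cls);  /* release the buffer */
                      close(fd);
                      return create.handle;
                  }
                  ```

                  On a box with the i915 driver loaded, `gem_roundtrip("/dev/dri/card0", 4096)` should hand back a nonzero handle; everything a driver or X server allocates with GEM ultimately flows through kernel calls like these.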


                  • #39
                    When something is new and unstable, it is usually wise to keep the old stuff around so users can choose to fall back on it while the new stuff stabilizes and reaches good performance.

                    Why didn't they choose this path?
                    Why should we suffer poor Linux drivers while Windows works as expected?

                    Couldn't they keep the closed-source versions around while the open-source ones mature?

                    You DON'T close the old road till the new highway is ready.

                    More examples?
                    - AMD/ATI is keeping its closed-source drivers available while the open-source ones are being enhanced.
                    - The OSS-to-ALSA transition.

                    It is painfully long, but the alternative is also quite long, and in the meantime you have nothing working well.


                    • #40
                      Well, it is still using the old stuff, and that's really the problem. Most of the changes being made right now are designed to work with the new stuff, which sometimes means they don't work as well with the old stuff. The performance regressions are, I believe, partly because the drivers are still using the old tech while all the software expects them to be using the new stuff.

                      I believe I read something about Ubuntu updating their Intel graphics drivers to fix some issues for Jaunty, so performance may be significantly better now.