ATI R600 Gallium3D Driver Continues Advancing


  • #81
    No, I need at least stable 60fps. Many servers nowadays use fps-independent physics, but the game is so fast that you need every frame your monitor can display. That, and I have a low-end 4550.

    It sounds like it should be thoroughly playable at lower resolutions, though. Good news.



    • #82
      The problem is that there is a serious bottleneck elsewhere.

      With that demo and r600c, I get:

      47fps @ 640x480
      32fps @ 1920x1080

      Granted, the driver is throttling at vsync, so these are not too accurate.
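
      (Side note: if you want numbers that aren't capped at the refresh rate, Mesa honours vblank_mode=0 in the environment, or the app can ask for it itself. A minimal sketch, assuming the GLX_MESA_swap_control extension is exposed and a GLX context is current; glXSwapIntervalMESA is Mesa-specific:)

      Code:
      #include <GL/glx.h>
      #include <stdio.h>

      /* GLX_MESA_swap_control entry point; interval 0 means "don't wait
       * for vblank", which uncaps the frame rate for benchmarking. */
      typedef int (*swap_interval_fn)(unsigned int interval);

      static void disable_vsync(void)
      {
          swap_interval_fn swap_interval = (swap_interval_fn)
              glXGetProcAddress((const GLubyte *)"glXSwapIntervalMESA");

          if (swap_interval && swap_interval(0) == 0)
              printf("vsync off, fps numbers are now meaningful\n");
          else
              printf("GLX_MESA_swap_control not available\n");
      }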



      • #83
        Originally posted by pingufunkybeat View Post
        It sounds like it should be thoroughly playable at lower resolutions, though. Good news.
        Well, as I said, effects were also set to very high quality. That's the best way to see progress, IMO: give the system some really tough stuff to chew through.



        • #84
          Okay, it looks a lot better now, at least with native games. The excessive memory consumption I'm seeing with hl2 is probably something with wine.



          • #85
            Originally posted by pingufunkybeat View Post
            The problem is that there is a serious bottleneck elsewhere.

            With that demo and r600c, I get:

            47fps @ 640x480
            32fps @ 1920x1080

            Granted, the driver is throttling at vsync, so these are not too accurate.
            Wow, tiling really does wonders.

            I've just upgraded to a recent kernel and I almost have solid 60 fps at 1920x1080!

            Almost. It's still a bit too jerky, so I play at 1280x1024 now.
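
            (For the curious, the win from tiling is locality: instead of whole rows being contiguous, small 2D blocks are, so texture fetches and blends touch far fewer memory bursts. A toy sketch of the addressing, with an 8x8 tile picked purely for illustration, not the actual r600 tiling mode:)

            Code:
            #include <stdio.h>

            #define TILE_W 8
            #define TILE_H 8

            /* Linear: vertically adjacent pixels are a whole pitch apart. */
            static size_t linear_offset(size_t x, size_t y, size_t pitch)
            {
                return y * pitch + x;
            }

            /* Tiled: each 8x8 block is contiguous, so 2D neighbours stay close. */
            static size_t tiled_offset(size_t x, size_t y, size_t pitch)
            {
                size_t tiles_per_row = pitch / TILE_W;
                size_t tile = (y / TILE_H) * tiles_per_row + (x / TILE_W);
                return tile * TILE_W * TILE_H + (y % TILE_H) * TILE_W + (x % TILE_W);
            }

            int main(void)
            {
                /* Two vertically adjacent pixels on a 1024-pixel pitch:
                 * 1024 apart linearly, only 8 apart within a tile. */
                printf("linear: %zu -> %zu\n", linear_offset(3, 4, 1024), linear_offset(3, 5, 1024));
                printf("tiled:  %zu -> %zu\n", tiled_offset(3, 4, 1024), tiled_offset(3, 5, 1024));
                return 0;
            }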



            • #86
              Note that Fedora not building Mesa with TLS might slow me down a bit; I don't know whether the penalty is significant. (I'm not using SELinux.)
              I'll try newer stuff soon, and recompile things with TLS a little later.



              • #87
                Originally posted by monraaf View Post
                The excessive memory consumption I'm seeing with hl2 is probably something with wine.
                Hmmm... maybe, maybe not.

                Take a look at this file (current revision) and search for r600_delete_ps_shader.

                wine has some quirks where it frequently re-creates shaders, so I'd imagine this leak hits harder in wine games.

                At least it's a known leak.
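
                To illustrate the pattern (a made-up sketch, not the actual r600g code; the struct and field names here are hypothetical): a delete callback that frees the container but not the allocations hanging off it strands memory on every shader create/delete cycle, and wine's shader churn multiplies that.

                Code:
                #include <stdlib.h>

                struct pipe_context;            /* opaque, as in Gallium */

                struct fake_shader {            /* hypothetical layout */
                    unsigned *bytecode;         /* compiled GPU code, heap-allocated */
                };

                static void fake_delete_ps_shader(struct pipe_context *ctx, void *state)
                {
                    struct fake_shader *shader = state;

                    (void)ctx;
                    /* Leak: shader->bytecode is never freed, so each
                     * create/delete round trip loses one allocation. */
                    free(shader);
                }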



                • #88
                  OK, I've tried r600g.

                  In my tests, it is 2x slower than r600c in OpenArena: I get a solid 60fps with classic but struggle to hold 30fps with gallium. This is with vsync on, so these are only rough numbers.

                  KWin bails out complaining that the effects are too slow. I didn't try disabling the checks. It works fine with classic.

                  Still, I haven't noticed any rendering artifacts or lockups, so things are progressing nicely.



                  • #89
                    R300g v. R300c

                    Originally posted by xiando View Post
                    I "imagine" development is going forward as indicated by mesa git log, http://cgit.freedesktop.org/mesa/mes...t=grep&q=r600g



                    My distro does not provide binary packages... and I do eselect mesa set r600 classic when I'm done checking out how the current r600g is doing.



                    The way FOSS works hasn't really changed much in the last 15 years, but I am glad I could help you release all that suppressed anger.
                    While R600c leads now, I honestly don't expect it to hold that lead for long, if the performance gains from R300g are anything to go by. R300g is now starting to show up in distribution testing cycles, and not just in Ubuntu (where it has completely replaced R300c) but in Debian, Arch, and Gentoo as well. However, there have been other changes in the Linux graphics firmament (Flash plugin behavior in particular), so if your hardware changes over to R300g for Xorg support (basically, Radeon 9200 through the X1K series), you won't know whether to yell at Tungsten (who did a ton of contribution work toward the new driver) or at Adobe. Since the Flash issues are not unique to Linux, I feel on safer ground yelling at Adobe for the sorry state of their plugin.

                    As I stated earlier, some distributions have already swapped out R300c for R300g during their test cycle; Ubuntu is the most well-known, with the Maverick Meerkat (10.10) RC kicked out the door this week. I have two GPUs that I've been throwing at the new driver: an AMD AIW 9700 Pro and an X1650 Pro, both in AGP flavor. The test platform is otherwise a P4 2.4 with 512 MB of DDR-333, a 40 GB WD PATA HDD, and, as stated, Kubuntu 10.10 RC. From straightforward seat-of-the-pants observation, R300g loses nothing performance-wise compared to R300c, and is actually more stable than R300c driving KDE (which, in the 4.5.1 iteration included with Maverick Meerkat, places more demands on Xorg than older versions did). That would make KDE 4.5.x a viable option on legacy AMD hardware, which has not really been the case, except among masochists, due to both performance and stability issues.

                    Because R300g is a drop-in replacement, as long as you have Xorg 1.9 and Mesa 7.9 or later and your GPU is covered by this driver, you need do nothing in the way of post-upgrade fiddling, which usually isn't the case when a point release of Xorg, let alone point releases of both Xorg and Mesa, comes down the pike.



                    • #90
                      Originally posted by PGHammer View Post
                      and not just in Ubuntu (where it has completely replaced R300c)
                      As I stated earlier, some distributions have already swapped out R300c for R300g during their test cycle; Ubuntu is the most well-known, with the Maverick Meerkat (10.10) RC kicked out the door this week.
                      I believe this is all wrong. Quoting from the maverick mesa debian/rules file:
                      Code:
                      # We should switch to r300g soon, but it won't support UMS.  Stick
                      # with r300c for now
                      R300g loses nothing performance-wise compared to R300c, and is actually more stable than R300c
                      Maybe for KMS, but UMS (classic) still beats the pants off KMS (classic or gallium) in many cases. And some users need UMS because of KMS bugs.
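
                      (If you're not sure which mode you're actually running, the radeon module exposes its modeset parameter through sysfs; a quick check, assuming the usual parameter path:)

                      Code:
                      #include <stdio.h>

                      int main(void)
                      {
                          /* radeon.modeset: 1 = KMS, 0 = UMS, -1 = driver default */
                          FILE *f = fopen("/sys/module/radeon/parameters/modeset", "r");
                          int modeset;

                          if (!f || fscanf(f, "%d", &modeset) != 1) {
                              puts("radeon module not loaded?");
                              return 1;
                          }
                          printf("radeon.modeset = %d (%s)\n", modeset,
                                 modeset == 1 ? "KMS" : "UMS or driver default");
                          fclose(f);
                          return 0;
                      }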

