Radeon 3D Performance: Gallium3D vs. Classic Mesa


  • #21
    Originally posted by pvtcupcakes View Post
    Michael would have to spend money though.

    Edit: Oh, there is a demo at least.
    I already own Doom 3, Quake 4, ET:QW, etc. But last I knew Doom 3 didn't run well with Mesa -- either classic or Gallium3D.
    Michael Larabel
    https://www.michaellarabel.com/

    Comment


    • #22
      gallium versus no gallium etc...

      Originally posted by pvtcupcakes View Post
      Michael would have to spend money though.

      Edit: Oh, there is a demo at least.
      Folks, a couple of things I would appreciate help with:
      1) I think I have almost everything working on Debian,
      except the shading setup with an HD3870 X2.
      I built KMS/DRM support using a pristine 2.6.33.1 kernel and Mesa 7.8-rc1 (nouveau, but not Gallium). Does anyone have figures on what performance to expect (alternatives: Gallium, Wine, Xorg, fglrx)?

      Right now I have:

      $ glxinfo | grep render
      direct rendering: Yes
      OpenGL renderer string: Mesa DRI R600 (RV670 9509) 20090101 TCL DRI2

      I used the latest firmware from linux-firmware.git:


      $ LIBGL_DEBUG=verbose glxinfo 2>/dev/null | grep -i opengl
      OpenGL vendor string: Advanced Micro Devices, Inc.
      OpenGL renderer string: Mesa DRI R600 (RV670 9509) 20090101 TCL DRI2
      OpenGL version string: 1.5 Mesa 7.7.1-DEVEL
      OpenGL extensions:
      ------> OpenGL shading language version string: xxx missing ???
      1) Should I downgrade to an R500?

      2) Is the performance what's expected?
      $ glxgears -info
      11337 frames in 5.0 seconds = 2267.288 FPS
      Are there other, more complete 3D benchmarks?

      Thanks in advance. -- Pat
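A quick way to check for the missing GLSL string Pat mentions is to filter the glxinfo dump directly. This is just a sketch; the `glsl_version` helper name is mine, not a standard tool:

```shell
# glsl_version: print the GLSL version line from a glxinfo dump,
# or a note if the driver does not expose GLSL at all.
glsl_version() {
    grep -i 'shading language version' || echo 'GLSL not exposed by this driver'
}

# Typical use (needs a running X server): glxinfo | glsl_version
# Demo on a captured line instead:
echo 'OpenGL shading language version string: 1.20' | glsl_version
```

If the driver reports only OpenGL 1.5 with no shading-language line, the GLSL path simply isn't wired up yet in that build.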

      Comment


      • #23
        Hmm... well, Mesa itself should support all of those games almost perfectly; I'm still not sure about any of the radeon drivers, though.

        Anyone feel like trying them out and updating http://www.x.org/wiki/RadeonProgram ?

        Comment


        • #24
          Originally posted by 0e8h View Post
          Probably better to be at a constant FPS than climbing the highest hill, as this would be best for power saving on laptops and for smoothness. What's the point of rendering frames that miss the screen's refresh? 60 FPS should be the cap in most graphics driver settings.

          Anyone think otherwise?
          Easy. Input lag. 1 second/60 frames = 16ms. 1/120 = 8ms. The difference is very noticeable, ask any decent musician or pro player.

          Why should I wait 16ms for my shot to fire when I can wait 8ms... I'll hear it and it'll get sent to the server sooner, even if I don't see it that fast.
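The frame-interval arithmetic above can be reproduced with a one-liner (awk is used here only for the floating-point division); the exact figures are closer to 16.7 ms and 8.3 ms:

```shell
# frame_ms: milliseconds between frames at a given refresh rate (1000 / Hz).
frame_ms() {
    awk -v fps="$1" 'BEGIN { printf "%.1f\n", 1000 / fps }'
}

frame_ms 60    # 16.7 ms per frame at 60 Hz
frame_ms 120   # 8.3 ms per frame at 120 Hz
```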

          Comment


          • #25
            As Michael said, the constant framerate is probably due to syscall overhead. Dave pushed a handful of things that should amortize that a bit, and there are still more optimizations that could be done.

            Comment


            • #26
              Thanks for providing some developer insight rather than just a fuck ton of numbers like the benchmarks usually are. This actually made for an interesting read.

              I have one suggestion, though: could you pick colours that are easier to distinguish? I have a very difficult time figuring out which line is which driver. Could be because I'm dichromatic (red/green colour blind). Maybe make one line a bright colour and the other a darker one.

              Comment


              • #27
                Originally posted by garytr24 View Post
                Easy. Input lag. 1 second/60 frames = 16ms. 1/120 = 8ms. The difference is very noticeable, ask any decent musician or pro player.

                Why should I wait 16ms for my shot to fire when I can wait 8ms... I'll hear it and it'll get sent to the server sooner, even if I don't see it that fast.
                I'm guessing the number of pro players here is pretty small.

                Comment


                • #28
                  Thanks for the article. It was a good read. I liked the insight it provided along with the numbers too which cleared up many questions I had about the performance.

                  Comment


                  • #29
                    Hey, that was pleasantly unexpected (the results).

                    When Gallium3D gets optimized (somewhere in the distant future, right?), will it then be faster than classic Mesa?

                    Comment


                    • #30
                      libdrm from git is also required

                      Originally posted by MostAwesomeDude View Post
                      When you build Mesa with --enable-gallium-radeon, you'll get a radeong_dri.so library.
                      There's also the small matter of needing libdrm to define:
                      Code:
                      #define RADEON_BO_FLAGS_MICRO_TILE_SQUARE 0x20
                      That's not in 2.4.18.
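To check whether an installed libdrm is new enough, one can grep its radeon header for that flag. This is a sketch; the header path below is a typical install location and an assumption, not guaranteed on every distro:

```shell
# has_square_tile_flag: report whether a given libdrm radeon header defines
# RADEON_BO_FLAGS_MICRO_TILE_SQUARE (not present in 2.4.18, per the post above).
has_square_tile_flag() {
    if grep -q 'RADEON_BO_FLAGS_MICRO_TILE_SQUARE' "$1" 2>/dev/null; then
        echo present
    else
        echo missing
    fi
}

# Typical header location (assumption; adjust for your distro):
has_square_tile_flag /usr/include/drm/radeon_drm.h
```

If it prints "missing", build libdrm from git before building Mesa with --enable-gallium-radeon.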

                      Comment
