Radeon 3D Performance: Gallium3D vs. Classic Mesa

  • #51
    I don't understand your logic, maybe because I'm clueless about these things. Let's see. I'm assuming that no matter how much you crank up the FPS, your screen caps what you actually see at its refresh rate, right? I'm also assuming that, barring brain injury on the developers' part, the physics simulation is decoupled from the display, i.e., input events and motion calculations are handled somewhere on a fixed clock, independent of what a particular player actually sees on screen. What you see displayed would be a discrete subset of the (also discrete) physics calculation. If this is true, I don't quite see how the FPS rate introduces any lag as long as it is equal to or above the refresh rate. What's flawed here?

    I'm not considering undesired effects such as FPS-dependent physics à la Quake 3.



    • #52
      I'm assuming that no matter how much you crank up the FPS your screen is capping what you actually see to the refresh rate, right?
      Only if you use vsync.

      If you don't, you'll get tearing, but you'll see things sooner, since at any given moment you're seeing a combination of several frames.

      Of course, the main benefit is for people who can actually show more than 60 Hz on their display.

      I'm also assuming that save for brain injury of the developers, the physics simulation is uncoupled from the display, i.e., input events and motion calculation is done somewhere using a fixed clock independent of what a particular player is actually seeing on the screen.
      AFAIK, no.

      And the Quake-related FPS-dependent physics are a result of this. QuakeLive uses floats for coordinates instead of rounding to integers, so the physics is FPS-independent again (no compounded errors).

      At least this is how I understand Quake-based games (I haven't looked at the source, but I've read this sort of discussion many times). This might not apply to all engines.



      • #53
        Right, I deliberately wasn't considering the Q3 case, since it has FPS-dependent physics (I think the intervals at which movement was calculated were taken straight from the frame rate). In any case, the result is undesirable: it doesn't make sense that different players experience different physics depending on their graphics cards. Again, I'm just assuming that in modern games physics is independent of frame rate.

        As for the tearing, I think this is a tricky one. It's already questionable whether a 16 ms lag between full frames has a measurable effect on gameplay. If we start talking about the effect of partial screen updates, things become even more dubious.



        • #54
          Well, the thing is that many of the engines out there are based on some Quake engine, whether Q1, Q2 or Q3, and they all share the physics bugs.

          The major point is not whether the physics is FPS-dependent, but whether input handling is. AFAIK, in single-threaded engines you have one big loop that renders a frame, collects the input, processes it, renders the next frame, and so on. Generally, you want to pass any player input to the physics engine as soon as it is detected. And this doesn't make the physics FPS-dependent, because you don't have to use fixed intervals: you know how much time has elapsed since the last iteration and can use that to calculate the new positions, motion vectors, etc. (the FPS-dependent physics were due to rounding errors, which have since been fixed).
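          The loop described above can be sketched like this (a generic variable-timestep illustration, not any particular engine's code; the frame durations are passed in explicitly so the idea can be checked):

```c
#include <assert.h>

typedef struct { double pos, vel; } body;

/* Advance the simulation by however much real time the last frame took. */
static void step_physics(body *b, double dt)
{
    b->pos += b->vel * dt;
}

/* The big single-threaded loop: one iteration per rendered frame.
 * frame_dt[i] is the measured duration of frame i in seconds. */
static double run_loop(body *b, const double *frame_dt, int n)
{
    for (int i = 0; i < n; i++) {
        /* collect_input() and process_input() would go here */
        step_physics(b, frame_dt[i]);
        /* render() would go here */
    }
    return b->pos;
}
```

          Because each step uses the measured elapsed time, covering one second in 2 slow frames or in 4 fast ones lands the body in exactly the same place.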

          With Doom 3 and the idTech 4 engine, the physics engine was capped at 60 Hz, so you're right -- physics updates will happen at 60 Hz regardless of your frame rate. But even then, if you can get your click in earlier, you might still catch the earlier 16 ms timeslot, which can make a difference.
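          A 60 Hz cap like that is commonly implemented with an accumulator (again a generic sketch, not the actual idTech 4 source): rendering runs as fast as it likes, but physics only steps in whole 1/60 s slices.

```c
#include <assert.h>

#define TICK (1.0 / 60.0)  /* fixed physics timestep: 60 Hz */

/* Feed in the duration of one rendered frame; run as many whole
 * physics ticks as have accrued and return how many ran. */
static int pump_physics(double *accumulator, double frame_time)
{
    int ticks = 0;
    *accumulator += frame_time;
    while (*accumulator >= TICK) {
        /* step_physics(TICK) would run here */
        *accumulator -= TICK;
        ticks++;
    }
    return ticks;
}
```

          Rendering at 120 FPS, only every other frame triggers a physics tick; at 30 FPS each frame triggers two, so the simulation rate stays at 60 Hz either way.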



          • #55
            Originally posted by bridgman View Post
            dfx, I don't understand your post. The 300g driver should work with 3xx through 5xx parts.

            The output you posted implies that you either haven't built the Gallium3D driver or at least aren't running it.
            and i don't understand yours... i'm building mesa with:

            Code:
            ./configure --prefix=/usr --build=x86_64-pc-linux-gnu --host=x86_64-pc-linux-gnu --mandir=/usr/share/man --infodir=/usr/share/info --datadir=/usr/share --sysconfdir=/etc --localstatedir=/var/lib --libdir=/usr/lib64 --with-x --enable-xcb --disable-motif --disable-debug --enable-glx-tls --with-driver=dri --with-x --with-dri-drivers=,swrast,radeon,r200,r300,r600 --enable-gallium-swrast --with-state-trackers=glx,egl,dri,vega,xorg --enable-gallium-svga --disable-gallium-intel --enable-gallium-radeon --disable-gallium-nouveau --disable-gallium-wmware --without-demos --disable-glut --enable-64-bit --disable-32-bit
            
            ...
            
                    Driver:          dri
                    OSMesa:          no
                    DRI drivers:     swrast radeon r200 r300 r600
                    DRI driver dir:  ${libdir}/dri
                    Use XCB:         yes
            
                    Gallium:         yes
                    Gallium dirs:    auxiliary drivers state_trackers
                    Target dirs:     
                    Winsys dirs:      xlib drm
                    Winsys drm dirs: vmware radeon swrast
                    Driver dirs:     softpipe failover trace identity svga r300
                    Trackers dirs:   glx egl dri vega xorg
            ...
            is it not enough? what should i do to "run it"?



            • #56
              Originally posted by MostAwesomeDude View Post
              When you build Mesa with --enable-gallium-radeon, you'll get a radeong_dri.so library. Symlink or rename it to r300_dri.so and put it in your LIBGL_DRIVERS_PATH environment variable. That's it!
              this is a pretty stupid way to "enable" anything 0_o. especially if you're using a "package manager" for these compilations, that "portage" thing.

              should i completely wipe out the r300_dri.so binary after compilation and use "manual" per-file\folder installation in the ebuild instead of the 'emake DESTDIR="${D}" install' directive, or spoil the system by hand? pretty stupid...



              • #57
                i did a work-around for that:
                Code:
                mkdir /usr/local/lib/dri && cd /usr/local/lib/dri
                ln -s /usr/lib/dri/radeong_dri.so r300_dri.so
                ln -s /usr/lib/dri/radeon_dri.so
                ln -s /usr/lib/dri/swrast_dri.so
                echo 'LIBGL_DRIVERS_PATH="/usr/local/lib/dri"' >> /etc/env.d/00local && env-update
                works fine now, thanks:
                Code:
                OpenGL vendor string: X.Org R300 Project
                OpenGL renderer string: Gallium 0.4 on RV515
                OpenGL version string: 2.1 Mesa 7.9-devel
                OpenGL shading language version string: 1.20
                but doing things that way is still pretty... you know



                • #58
                  Originally posted by dfx. View Post
                  i did a work-around for that:
                  Code:
                  mkdir /usr/local/lib/dri && cd /usr/local/lib/dri
                  ln -s /usr/lib/dri/radeong_dri.so r300_dri.so
                  ln -s /usr/lib/dri/radeon_dri.so
                  ln -s /usr/lib/dri/swrast_dri.so
                  echo 'LIBGL_DRIVERS_PATH="/usr/local/lib/dri"' >> /etc/env.d/00local && env-update
                  works fine now, thanks:
                  Code:
                  OpenGL vendor string: X.Org R300 Project
                  OpenGL renderer string: Gallium 0.4 on RV515
                  OpenGL version string: 2.1 Mesa 7.9-devel
                  OpenGL shading language version string: 1.20
                  but doing things that way is still pretty... you know
                  My workaround:
                  Code:
                  ./autogen.sh --prefix=/opt ...
                  make && make install
                  mkdir /opt/lib/dri/g/
                  ln -s /opt/lib/dri/radeong_dri.so /opt/lib/dri/g/r300_dri.so
                  ln -s /opt/lib/dri/swrast_dri.so /opt/lib/dri/g/swrast_dri.so
                  export LIBGL_DRIVERS_PATH=/opt/lib/dri/g/
                  And now if I recompile, r300g gets updated automatically.



                  • #59
                    kinda OT: Is Gallium3D going to replace Mesa, or will they coexist like ati(radeon) and radeonhd?



                    • #60
                      Gallium3d is not meant to replace Mesa, but to move card-specific stuff out of Mesa. It's a major rearrangement of code, basically.

                      The "classic Mesa" drivers will be around for older distros, but I'm not sure if they will be getting any major upgrades once Gallium3d takes over.

                      At least that's how I understood bridgman.

