
Radeon Gallium3D MSAA Mesa 10.1 Git Benchmarks

  • Radeon Gallium3D MSAA Mesa 10.1 Git Benchmarks

    Phoronix: Radeon Gallium3D MSAA Mesa 10.1 Git Benchmarks

    It's been a while since we last looked at the anti-aliasing performance of the R600 Gallium3D driver, so for this article we have fresh MSAA benchmarks of the driver from Mesa 10.1-devel using a Cayman-based high-end AMD Radeon graphics card.

  • #2
    SMAA needs to be implemented. It is much better than MSAA: it gives better image quality and doesn't use as many resources. The only problem is that it's currently only implemented for Windows, but the code is MIT licensed and available here: https://github.com/iryoku/smaa
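    For the curious, SMAA (like FXAA and MLAA) is a post-process pass, so hooking it into an engine is mostly plumbing: render the scene into a texture, then run the AA shader over a fullscreen triangle. A rough, hypothetical OpenGL sketch of that plumbing is below; compile_program() and render_scene() are placeholder names, and the real SMAA adds its edge-detection and blend-weight passes plus lookup textures on top of this.

    ```c
    /* Hypothetical sketch, not the SMAA reference code. Assumes a GL 3.x
     * context and an extension loader (e.g. GLEW) are already set up. */
    #include <GL/glew.h>

    extern GLuint compile_program(const char *vs, const char *fs); /* placeholder helper */
    extern void render_scene(void);                                /* placeholder engine call */

    static GLuint scene_fbo, scene_tex, aa_prog, empty_vao;

    void init_aa_pass(int w, int h)
    {
        /* Offscreen color target the scene is rendered into. */
        glGenTextures(1, &scene_tex);
        glBindTexture(GL_TEXTURE_2D, scene_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        glGenFramebuffers(1, &scene_fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, scene_fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, scene_tex, 0);

        /* Fullscreen-triangle vertex shader plus the AA fragment shader. */
        aa_prog = compile_program("fullscreen_tri.vert", "aa_pass.frag");
        glGenVertexArrays(1, &empty_vao);
    }

    void draw_frame(int w, int h)
    {
        /* 1. Render the scene, aliased, into the offscreen texture. */
        glBindFramebuffer(GL_FRAMEBUFFER, scene_fbo);
        glViewport(0, 0, w, h);
        render_scene();

        /* 2. Run the AA shader over a fullscreen triangle into the backbuffer. */
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glUseProgram(aa_prog);
        glBindTexture(GL_TEXTURE_2D, scene_tex);
        glBindVertexArray(empty_vao);
        glDrawArrays(GL_TRIANGLES, 0, 3); /* vertex positions generated in the vertex shader */
    }
    ```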

    • #3
      Originally posted by siavashserver
      SMAA, MLAA, FXAA, etc. are no business of drivers; they should be implemented on the graphics engine side.
      Nonsense; next you'll tell me OpenGL/Direct3D are no business of drivers because they should be implemented on graphics engines. These are graphics technologies -- they belong in drivers.

      • #4
        Originally posted by mmstick View Post
        Nonsense; next you'll tell me OpenGL/Direct3D are no business of drivers because they should be implemented on graphics engines. These are graphics technologies -- they belong in drivers.
        MSAA is a hardware component of the GPU that must be exposed by the driver. There are many ways to do that, but OpenGL and Direct3D are the industry standards.

        SMAA, MLAA, etc. are software algorithms that can be implemented on top of OpenGL and Direct3D. They exist at a different level.

        You could argue that SMAA is useful enough that it should be exposed universally through the driver. I would even agree with that. But equating SMAA with OpenGL/Direct3D is simply nonsense.
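        To make that distinction concrete: an application asks the driver for MSAA through the standard API, for example at context-creation time. A minimal sketch using SDL2 (my own choice of windowing library here, error handling omitted):

        ```c
        #include <SDL2/SDL.h>
        #include <SDL2/SDL_opengl.h>

        int main(void)
        {
            SDL_Init(SDL_INIT_VIDEO);

            /* Ask the driver for a 4x multisampled default framebuffer;
             * the per-sample work and resolve are done by the GPU/driver. */
            SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
            SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);

            SDL_Window *win = SDL_CreateWindow("msaa demo",
                                               SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                               800, 600, SDL_WINDOW_OPENGL);
            SDL_GLContext ctx = SDL_GL_CreateContext(win);

            glEnable(GL_MULTISAMPLE); /* already on by default in most GL contexts */

            /* ... render loop would go here ... */

            SDL_GL_DeleteContext(ctx);
            SDL_DestroyWindow(win);
            SDL_Quit();
            return 0;
        }
        ```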

        • #5
          FXAA, TXAA, SMAA... it's just blur IMHO; it kills all the detail. I don't like it.

          MSAA or SSAA is way better.

          • #6
            Originally posted by rudl View Post
            FXAA, TXAA, SMAA... it's just blur IMHO; it kills all the detail. I don't like it.

            MSAA or SSAA is way better.
            FXAA and MLAA maybe, but SMAA looks better than MSAA, and I was unable to tell the difference between SSAA and SMAA other than SMAA having much higher framerates. Why the industry hasn't adopted SMAA yet is beyond me, considering the only way to use it on Windows is to inject it into your game's DLLs.

            • #7
              Originally posted by mmstick View Post
              Nonsense; next you'll tell me OpenGL/Direct3D are no business of drivers because they should be implemented on graphics engines. These are graphics technologies -- they belong in drivers.
              siavashserver is right. You are wrong. Applications are responsible for implementing other AA modes if necessary. Applications are also responsible for making use of MSAA such that it actually improves visual quality. 3D drivers cannot help there, because the drivers don't know how the rendering works and where the AA should be applied to make it work. That's why I advise everybody not to use environment variables that force MSAA and other AA modes: in 50% of cases it will make no visual difference, but it will still decrease performance.

              Drivers can only expose AA features; applications also have to properly integrate them into their rendering pipelines. More and more games are starting to use deferred rendering and other techniques where MSAA is completely useless, and that's why MLAA and other techniques have been developed - to add AA to graphics engines where MSAA or even SSAA cannot be used. However, the other AA techniques are mostly blur-based anyway (like MLAA and FXAA), so they cannot be better than SSAA/MSAA in general. They are only used in apps as a last resort where no other AA is applicable.
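              To illustrate the integration point: once an engine renders offscreen, it has to allocate multisampled attachments and resolve them at the right place in its own pipeline; the driver cannot retrofit that. A hypothetical forward-rendering sketch in OpenGL 3.x (render_scene() is a placeholder; a deferred renderer would instead have to resolve or shade per sample for every G-buffer attachment, which is where plain MSAA stops being practical):

              ```c
              /* Hypothetical sketch; assumes a GL 3.x context and an
               * extension loader (e.g. GLEW) are already set up. */
              #include <GL/glew.h>

              extern void render_scene(void); /* placeholder engine call */

              static GLuint ms_fbo, ms_color, ms_depth;
              static const int samples = 4, w = 1920, h = 1080;

              void init_msaa_target(void)
              {
                  glGenRenderbuffers(1, &ms_color);
                  glBindRenderbuffer(GL_RENDERBUFFER, ms_color);
                  glRenderbufferStorageMultisample(GL_RENDERBUFFER, samples, GL_RGBA8, w, h);

                  glGenRenderbuffers(1, &ms_depth);
                  glBindRenderbuffer(GL_RENDERBUFFER, ms_depth);
                  glRenderbufferStorageMultisample(GL_RENDERBUFFER, samples,
                                                   GL_DEPTH24_STENCIL8, w, h);

                  glGenFramebuffers(1, &ms_fbo);
                  glBindFramebuffer(GL_FRAMEBUFFER, ms_fbo);
                  glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                            GL_RENDERBUFFER, ms_color);
                  glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                                            GL_RENDERBUFFER, ms_depth);
              }

              void draw_frame(void)
              {
                  /* Render into the multisampled attachments... */
                  glBindFramebuffer(GL_FRAMEBUFFER, ms_fbo);
                  render_scene();

                  /* ...then resolve the samples into the window framebuffer. */
                  glBindFramebuffer(GL_READ_FRAMEBUFFER, ms_fbo);
                  glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
                  glBlitFramebuffer(0, 0, w, h, 0, 0, w, h,
                                    GL_COLOR_BUFFER_BIT, GL_NEAREST);
              }
              ```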

              • #8
                Originally posted by siavashserver
                SMAA, MLAA, FXAA, etc. are no business of drivers; they should be implemented on the graphics engine side.
                In an ideal world, yes.

                However, in the real world there are thousands of binary-only applications that will never get any updates and don't support any AA mode.

                • #9
                  Originally posted by mmstick View Post
                  SMAA needs to be implemented. It is much better than MSAA: it gives better image quality and doesn't use as many resources. The only problem is that it's currently only implemented for Windows, but the code is MIT licensed and available here: https://github.com/iryoku/smaa
                  I have seen SMAA, temporal AA and FXAA in Tesseract, which runs on all major OSes.

                  • #10
                    Originally posted by marek View Post
                    graphics engines where MSAA or even SSAA cannot be used.
                    Are you sure there are engines that cannot use SSAA in principle? I understand that the folks who strive for maximum visual quality keep requesting forced SSAA as a driver feature (only to be repeatedly turned down by AMD).
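                    For what it's worth, brute-force SSAA is at least mechanically possible in any engine that renders to an offscreen target: draw at a higher resolution and filter down. A hypothetical 2x2 sketch (depth attachment omitted, GL_LINEAR blit is only a crude box filter, and render_scene() is a placeholder):

                    ```c
                    /* Hypothetical 2x2 SSAA sketch; assumes a GL 3.x context
                     * and an extension loader (e.g. GLEW). */
                    #include <GL/glew.h>

                    extern void render_scene(void); /* placeholder engine call */

                    static const int win_w = 1920, win_h = 1080;
                    static const int ss_w = 2 * 1920, ss_h = 2 * 1080; /* 2x in each dimension */
                    static GLuint ss_fbo, ss_tex;

                    void init_ssaa_target(void)
                    {
                        glGenTextures(1, &ss_tex);
                        glBindTexture(GL_TEXTURE_2D, ss_tex);
                        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, ss_w, ss_h, 0,
                                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

                        glGenFramebuffers(1, &ss_fbo);
                        glBindFramebuffer(GL_FRAMEBUFFER, ss_fbo);
                        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                               GL_TEXTURE_2D, ss_tex, 0);
                    }

                    void draw_frame(void)
                    {
                        /* Render at 2x the window resolution... */
                        glBindFramebuffer(GL_FRAMEBUFFER, ss_fbo);
                        glViewport(0, 0, ss_w, ss_h);
                        render_scene();

                        /* ...then filter down to the window. */
                        glBindFramebuffer(GL_READ_FRAMEBUFFER, ss_fbo);
                        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
                        glBlitFramebuffer(0, 0, ss_w, ss_h, 0, 0, win_w, win_h,
                                          GL_COLOR_BUFFER_BIT, GL_LINEAR);
                    }
                    ```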
