MSAA RadeonSI Gallium3D Performance Preview


  • #11
    Hi,

    First of all, MSAA works really well and the article is WRONG. Michael was using the GALLIUM_MSAA variable in a way that couldn't have worked. If you enable MSAA in the game and also use the variable, both configs will conflict and you will see garbage (as can be seen in the article). Either turn off MSAA in the game and use the variable (if it has any effect), or turn on MSAA in the game and don't use the variable.

    Another misconception in the article is that GALLIUM_MSAA enables MSAA. This isn't true. It only forces an MSAA visual for the window. Depending on the game, this may or may not have any effect. Unfortunately, it doesn't have any effect on the Source engine (Team Fortress 2, etc.). Even if GALLIUM_MSAA doesn't do anything, there still is a performance hit and wasted VRAM. Whether GALLIUM_MSAA has any effect depends on how games do rendering and drivers cannot do anything about it.
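
    For illustration, here is roughly what "forcing an MSAA visual" refers to: an application that wants window-level multisampling normally asks for it via GLX itself, as in the standalone sketch below (standard GLX 1.3/1.4 calls; the little test program is made up for this post, it is not driver code). GALLIUM_MSAA just imposes such a visual on apps that never request one; it cannot reach the render targets an app creates internally.

    Code:
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        /* Ask the GLX layer for a window config with 4 samples per pixel. */
        const int attribs[] = {
            GLX_RENDER_TYPE,    GLX_RGBA_BIT,
            GLX_DRAWABLE_TYPE,  GLX_WINDOW_BIT,
            GLX_DOUBLEBUFFER,   True,
            GLX_SAMPLE_BUFFERS, 1,   /* want a multisampled visual ... */
            GLX_SAMPLES,        4,   /* ... with 4 samples per pixel   */
            None
        };

        int count = 0;
        GLXFBConfig *configs =
            glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);

        printf("4x MSAA-capable window configs found: %d\n", count);

        if (configs)
            XFree(configs);
        XCloseDisplay(dpy);
        return 0;
    }

    (Build with something like: gcc msaa_visual.c -lX11 -lGL)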

    If this is too confusing, I'll just remove the GALLIUM_MSAA override. What do you think?



    • #12
      Originally posted by FutureSuture View Post
      RadeonSI works with the AMD Radeon R9 290X now? And with 8x MSAA at that? How the times have changed!
      Yes :-) I even get the dpm reclocking, but others have problems with that (I have a non-reference Sapphire Tri-X model with a different VBIOS). HL2 E2 is very playable with 8x MSAA and all settings maxed.

      And yes, you need kernel 3.17, new ucode, and recent libdrm, Mesa and xf86-video-ati ... and I also use LLVM from git.


      Originally posted by marek View Post
      If this is too confusing, I'll just remove the GALLIUM_MSAA override. What do you think?
      I'd remove it. I mean, I like MSAA, but if it doesn't work in all cases or may lead to screen corruption, I'd rather go the safe route. Stability and solidity are more important to me. Just my 2c of course ;-)
      Last edited by mazumoto; 16 September 2014, 01:25 PM.



      • #13
        Basically all modern games render to an internal render target and blit it to the window surface after rendering. So this is a feature that is only useful/meaningful for older games that don't support MSAA as an in-game setting.
        Also, on Linux it is a bit of a pain to switch resolutions on the fly, which is why libraries like SDL2 use an internal render target and then blit to the native desktop resolution to avoid that "pain".
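
        A rough sketch of that render-to-texture-then-blit pattern (generic GL 3.0+ code, not taken from any particular engine; it assumes a current context, and the function names are made up):

        Code:
        #include <GL/gl.h>      /* build with -DGL_GLEXT_PROTOTYPES on Mesa */
        #include <GL/glext.h>

        /* Once at startup: the game's own, non-window render target. */
        static GLuint make_internal_target(int w, int h, GLuint *color_tex)
        {
            GLuint fbo;
            glGenTextures(1, color_tex);
            glBindTexture(GL_TEXTURE_2D, *color_tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);

            glGenFramebuffers(1, &fbo);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, *color_tex, 0);
            return fbo;
        }

        /* Every frame: draw into the FBO, then blit to the window.  The
         * window's (possibly forced-multisampled) buffer is only ever the
         * destination of this blit, so GALLIUM_MSAA gains nothing here. */
        static void draw_frame(GLuint fbo, int w, int h)
        {
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glViewport(0, 0, w, h);
            glClear(GL_COLOR_BUFFER_BIT);
            /* ... scene rendering goes here ... */

            glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
            glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);  /* 0 = window surface */
            glBlitFramebuffer(0, 0, w, h, 0, 0, w, h,
                              GL_COLOR_BUFFER_BIT, GL_NEAREST);
        }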

        Originally posted by marek View Post
        If this is too confusing, I'll just remove the GALLIUM_MSAA override. What do you think?
        Maybe give it a more brutally descriptive name? Like GALLIUM_FORCE_MSAA_WINDOW_SURFACE



        • #14
          Originally posted by marek View Post
          If this is too confusing, I'll just remove the GALLIUM_MSAA override. What do you think?
          Remove it. Or just explain that it is _only_ for forcing MSAA on apps that don't know about it and don't do anything with it themselves; otherwise you can break an app.



          • #15
            2 more things:

            1) The screenshot with random garbage shown in the article can be seen with all Gallium drivers, not just radeonsi.

            2) radeonsi doesn't support 6x MSAA. It can only do 2x, 4x, 8x. If you ask for 6x, you'll get 8x.
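
            In other words, a requested count is rounded up to the nearest supported one, conceptually something like this (an illustrative sketch only, not the actual radeonsi source; the function name is made up):

            Code:
            /* Map a requested MSAA sample count to one radeonsi can do. */
            static int pick_sample_count(int requested)
            {
                if (requested <= 1)
                    return 1;    /* no multisampling */
                if (requested <= 2)
                    return 2;
                if (requested <= 4)
                    return 4;
                return 8;        /* 5x and up (e.g. 6x) become 8x in this sketch */
            }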



            • #16
              Originally posted by marek View Post
              2 more things:
              1) The screenshot with random garbage shown in the article can be seen with all Gallium drivers, not just radeonsi.
              Actually I can't get corruption, at least with HL2 and Portal. With current git it is OK for me, but with the 10.2 branch the app just freezes if I combine the env variable with the in-game setting, again with no corruption. Maybe it broke DRI2 FBOs somehow, or it only appears with some compositors, something like that.

              I will double-check this, but for now it does not happen for me with current git.



              • #17
                Originally posted by dungeon View Post

                I will double-check this, but for now it does not happen for me with current git.
                So I double-checked: the env value must match the number of samples set in the game... 2x as env and 2x in-game works, and other values too; it only produces a black screen or freezes the game if the env value and the in-game value do not match (e.g. 2x env but 4x in the game, etc.)



                • #18
                  Originally posted by marek View Post
                  Another misconception in the article is that GALLIUM_MSAA enables MSAA. This isn't true. It only forces an MSAA visual for the window. Depending on the game, this may or may not have any effect. Unfortunately, it doesn't have any effect on the Source engine (Team Fortress 2, etc.). Even if GALLIUM_MSAA doesn't do anything, there still is a performance hit and wasted VRAM. Whether GALLIUM_MSAA has any effect depends on how games do rendering and drivers cannot do anything about it.

                  If this is too confusing, I'll just remove the GALLIUM_MSAA override. What do you think?
                  How about printing a warning to the console whenever the option is enabled? Something like:

                  "Warning: GALLIUM_MSAA forces a MSAA visual for the window only. This will have no effect on most modern games other than wasting performance and may cause corruption. Do not use if the application provides a setting to allow MSAA."
                  Last edited by smitty3268; 17 September 2014, 02:34 AM.



                  • #19
                    I agree, this seems better than removing the variables altogether.

                    Or maybe add something like glxgears' "iacknowledgethatthistoolisnotabenchmark", but for MSAA...
