Marek Lands Radeon Gallium3D MSAA Changes

  • #11
    so we basically have the same setup... reboot? ;-)



    • #12
      Originally posted by ChrisXY
      Code:
         --with-dri-driverdir=/usr/lib/xorg/modules/dri \
         --with-gallium-drivers=r600 \
         --with-dri-drivers= \
         --with-egl-platforms=x11,drm,wayland \
         --enable-texture-float \
         --enable-gles1 \
         --enable-gles2 \
         --enable-egl \
         --enable-xorg \
         --enable-xa \
         --enable-vdpau \
         --enable-gallium-g3dvl \
         --enable-glx-tls \
         --enable-glu \
         --enable-gbm \
         --enable-gallium-gbm \
         --enable-shared-glapi \
         --enable-xorg \
         --enable-gallium-llvm \
         --enable-openvg \
         --enable-gallium-egl \
         --enable-osmesa \
         --enable-r600-llvm-compiler
      Silly question, but do you possibly have multiple versions of Mesa + the drivers installed on the system? Maybe the distro-provided version is being loaded?
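One quick way to check for a stale or duplicate Mesa install is to ask the loader directly. A sketch, assuming typical distro library paths (adjust for your system); `LIBGL_DEBUG=verbose` makes Mesa print which DRI driver file it actually opens:

```shell
# Look for stray copies of the r600 Gallium driver and libGL; a distro
# package plus a /usr/local build is a common way to load the wrong one.
find /usr/lib /usr/lib64 /usr/local/lib -name 'r600_dri.so' 2>/dev/null
find /usr/lib /usr/lib64 /usr/local/lib -name 'libGL.so*' 2>/dev/null

# Ask Mesa's loader which DRI driver it opens and which renderer is live:
LIBGL_DEBUG=verbose glxinfo 2>&1 | grep -i -e 'libgl' -e 'renderer' || true
```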



      • #13
        Originally posted by Veerappan
        Silly question, but do you possibly have multiple versions of Mesa + the drivers installed on the system? Maybe the distro-provided version is being loaded?
        Pretty sure it's the current one. How would I test MSAA besides compiling and running piglit myself?
        FurMark in Wine crashes instantly; in TrackMania Nations in Wine the anti-aliasing doesn't work (even though I think it should use MSAA), and Xonotic doesn't succeed in setting anti-aliasing...
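A couple of lighter-weight checks than a full piglit run, sketched from a 2012-era piglit layout (the test-name filter and script paths may differ in your checkout):

```shell
# Does the driver advertise multisampling at all? glxinfo -l dumps the
# GL limits, including GL_MAX_SAMPLES where supported.
glxinfo -l 2>/dev/null | grep -i 'MAX_SAMPLES' || true

# Run only the multisample tests instead of the whole suite:
./piglit-run.py -t ext_framebuffer_multisample tests/quick.tests results/msaa || true
```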



        • #14
          Originally posted by Veerappan
          Silly question, but do you possibly have multiple versions of Mesa + the drivers installed on the system? Maybe the distro-provided version is being loaded?
          N/M on my own question... It's still probably a valid question, but I've got the following:

          Code:
          OpenGL renderer string: Gallium 0.4 on AMD BARTS
          OpenGL version string: 2.1 Mesa 8.1-devel (git-565a4e2)
          OpenGL shading language version string: 1.30
          Once netbeans stops freaking out on me, I'll see if I can step my debugger through the logic and see what's wrong...
          Last edited by Veerappan; 15 August 2012, 08:08 PM.



          • #15
            You need the kernel from Linus's tree (I think he merged the MSAA patch just yesterday).

            Also, MSAA can only be used through OpenGL framebuffer objects; Gallium doesn't implement MSAA through the GLX configs yet (that will only require core Gallium work, though).
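The FBO path marek describes looks roughly like this. A C-style sketch using the standard GL3 / EXT_framebuffer_multisample entry points, not a complete program: context creation, error checks, and the draw calls are omitted, and `width`/`height` are placeholders:

```c
/* Sketch: render into a multisampled FBO, then resolve to the window. */
GLuint fbo, color_rb, depth_rb;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* Multisampled color attachment (4x). */
glGenRenderbuffers(1, &color_rb);
glBindRenderbuffer(GL_RENDERBUFFER, color_rb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, color_rb);

/* Matching multisampled depth attachment. */
glGenRenderbuffers(1, &depth_rb);
glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT24,
                                 width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depth_rb);

/* ... draw the scene into the FBO ... */

/* Resolve (downsample) into the default framebuffer for display. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
```

The resolve via `glBlitFramebuffer` is the step a plain multisampled GLX visual would normally do for you, which is why games that only request an AA visual don't get MSAA on this path yet.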



            • #16
              Originally posted by marek
              You need the kernel from Linus's tree (I think he merged the MSAA patch just yesterday).

              Also, MSAA can only be used through OpenGL framebuffer objects; Gallium doesn't implement MSAA through the GLX configs yet (that will only require core Gallium work, though).
              Thanks for the insight. I'll wait for 3.6.0rc2 and give it another shot (on 3.6.0rc1 currently).



              • #17
                marek or Veerappan, could you give me some tips on debugging the PRIME code in kernel 3.5/3.6? The code doesn't seem to like my 4850X2 at all.

                Things I noticed:

                1.) Regnum Online on kernel 3.4.5 runs at a stable 100 fps, but on 3.5 or 3.6 it constantly flips between 10-ish and 120 fps.
                2.) Hard lockups at random (which is why I don't know how to debug this).
                3.) Other subsystems crash on me (my Realtek wifi is a common victim) when I use the r600g driver, but if I boot with nomodeset and force the llvmpipe renderer there are no crashes at all (weird).

                If you ask me, it looks like one part of the code thinks my 4850X2 is a single GPU, while another part randomly tries to render on the second GPU and hand the result to the first; eventually it tries to read an empty framebuffer on the first GPU, because the data is actually in the second GPU's RAM, and bang (a very empirical deduction).

                I use Gentoo, so I can recompile anything you need; I just need to know what to do. It stresses me to no end to be stuck on an ancient 3.4 kernel (live git or nothing, hahaha).



                • #18
                  Bisect the kernel between 3.4 and 3.6?



                  • #19
                    Well, not very useful, because PRIME/dma-buf isn't in 3.4; it landed in 3.5, and from 3.5 up to 3.6 my X2 is unusable, lasting five minutes tops without a hard lockup.

                    The issue seems to be that my X2 card is treated like an Optimus setup, or the code simply can't handle two GPUs on one PCB, I don't know.

                    I know the issue is the PRIME commit. What I need is a way to see what the GPU is doing when it locks up, so that one of the gurus can fix it given enough debug info (my kernel C kung fu is weak and might kill kittens).
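Not a fix, but one way to gather the kind of debug info mentioned above: the DRM layer has a `drm.debug` module parameter (a bitmask; 0x02 covers driver messages, 0x04 KMS), and for hard lockups the last messages usually only survive if sent off-machine via netconsole or a serial console. A sketch:

```shell
# Boot with verbose DRM logging, e.g. add to the kernel command line:
#   drm.debug=6 log_buf_len=1M
# or enable it at runtime (as root):
#   echo 6 > /sys/module/drm/parameters/debug

# Then follow the kernel log while reproducing the hang; for a hard
# lockup that takes the whole box down, ship the log out over netconsole
# or a serial console instead of relying on local files.
dmesg | tail -n 30 || true
```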



                    • #20
                      Originally posted by jrch2k8
                      Well, not very useful, because PRIME/dma-buf isn't in 3.4; it landed in 3.5, and from 3.5 up to 3.6 my X2 is unusable, lasting five minutes tops without a hard lockup.

                      The issue seems to be that my X2 card is treated like an Optimus setup, or the code simply can't handle two GPUs on one PCB, I don't know.

                      I know the issue is the PRIME commit. What I need is a way to see what the GPU is doing when it locks up, so that one of the gurus can fix it given enough debug info (my kernel C kung fu is weak and might kill kittens).
                      Unfortunately, I don't think I'll be of much help. My experience is more on the user-land side of OpenCL, and I don't have any hardware that I can really use to reproduce the issue.
