Catalyst 8.6 + Cedega = graphics corruption


  • #61
    This seems to work, yes.

    Removing the fglrx-glx/fglrx-glx-ia32 packages on Debian worked (I had to reinstall libgl1-mesa-glx afterwards; removing the packages didn't restore the dpkg diversions).

    LIBGL_DEBUG=verbose also reports that it's using the fglrx_dri.so driver, so acceleration should be working correctly.

    Curiously, this also fixes the issue with switching workspaces while a fullscreen OpenGL app is running.

    (edit): If you remove fglrx-glx-ia32, you need to make a manual copy of fglrx_dri.so for 32-bit apps.
    Last edited by maligor; 08-05-2008, 10:25 AM.
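The package-level fix described above can be sketched as a shell session. This is a sketch only, assuming a Debian system with the fglrx packages installed; the package names are the ones given in the post, and the glxinfo check is one way to do the LIBGL_DEBUG verification the poster mentions:

```shell
# Remove the fglrx GLX wrapper packages (sketch; package names from the post)
sudo apt-get remove fglrx-glx fglrx-glx-ia32

# Reinstall Mesa's libGL, which the removal above can leave broken
sudo apt-get install --reinstall libgl1-mesa-glx

# Verify that direct rendering still goes through the fglrx DRI driver
LIBGL_DEBUG=verbose glxinfo 2>&1 | grep -i fglrx_dri
```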



    • #62
      No, it doesn't do the trick on Arch Linux. I uninstalled fglrx and installed mesa-libgl, copied its libGL to a safe location, and reinstalled Catalyst. Then I tried to use the Mesa libGL.so, both with LD_PRELOAD and by copying it over the Catalyst libGL. Still corruption.

      Switching display mode with Ctrl-Alt-+, running Wine, then (still within Wine) Ctrl-Alt--, works. No corruption then, and none after playing and going back to the original desktop. Weird.
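The LD_PRELOAD attempt mentioned above would look something like the following. The library path and the game executable are illustrative only; the post does not say which paths were used:

```shell
# Force one process to load a specific libGL instead of the system one
# (path and game.exe are examples, not from the post)
LD_PRELOAD=/usr/lib/mesa/libGL.so.1 wine game.exe
```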



      • #63
        Same problem with openSUSE 11.0 32-bit, Catalyst 8.7, and a Radeon HD 3850.



        • #64
          dscharrer - THANK YOU!!!!!!!!
          Your solution fixed everything for me here!
          After updating to 8.6 from the Mandriva repos (by accident), Cedega couldn't run Warcraft III (screen corruption), switching Urban Terror to windowed mode would cause corruption, and so on. Copying the Mesa libGL.so into the fglrx folder fixed everything!



          • #65
            OK, I tried this again: I downloaded ATI's installer and installed 8.7. Before copying the file, the corruption was there as usual. After I copied /usr/lib/mesa/libGL.so.1 to /usr/lib, it worked!
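The file copy described above can be sketched as follows. The source and destination paths are the ones given in the post; the backup step is an extra precaution, not something the poster mentions:

```shell
# Keep a backup of the fglrx-shipped libGL so the change can be undone
sudo cp /usr/lib/libGL.so.1 /usr/lib/libGL.so.1.fglrx-backup

# Replace it with Mesa's libGL, as described in the post
sudo cp /usr/lib/mesa/libGL.so.1 /usr/lib/libGL.so.1

# Make sure the dynamic linker cache picks up the change
sudo ldconfig
```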



            • #66
              My only question in this regard would be about performance. I'm sure ATI ships its own libGL.so for some optimizations or similar. What about performance with the Mesa libGL.so instead of the fglrx-shipped one?



              • #67
                Originally posted by Thetargos:
                My only question in this regard would be about performance. I'm sure ATI ships its own libGL.so for some optimizations or similar. What about performance with the Mesa libGL.so instead of the fglrx-shipped one?
                Well, it looks the same to me (same fast performance), although I've noticed the slight graphical glitches in Cedega with Warcraft III are gone.
                And my glxgears score has risen from about 3300 FPS to about 4500 FPS (I have an ATI X1650 Pro).
                Thanks again to dscharrer; now I can feel easy about upgrading to the newest driver release, with a hotfix in hand in case something goes wrong :P



                • #68
                  I will have to try this workaround...

                  @Bridgman:
                  What would be the impact of using the Mesa libGL instead of the shipped fglrx libGL with fglrx? Would any features be missing, or would there be performance issues?



                  • #69
                    OK, I have tried this workaround for the corruption with Wine, and since I run a 64-bit system, I did it for both /usr/lib and /usr/lib64. Older games like StarCraft now run much better, and with some of the tricks to speed them up (for instance, forcing OpenGL rendering for DirectDraw) they are orders of magnitude faster than before with libGL.so.1.2 (through the libGL.so.1 symlink). However, I lost the ability to run Compiz, even though glxinfo does report that GLX_EXT_texture_from_pixmap is present.
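On a 64-bit system the same copy has to be done for both library directories, as described above. A sketch, assuming the Mesa copies live under a mesa/ subdirectory in each library directory (the post does not give exact paths):

```shell
# Replace both the 32-bit and 64-bit copies on a 64-bit system
# (mesa/ subdirectory layout is an assumption, not from the post)
for libdir in /usr/lib /usr/lib64; do
    sudo cp "$libdir/libGL.so.1" "$libdir/libGL.so.1.fglrx-backup"
    sudo cp "$libdir/mesa/libGL.so.1" "$libdir/libGL.so.1"
done
sudo ldconfig
```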



                    • #70
                      Originally posted by Thetargos:
                      OK, I have tried this workaround for the corruption with Wine, and since I run a 64-bit system, I did it for both /usr/lib and /usr/lib64. Older games like StarCraft now run much better, and with some of the tricks to speed them up (for instance, forcing OpenGL rendering for DirectDraw) they are orders of magnitude faster than before with libGL.so.1.2 (through the libGL.so.1 symlink). However, I lost the ability to run Compiz, even though glxinfo does report that GLX_EXT_texture_from_pixmap is present.
                      Hmm, I didn't try Compiz; I will give it a try today if I'm free.
                      For me, gaming is much more important than desktop effects.

                      EDIT: I tried it now; here is the error it gives:
                      Code:
                      [ahmad@localhost ~]$ compiz --replace
                      compiz (core) - Fatal: No GLXFBConfig for default depth, this isn't going to work.
                      compiz (core) - Error: Failed to manage screen: 0
                      compiz (core) - Fatal: No manageable screens found on display :0.0



                      • #71
                        Hmm... Is there an equivalent in the ATI drivers for forcing a mode? Something like the environment variables for nVidia?



                        • #72
                          I hope this is fixed in the 8.8 release.
                          I get the same corruption as everybody else, on Arch Linux with a Radeon 9800 Pro (with both 8.6 and 8.7; 8.5 worked fine).
                          And with fullscreen OpenGL games, if you switch workspaces the game doesn't go away and even keeps updating, but it doesn't respond to mouse or keyboard input until you switch back to the workspace it was on.



                          • #73
                            Do you really think they'll fix anything? You're lucky when they don't add a new bug!



                            • #74
                              I think I found a workaround on Arch Linux:

                              http://bbs.archlinux.org/viewtopic.p...395056#p395056

                              http://www.thinkwiki.org/wiki/Proble...ted_3D_display

                              From the second link:

                              2/ As suggested by ATI support, edit /etc/X11/xorg.conf and find the "Display" section. Add the following line to the "Display" section:

                              Code:
                              Virtual   <width> <height>
                              where <width> is the width of your screen in pixels rounded up to the next multiple of 64, and <height> is the height of your screen in pixels. For example, if your native resolution is 1400x1050, use
                              Code:
                              Virtual 1408 1050
                              After starting the X server you can run xrandr -s 0 to restore the native display resolution, and 3D rendering will still work.
                              I also use the startup script for playing WoW in its own X server: http://gentoo-wiki.com/HOWTO_Install...Startup_Script. Specifically, since I have a 1024x768 screen, I used
                              Code:
                              Virtual 1048 768
                              Once I'm in an X session, I run
                              Code:
                              xrandr -s 0
                              and then execute the script.

                              It's working great so far on Catalyst 8.7.
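The rounding rule quoted above (width up to the next multiple of 64) is plain integer arithmetic. As a sketch, with round_up_64 being a hypothetical helper name, not something from the thread:

```shell
# round_up_64 rounds a pixel width up to the next multiple of 64,
# as the quoted Virtual-line workaround requires (hypothetical helper)
round_up_64() {
    echo $(( ($1 + 63) / 64 * 64 ))
}

round_up_64 1400   # prints 1408 (so: Virtual 1408 1050 for 1400x1050)
round_up_64 1024   # prints 1024 (1024 is already a multiple of 64)
```

Note that by the quoted rule a 1024-wide screen needs no rounding at all, so the poster's "Virtual 1048 768" provides more padding than strictly required.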



                              • #75
                                That sounds great, but using xrandr immediately messes things up again, just like in the post from the Arch forums.
                                ATI had better not have this problem in 8.8...

