
Open source "Radeon" driver: vsync, vblank_mode, tearing and (open)GL


  • #1

    I'd like to continue the discussion about the open Ati driver and GL/vsync support here. Previously it was discussed in the 'Closed Drivers' section (link), but it does not belong there. My top post:


    ----------------------------------------------------------------
    I am under the impression there is an issue w/ Ati cards and the open Xorg 'Radeon' driver when it comes to GL.

    If you want to use GL (video) output (example: mplayer -vo gl) then video stutters, tears etc. See this bug report on bugs.freedesktop.org. You always get an error about "vblank_mode" (a DRI setting GL uses), vsync etc. Why is vertical synchronisation (if that really is what it is called) so important for GL, and why has it been broken in Linux's open and proprietary drivers since 2001?

    There are many topics and wikis all over the 'net about this, but no solution or even a workaround. Another problem is that the drivers changed recently and sometimes I can't tell whether people are talking about the open or the closed drivers...

    The only solution some give is: "buy an Nvidia card". That's pretty much impossible for a laptop, and it isn't a solution anyway, more of a strange workaround.

    What exactly is the problem and is there any chance of it ever being addressed/solved?
    ---------------------------------------------------------------------



    Another post:
    --------------------------------------------------------------------
    The specific problem I have is as follows. I also use my PC as a media center. I like to use XBMC for that, which can only use GL as video out (vo). I use Metacity as my window manager then, without compositing. I now use an Ati Radeon 9600 (RV350 chip) video card; I used to have an Nvidia card before, which worked just fine w/ XBMC. The old Ati card is not supported by the proprietary 'fglrx' driver from Ati/AMD, so I use the (excellent) open Xorg driver that's simply called "radeon". I activate the TV-out on the Radeon to watch my movies and DVDs, of course. No clone mode: I disable the CRT monitor (xrandr --output VGA-0 --off).
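    (Roughly, switching to the TV looks like this on my machine; the output names "S-video" and "VGA-0" come from my own xrandr output and may differ on other setups.)
    Code:
    xrandr --output S-video --auto   # enable the TV encoder
    xrandr --output VGA-0 --off      # turn off the CRT monitor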

    Using XBMC to watch movies is more or less the same as using mplayer with the command line option "-vo gl". On my TV the video tears both in XBMC and w/ 'mplayer -vo gl', and I get the following error from mplayer:
    Code:
    do_wait: drmWaitVBlank returned -1, IRQs don't seem to be working correctly. 
    Try adjusting the vblank_mode configuration parameter
    If I use 'mplayer -vo gl' on the CRT monitor the video stutters a lot and I get the (classic) error that my PC is 'too slow to play this'. If I use 'xv' as vo then there's no error and no problem. That's why I think vsync is broken in the 'radeon' driver for GL. I don't know exactly which version of the radeon driver I use, but I think it's the latest, because I use Ubuntu Lucid (10.04) and Synaptic says radeon is at version 6.13.0.
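    (For what it's worth: as far as I understand, the vblank_mode from that error is a DRI option that can also be forced per program through an environment variable. A sketch, where the movie path is just a placeholder, 0 means "never sync" and 3 means "always sync to vblank":)
    Code:
    vblank_mode=3 mplayer -vo gl /path/to/movie.avi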
    ------------------------------------------------------------

  • #2
    It's an r300 chipset so you're using Mesa's r300 driver for OpenGL rendering.



    • #3
      I.e. the card in the bug report. Damn the edit period. (Your card uses the same Mesa driver.)



      • #4
        I don't know a whole lot about the driver history of graphics cards in Linux. So it appears that I use the Mesa driver. I believe that the open source drivers in the past were called 'Mesa'. Does that project still exist? Is it something completely different from the radeon driver? I don't mean the radeonHD driver: that one's for R500 and newer Ati chips. The Ati card that I use has an RV350 chip.

        It all gets very confusing to me... Is work still being done on the GL/vsync part of the free Ati driver? Can anybody explain?



        • #5
          "radeon" and "radeonhd" (which is pretty much dead now) are DDX drivers, think of them as 2D drivers. The 3D driver is called r300.

          I guess this can be a bit confusing, but there's a bunch of FAQs and sticky threads with explanations.

          As far as vsync goes, I doubt there's anything inherently wrong with it; it seems to be working fine here, at least. The bug you linked to seems to be about a specific configuration.

          To be honest, I think you have a bunch of different bugs/problems here:

          * Vsync on a secondary monitor.
          * -vo gl being too slow on r300
          * The IRQ error (some mismatch between components?)

          Try to reproduce the problems with up-to-date components (and try the r300g Gallium3D driver) and file proper bug reports.
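          To see which 3D driver you are actually using, something like this should tell you; the renderer string mentions "Gallium" when r300g is in use:
          Code:
          glxinfo | grep -i "opengl renderer"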



          • #6
            The first thing you need to do is upgrade to a recent kernel, since the IRQs have been working for many months now.

            The driver is spread across the kernel (DRM/hardware interface), Mesa (3D and OpenGL) and the radeon DDX (2D and X integration). You're running quite an old kernel (and maybe other old components too), so start by updating to the latest kernel, Mesa and radeon DDX driver.
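            To see what you currently have, something like the following works (package names are Ubuntu's, so adjust for other distros):
            Code:
            uname -r                                                  # kernel version
            dpkg -l xserver-xorg-video-radeon libgl1-mesa-dri libdrm2 # radeon DDX, Mesa DRI drivers, libdrm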



            • #7
              Originally posted by Meneer Jansen View Post
              I don't know a whole lot about the driver history of graphics cards in Linux. So it appears that I use the Mesa driver. I believe that the open source drivers in the past were called 'Mesa'. Does that project still exist? Is it something completely different from the radeon driver? I don't mean the radeonHD driver: that one's for R500 and newer Ati chips. The Ati card that I use has an RV350 chip.

              It all gets very confusing to me... Is work still being done on the GL/vsync part of the free Ati driver? Can anybody explain?
              AMD Minister of Propaganda bridgman has written a very nice summary in this post: http://www.phoronix.com/forums/showthread.php?t=7221

              Basically, Mesa is the backend that does the OpenGL part regardless of which graphics card driver you use. There are now two types of Mesa backends that are hot and trendy, "classic" and "gallium".

              x11-drivers/xf86-video-ati is the small video (DDX) driver, and this alone is enough if you do not want 3D.



              • #8
                Another thing to keep in mind is that the GLX "vsync" extensions have nothing to do with tearing. They merely provide synchronization events which apps can synchronize with to render at a fixed frame rate. To avoid tearing (updating the visible part of the framebuffer while scanout is running) you need to wait for the vertical blanking period to copy new content to the visible area, or wait until the current vline (currently refreshing vertical line) is past the area of the screen that you want to update.

                The GLX "vsync" extensions are supported via mesa and the drm. They are implemented via vertical blank interrupts and frame counters. These provide the events apps can synchronize with.

                To avoid tearing, we wait for the current vline to get past the rendering area before writing new data to the front buffer. This is done for both GL buffer swaps and Xv (when rendering to the visible framebuffer).

                If you are using a compositing manager, Xv windows are redirected to an offscreen buffer, so they do not wait for the vline before rendering. The compositing manager copies the content to the visible buffer when it composites the front buffer. As such, it might still tear unless that copy waits for the vline as well. When the compositing manager uses GL, you get this automatically (as per the above paragraph). When it does not use GL, you can still get tearing. In the radeon driver, you can force all operations that touch the visible buffer to wait on the vline using the EXAVSync option; however, this has a noticeable impact on performance, as you spend a lot of time waiting.
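                For example, something along these lines in xorg.conf turns that option on (just a sketch; the Identifier and the rest of the Device section depend on your setup):
                Code:
                Section "Device"
                    Identifier "Radeon"
                    Driver     "radeon"
                    Option     "EXAVSync" "on"   # wait for the vline on operations that touch the visible buffer
                EndSection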

                To make this all work, you need support in all 3 pieces of the stack (ddx, mesa, drm).



                • #9
                  Wow. I'm awfully sorry, but things are sort of spinning round my head now.

                  All I want is for my computer to work w/ my good ol' Ati Radeon 9600 (RV350) and GL video. I guess I'll never understand why it doesn't work right now, so I'll try to forget about the 'understanding' part for now. What do you think: wait for the Ubuntu people to offer a 2.6.33 kernel via Synaptic/apt-get?

                  Thank you all for your kind replies.


                  P.S. Dunno if the following is of any help, but if I shut down my PC, screw out the Ati card and screw in an old (antique) Nvidia one, all GL video problems are gone. So is it really the kernel that causes the problems?



                  • #10
                    You'll need 2.6.35 and mesa and ddx from git to use GL "vsync" with kms.
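                    (The Mesa and DDX trees live on freedesktop.org; roughly, assuming the usual anonymous git URLs:)
                    Code:
                    git clone git://anongit.freedesktop.org/mesa/mesa
                    git clone git://anongit.freedesktop.org/xorg/driver/xf86-video-ati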



                    • #11
                      Originally posted by agd5f View Post
                      You'll need 2.6.35 and mesa and ddx from git to use GL "vsync" with kms.
                      Thanks for the info. And what if you do not use KMS? Because if I use KMS, then TV-out (on my PAL TV) has that very severe "combing" effect. That's completely gone when I disable KMS (in GRUB).
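                      (For reference, I disable KMS by adding "nomodeset" to the kernel command line in /etc/default/grub and regenerating the boot config, roughly like this:)
                      Code:
                      sudo nano /etc/default/grub   # add nomodeset to GRUB_CMDLINE_LINUX_DEFAULT
                      sudo update-grub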



                      • #12
                        That should be fixed in 2.6.35:
                        http://git.kernel.org/?p=linux/kerne...68cff9bfdc0b37
                        In general, please report bugs in KMS so we can fix them, as we are trying to transition away from UMS.



                        • #13
                          Finally I seem to have cracked it. But oh boy, did this take a whole lotta testing, crashes and forum hopping! Ever since I bought this graphics card (way, way back in 2003) I never got the darn thing to work properly in Linux; mostly I used it for gaming in Windows. Now I know why we Linux users have been advising people to use NVidia cards for ages.

                          Many thanks to this forum topic, which explains how to install the latest Mesa, radeon (etc.) drivers and a brand-new kernel (from git). The drivers and kernel are of versions that are officially not yet available for the distro that I use, and the distro that I use is quite new (Ubuntu 10.04 'Lucid Lynx' at the time of writing). Things I needed to get GL video working (i.e. the XBMC media center, mplayer -vo gl):
                          • Kernel 2.6.35.
                          • Now KMS works w/ TV out, so I no longer deactivate it (I removed "nomodeset" from /etc/default/grub).
                          • mesa-glx and mesa-dri 7.9.0.
                          • drm 2.4.21.
                          • Xorg 7.5
                          • And probably some more things that could be updated after adding the xorg-edgers repo/PPA from the forum topic mentioned above (roughly the commands shown just below).
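                          Roughly the commands I used to pull in those newer packages (from memory, so double-check against that forum topic):
                          Code:
                          sudo add-apt-repository ppa:xorg-edgers/ppa
                          sudo apt-get update
                          sudo apt-get dist-upgrade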
                          So after "waiting" (read: using NVidia cards) for 7 years I can finally use my Ati card for 3D desktop effects as well as full screen (and overscanned!) video on my TV. No thanks to Ati/AMD by the way. Who said open source was bad for business?

                          One final remark: somebody here said that 2.6.32 is an "old" kernel. It is, however, the latest available and supported kernel for Ubuntu 10.04. That's not old in my book. It's also a bit of a disappointment that a long-term support (LTS) distro ships drivers and a kernel that do not work very well with a brand of graphics cards that half of all PC users own. That is: it should have worked out of the box by now... But you can't always get what you want.



                          • #14
                            P.S. If the drivers, kernel, IRQs, vsync or whatever ever get broken again (by an update??), then I'm gonna kill my computer!



                            • #15
                              Nope. It seems there are some more quirks to this. Now I can't overscan anymore: the picture no longer covers the whole TV screen. That is, without KMS the output of the command "xrandr --prop" is:
                              Code:
                              S-video disconnected (normal left inverted right x axis y axis)
                              	tv_standard: pal
                              	tv_vertical_position: 0 (0x00000000)	range:  (-5,5)
                              	tv_horizontal_position: 0 (0x00000000)	range:  (-5,5)
                              	tv_horizontal_size: 0 (0x00000000)	range:  (-5,5)
                              	load_detection: 0 (0x00000000)	range:  (0,1)
                              Meaning that I can overscan (i.e. set the horizontal/vertical position and the horizontal size). However, if I boot with KMS, these parameters do not appear when I ask for the TV's properties w/ "xrandr --prop". And only with KMS does the GL output (vsync) not tear or stutter. AAaaaaaaaaaaaaaaaaaaarrrrrrrrrrrggggghhhhhhhhhhh.
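                              Without KMS I could nudge the picture with the properties listed above, along these lines (the values are just examples):
                              Code:
                              xrandr --output S-video --set tv_horizontal_position 2
                              xrandr --output S-video --set tv_horizontal_size 3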

                              How can I fix this and what's the matter w/ xrandr if KMS is activated?

