
SDL blit vsync


  • SDL blit vsync

    Hello,

    First of all, I'm very happy to join this forum, and I want to thank all of you: during my last year of game development, the rich and professional content here about technical issues helped me a lot.

    Recently, as I said, I've been playing around with the radeon open source driver. I noticed that in conjunction with SDL 1.2, surface blitting performance was very poor compared to the proprietary AMD fglrx driver. At first I thought that, because of the immaturity of the radeon open source driver, some 2D optimizations were still missing.

    But recently I've been studying a lot and writing some benchmarks. I noticed that even if I disable SwapbuffersWait and EXAVSync in xorg.conf, and/or vblank_mode in /etc/drirc, and/or run vblank_mode=0 <command> from the console, the benchmark FPS always matches the vsync rate I was using. I've also tried disabling AIGLX and using the swrast driver with DRI.

    Meanwhile, the frustrating thing is that even though the FPS matches vsync exactly, visually and in practice I notice the annoying tearing effect. I've tried different monitors and different vsync rates, and the result was the same: FPS matching vsync, with tearing. The hardware tested was r300 and r600 based cards. 3D benchmarks seem not to be affected.

    I'm thinking EXA acceleration is the problem, but I tried the old XAA accel method with no success: it seems that with radeon 7.0 XAA is broken and EXA stays enabled by default even with Option "AccelMethod" "XAA" in xorg.conf.
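    For concreteness, here is a minimal sketch of the kind of benchmark loop I mean (the 800x600 surface and the 5-second run are arbitrary choices for illustration):

    Code:
    /* Minimal SDL 1.2 blit benchmark sketch: if the driver forces vsync,
     * the printed FPS sits at the refresh rate no matter what vblank_mode
     * is set to. */
    #include <SDL/SDL.h>
    #include <stdio.h>

    int main(void)
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            return 1;

        SDL_Surface *screen = SDL_SetVideoMode(800, 600, 32,
                                               SDL_HWSURFACE | SDL_DOUBLEBUF);
        if (!screen) {
            SDL_Quit();
            return 1;
        }

        /* Off-screen source surface to blit every frame. */
        SDL_Surface *src = SDL_CreateRGBSurface(SDL_HWSURFACE, 800, 600, 32,
                                                0, 0, 0, 0);
        SDL_FillRect(src, NULL, SDL_MapRGB(screen->format, 64, 128, 255));

        Uint32 start = SDL_GetTicks();
        int frames = 0;
        while (SDL_GetTicks() - start < 5000) {   /* run for ~5 seconds */
            SDL_BlitSurface(src, NULL, screen, NULL);
            SDL_Flip(screen);   /* any vsync throttling shows up here */
            frames++;
        }
        printf("%.1f FPS\n", frames * 1000.0 / (SDL_GetTicks() - start));

        SDL_FreeSurface(src);
        SDL_Quit();
        return 0;
    }

    Compiled with gcc bench.c -o bench $(sdl-config --cflags --libs) and run as vblank_mode=0 ./bench, this is the kind of test where the FPS still comes out pinned to the refresh rate.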

    I'm using xorg-server 1.13.1 on Gentoo Linux (kernel 3.6.11, x86_64) with radeon driver 7.0.0.
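    For reference, this is the relevant part of my xorg.conf (option names as documented in the radeon man page; the identifier is arbitrary):

    Code:
    Section "Device"
        Identifier "radeon"
        Driver     "radeon"
        Option     "SwapbuffersWait" "off"   # don't wait for the scanline on buffer swaps
        Option     "EXAVSync"        "off"   # don't stall EXA operations for vblank
        Option     "AccelMethod"     "XAA"   # apparently ignored in 7.0: EXA stays on
    EndSection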

    Does anyone have a suggestion?

    Thanks in advance, Massimo.

    EDIT: I have KMS enabled because the radeon driver's non-KMS support with xorg 1.13.1 was broken.
    Last edited by Dhalsim; 10 February 2013, 12:38 PM.

  • #2
    IIRC the fglrx driver uses triple-buffering by default to get a tear-free display when drawing via the X APIs; radeon does not.

    I believe the rationale there is that radeon is normally used with a compositor, while fglrx is more likely to be used for workstation systems where compositing is generally not used.
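    Schematically (a toy model only, not actual fglrx code), the point of the third buffer is that the renderer always has a free buffer to draw into, so it never has to block on vblank, while the display engine only ever flips to completely finished buffers:

    Code:
    /* Toy model of triple buffering among buffers 0..2 (illustration only).
     * At any time one buffer is being scanned out, one holds the most
     * recently finished frame, and the renderer draws into the third. */
    #include <stdio.h>

    int main(void)
    {
        int scanout = 0;   /* buffer currently on screen */
        int drawing = 1;   /* buffer the application renders into */

        for (int frame = 0; frame < 6; frame++) {
            printf("frame %d: render into buffer %d\n", frame, drawing);
            int queued = drawing;             /* finished frame, queue the flip */
            drawing = 3 - scanout - queued;   /* indices sum to 3: pick the free one */
            scanout = queued;                 /* flip happens at the next vblank */
            printf("frame %d: vblank flips scanout to buffer %d\n", frame, scanout);
        }
        return 0;
    }

    With only two buffers the application either waits for the flip (FPS capped at the refresh rate) or draws into the buffer being scanned out (tearing); the third buffer removes that trade-off.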

    I don't think UMS is "broken" as much as "deprecated and removed", btw. I don't think UMS support has even been implemented for the last few generations of hardware.


    • #3
      Originally posted by bridgman
      IIRC the fglrx driver uses triple-buffering by default to get a tear-free display when drawing via the X APIs; radeon does not.

      I believe the rationale there is that radeon is normally used with a compositor, while fglrx is more likely to be used for workstation systems where compositing is generally not used.

      I don't think UMS is "broken" as much as "deprecated and removed", btw. I don't think UMS support has even been implemented for the last few generations of hardware.
      Hi bridgman

      With xorg-server 1.13.1 and radeon driver 7.0, UMS is supported but "broken": when using this configuration, the radeon driver simply reports "no compatible devices found" and kills X.

      BTW, the problem is still there, and it is not the tearing effect (have you ever seen any X server without tearing? I still haven't) but the slow FPS rate.



      • #4
        If I am reading the commit logs correctly, they suggest that UMS support was dropped in June 2012 and the 7.0 release came out in Nov 2012.

        EDIT -- commit message for the 7.0 release is "7.0.0, RIP UMS"...
        Last edited by bridgman; 10 February 2013, 01:29 PM.