Hello,
First of all, I'm very happy to join this forum, and I thank all of you: over the last year of game development, the rich and professional content here on technical issues has helped me a lot.
Recently, as I said, I've been playing around with the open source radeon driver. I noticed that, in conjunction with SDL 1.2, surface blitting performance was very poor compared to the proprietary AMD fglrx driver. At first I assumed that, given the relative immaturity of the open source radeon driver, some 2D optimizations were simply still missing. But recently I've been studying a lot and writing some benchmarks, and I noticed that even if I disable SwapbuffersWait and EXAVSync in xorg.conf, and/or vblank_mode in /etc/drirc, and/or run "vblank_mode=0 <command>" from the console, the benchmarks' FPS always matches the vsync rate I was using. I've also tried disabling AIGLX and using the swrast driver with DRI.

The frustrating thing is that even though the FPS matches vsync exactly, visually and in practice I still see the annoying tearing effect. I've tried different monitors and different vsync rates, and the result was the same: FPS matching vsync, with tearing. The hardware tested was r300- and r600-based cards. 3D benchmarks do not seem to be affected.

I suspect EXA acceleration is the problem, but I tried the old XAA accel method with no success: it seems that with radeon 7.0 XAA is broken, and EXA stays enabled by default even with Option "AccelMethod" "XAA" in xorg.conf.
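For reference, this is roughly how I set each of those options (a sketch of my configuration; the identifier name and benchmark command are placeholders, and paths may differ on other distributions):

```
# /etc/X11/xorg.conf -- Device section options I tried:
Section "Device"
    Identifier "Card0"              # placeholder identifier
    Driver     "radeon"
    Option     "SwapbuffersWait" "off"   # don't wait for vblank on swap
    Option     "EXAVSync"        "off"   # disable EXA vsync
EndSection

# /etc/drirc (or ~/.drirc) -- the drirc option I changed:
#   <option name="vblank_mode" value="0"/>

# Per-run override from the console:
#   vblank_mode=0 ./sdl_benchmark       # benchmark name is a placeholder
```

None of these combinations changed the measured FPS, which stayed locked to the vsync rate.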
I'm using xorg-server 1.13.1 with gentoo linux (kernel v3.6.11 x86_64) radeon driver v 7.0.0.
Does anyone have a suggestion?
Thanks in advance, Massimo.
EDIT: I'm running with KMS enabled, because radeon driver support for xorg 1.13.1 without KMS was broken.