AMDGPU & Radeon DDX Updated - Better 2D Performance, Tear Free, DRI3 Default


  • #21
    Originally posted by droste View Post

    You can have tearing with 60fps on a 60Hz monitor!

    Tearing happens when the monitor and the GPU are not synced, so the GPU renders a new image while the old image is not completely displayed; that's independent of Hz or framerate.
    So the only solution is to synchronize the GPU with the display, and that will always cost performance. With single-monitor setups vsync is enough to remove tearing; this option is for multi-monitor setups, where you simply can't sync the GPU with each monitor because they all read the image from the GPU at a different time.
    You're right, of course, but I wasn't trying to explain v-sync, only why the so-called "performance impact" should be disregarded. It's simply not a noticeable difference.
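droste's point above, that tearing is independent of framerate, can be shown with a toy model of a single scanout buffer (an illustrative sketch only, not driver code): the display reads rows top to bottom, and an unsynchronized buffer swap partway through the scan leaves old content above the tear line and new content below it, however fast or slow the renderer is.

```python
def scan(height, swap_row=None):
    """One scanout pass over a single framebuffer.

    The display reads rows top to bottom.  If swap_row is given, the
    renderer replaced the buffer contents mid-scan (no synchronization);
    if swap_row is None, the swap waited for vblank (vsync).
    """
    image = []
    for row in range(height):
        if swap_row is not None and row >= swap_row:
            image.append("new")   # rows scanned after the swap
        else:
            image.append("old")   # rows scanned before the swap
    return image

def is_torn(image):
    """A frame is torn if it mixes content from two different renders."""
    return len(set(image)) > 1

print(is_torn(scan(10, swap_row=4)))  # unsynchronized swap -> True (torn)
print(is_torn(scan(10)))              # swap held until vblank -> False
```

Nothing in the model cares whether the renderer finished early or late; only whether the swap happened during the scan.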

    Comment


    • #22
      Originally posted by cen1 View Post
      I never said I don't like it. I love it! I just don't love the tearing lol.
      use Compton compositor.

      Comment


      • #23
        Originally posted by starshipeleven View Post
        use Compton compositor.
        Compton has some extreme lag. Moving the mouse cursor and/or windows feels extremely sluggish.

        Comment


        • #24
          DRI3 support is enabled by default on X.Org Server 1.18.3 and newer.
          Cool, let me disable it now
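For anyone who actually does want to switch back (sarcasm noted): the radeon/amdgpu DDX manpages document an Option "DRI" that caps the maximum DRI level. A minimal xorg.conf sketch, assuming the radeon driver (the Identifier is an arbitrary name):

```
Section "Device"
    Identifier "Card0"          # arbitrary name
    Driver     "radeon"         # or "amdgpu"
    Option     "DRI" "2"        # cap at DRI2, i.e. disable DRI3
EndSection
```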

          Comment


          • #25
            Originally posted by bug77 View Post

            Technical implementation doesn't matter. Tearing is caused by your video card rendering more frames than your monitor can show.
            Tearing is when your monitor can display 60 frames each second and your card can render 90. If I fix tearing and make the card render only 60 fps, are you actually losing performance?
            Are you sure? My understanding is that tearing is, somewhat, orthogonal to paint times.
            Thought experiment (and this isn't using Wayland, so the buffers aren't controlled by the app): an app has exclusive access to the scanout buffer; it draws SLOWER than whatever the current modeset time is; it is in the middle of drawing over the previous frame when it is stopped by the crtc; the result should be new data on top and old on bottom (assume), which is ALSO what you get when the drawing application is too fast.
            Of course this can all be avoided, regardless of the application (or gpu) being too fast or slow, by app-level, system-level and driver-level synchronization and having multiple scanout buffers. How that's all accomplished is irrelevant, but they should all provide a "Wayland-like" experience with the downside, as mentioned, of typically adding a vsync or two worth of latency, unless you can modeset the monitor (and crtc) fast enough to only scan out when you've just finished with the single scanout buffer (and one is all you'd ever need).

            Comment


            • #26
              Originally posted by Chewi View Post
              I recall similar tear-free options that have a negative performance impact. Is that the case here?
              There is an impact, but it's not too bad, on the order of single-digit percent per CRTC actively receiving updates.

              You were probably thinking of the "EXAVSync" option. TearFree (which BTW works with EXA as well) works completely differently from that, it should generally have smaller performance impact than EXAVSync, while eliminating tearing more reliably. The only downside should be the additional VRAM usage for the dedicated scanout buffers.

              Is there a particular reason why it's not enabled by default?
              The biggest reason is that enabling TearFree currently disables DRI page flipping, which has a negative performance impact for fullscreen OpenGL / VDPAU / ... apps (which includes compositors). Once that is fixed, maybe we can try enabling it by default, at least on newer cards with enough VRAM that the dedicated scanout buffers don't matter much.

              BTW, apparently my wording in the release announcements was confusing: TearFree has been available for over a year, since the 7.6.0 / 1.0.0 releases. What's new in these releases is that it can now eliminate tearing in all possible display configurations.
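For reference, turning TearFree on today is a one-line device option; a minimal xorg.conf sketch (Identifier is an arbitrary name, driver per your hardware):

```
Section "Device"
    Identifier "Card0"            # arbitrary name
    Driver     "radeon"           # or "amdgpu"
    Option     "TearFree" "on"    # force tear-free presentation
EndSection
```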

              Comment


              • #27
                Originally posted by pq1930562 View Post

                The manpage of xf86-video-intel lists an option for triple buffering:

                Code:
                Option "TripleBuffer" "boolean"
                [...]
                But the manpage of xf86-video-ati and xf86-video-amdgpu does not.

                bridgman agd5f why not?
                This option only applies to DRI2. DRI3 always allows triple buffering, because the buffers are managed on the client side. Triple buffering with DRI2 is very tricky, so we decided not to put in the effort for that in favour of DRI3.
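The client-side buffering that makes this easy under DRI3 can be sketched as a toy swapchain (an illustrative model with made-up names; real DRI3 clients manage pixmaps through the X server): with three buffers the client never blocks waiting for the display, while with two it stalls whenever the display lags behind.

```python
from collections import deque

class Swapchain:
    """Toy model of client-side buffering (DRI3-style): the client owns
    N buffers and keeps rendering as long as one is free, instead of
    blocking on the display."""
    def __init__(self, n_buffers):
        self.free = deque(range(n_buffers))   # buffers the client may draw into
        self.queued = deque()                 # finished frames awaiting display

    def acquire(self):
        """Get a buffer to render into, or None if the client must stall."""
        return self.free.popleft() if self.free else None

    def present(self, buf):
        """Hand a finished frame to the display queue."""
        self.queued.append(buf)

    def vblank(self):
        """Display keeps only the newest queued frame; older ones are
        recycled (their contents were never shown)."""
        while len(self.queued) > 1:
            self.free.append(self.queued.popleft())

def frames_rendered(n_buffers, steps):
    """Render every step; the display refreshes only every other step."""
    sc = Swapchain(n_buffers)
    rendered = 0
    for step in range(steps):
        buf = sc.acquire()
        if buf is not None:       # with too few buffers, this stalls
            sc.present(buf)
            rendered += 1
        if step % 2 == 0:
            sc.vblank()
    return rendered

print(frames_rendered(2, 10))  # double buffering stalls -> 6
print(frames_rendered(3, 10))  # triple buffering never blocks -> 10
```

The point of the model: triple buffering here is purely a property of how many buffers the client holds, not something the driver has to negotiate, which is why it comes for free with DRI3.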

                Comment


                • #28
                  Originally posted by liam View Post

                  Are you sure? My understanding is that tearing is, somewhat, orthogonal to paint times.
                  Thought experiment (and this isn't using Wayland, so the buffers aren't controlled by the app): an app has exclusive access to the scanout buffer; it draws SLOWER than whatever the current modeset time is; it is in the middle of drawing over the previous frame when it is stopped by the crtc; the result should be new data on top and old on bottom (assume), which is ALSO what you get when the drawing application is too fast.
                  Of course this can all be avoided, regardless of the application (or gpu) being too fast or slow, by app-level, system-level and driver-level synchronization and having multiple scanout buffers. How that's all accomplished is irrelevant, but they should all provide a "Wayland-like" experience with the downside, as mentioned, of typically adding a vsync or two worth of latency, unless you can modeset the monitor (and crtc) fast enough to only scan out when you've just finished with the single scanout buffer (and one is all you'd ever need).
                  Again, technically you're right, but this is where sync modes come in. If the monitor and the video card can agree on timings, the monitor can be served tear-free images.

                  Comment


                  • #29
                    Originally posted by M@GOid View Post

                    There is an option to enable compositing in the settings of XFCE. Enable it and the tearing goes away.
                    Or turn default compositor off and install compton.
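If the compton lag mentioned earlier is the sticking point, the backend and vsync method matter a lot. A commonly suggested starting point for ~/.config/compton.conf (option names as documented by compton; results vary by driver):

```
backend = "glx";          # the GLX backend usually syncs better than xrender
vsync   = "opengl-swc";   # one of compton's several vsync methods
paint-on-overlay = true;  # paint to the X composite overlay window
```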

                    Comment


                    • #30
                      Originally posted by pal666 View Post
                      actually there is another way: slow down the monitor, which is what AMD FreeSync does
                      There is also another way: switch to Windows, where you have better vendor drivers for your hardware, or use an earlier version of your distro with better driver support from the hardware vendor. I can't find any other solution to this; the screen tearing problem on newer distros or with Xfce is on the driver or compositor side. With Mint 17 or Windows I don't have any problems.

                      Comment
