It Looks Like AMD Will Support FreeSync With Their New Linux Display Stack


  • It Looks Like AMD Will Support FreeSync With Their New Linux Display Stack

    Phoronix: It Looks Like AMD Will Support FreeSync With Their New Linux Display Stack

    While NVIDIA has long supported G-SYNC on Linux as their adaptive sync technology for eliminating screen tearing, AMD hasn't supported their FreeSync tech via their open or closed-source Linux drivers. Fortunately, it's looking like that will change...


  • #2
    Two years after Windows, AMD support for Linux sucks

    Comment


    • #3
      Good to see that they are adding support. Once they have done the cleanup to get it fully integrated into the kernel, it will also help spread FreeSync support to additional hardware, since Intel has announced that they intend to support it in the near future (maybe Kaby Lake IGPs?)

      Comment


      • #4
        Originally posted by andre30correia View Post
        Two years after Windows, AMD support for Linux sucks
        FreeSync has been supported on Windows since March 2015 (11 months ago); it became genuinely usable in late 2015 with some important fixes and LFC support.

        Comment


        • #5
          FreeSync is designed to reduce tearing.
          Does that mean that Wayland doesn't need FreeSync?

          UPD: Ah, I found this thread https://www.reddit.com/r/linux/comme...ync_be_useful/

          To sum up: it's not enough to just be tear-free; adaptive monitor synchronization does more: it synchronizes with the GPU itself, i.e. it waits for the GPU to send the next frame and doesn't do a screen update until that very moment.
          Last edited by Hi-Angel; 13 February 2016, 05:36 PM.

          Comment


          • #6
            Actually, FreeSync completely removes tearing within its frequency range (e.g. 30-144 Hz)

            Comment


            • #7
              Originally posted by Hi-Angel View Post
              Does it mean that Wayland doesn't need FreeSync?
              No, because FreeSync and G-Sync address *physical* tearing while Wayland addresses *logical* tearing. Physical tearing occurs when V-sync is turned off and the monitor refreshes partway through an update from the GPU, which is below Wayland's level. Wayland is focused on making sure that the final product of the graphics generation & composition process that gets placed into the GPU's frame buffer for output is "perfect", inasmuch as the graphics the GPU itself is rendering don't include tears [assuming the GPU/display doesn't introduce tears due to timing issues].

              Of course, you might have noticed that I mentioned old-fashioned V-sync above. Well, traditional V-sync has its own problems: it eliminates tearing but replaces it with frame stutter when the physical refresh rate of the monitor doesn't line up with the rate at which the GPU is actually producing new frames. G-Sync & FreeSync fix both the tearing problem and the stutter problem, which goes beyond the capabilities of traditional V-sync.
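
              The stutter described above can be illustrated with a toy simulation (the 100 Hz display and 16 ms frame times below are made-up numbers chosen for clean arithmetic, not tied to any real driver API):

              ```python
              import math

              REFRESH_MS = 10   # toy fixed-refresh display: 100 Hz
              FRAME_MS = 16     # GPU finishes a new frame every 16 ms (62.5 FPS)

              def flip_times_vsync(n_frames):
                  """Classic V-sync: a finished frame can only flip on the next
                  fixed refresh tick, so presentation snaps to a 10 ms grid."""
                  return [math.ceil(i * FRAME_MS / REFRESH_MS) * REFRESH_MS
                          for i in range(1, n_frames + 1)]

              def flip_times_adaptive(n_frames):
                  """Adaptive sync: the display waits for the GPU, so each frame
                  flips exactly when it is ready."""
                  return [i * FRAME_MS for i in range(1, n_frames + 1)]

              vsync = flip_times_vsync(6)        # [20, 40, 50, 70, 80, 100]
              adaptive = flip_times_adaptive(6)  # [16, 32, 48, 64, 80, 96]

              # On-screen duration of each frame: uneven with V-sync, steady with
              # adaptive sync -- that unevenness is the stutter.
              print([b - a for a, b in zip(vsync, vsync[1:])])        # [20, 10, 20, 10, 20]
              print([b - a for a, b in zip(adaptive, adaptive[1:])])  # [16, 16, 16, 16, 16]
              ```

              With V-sync the frames alternate between 10 ms and 20 ms on screen even though the GPU delivers them at a perfectly steady 16 ms pace; with adaptive sync every frame is displayed for exactly as long as it took to render.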
              Last edited by chuckula; 13 February 2016, 10:17 PM.

              Comment


              • #8
                Originally posted by Hi-Angel View Post
                Does it mean that Wayland doesn't need FreeSync?
                No, FreeSync is about syncing the display with the GPU. It has nothing to do with Wayland.

                Comment


                • #9
                  Originally posted by chuckula View Post
                  Once they have done the cleanup to get it fully integrated into the kernel, it will also help spread FreeSync support to additional hardware, since Intel has announced that they intend to support it in the near future (maybe Kaby Lake IGPs?)
                  Would love to see that, too. An Intel IGP with adaptive sync plus an AMD dGPU via PRIME would be a dream configuration.

                  Comment


                  • #10
                    Originally posted by chuckula View Post
                    No, because FreeSync and G-Sync address *physical* tearing while Wayland addresses *logical* tearing. Physical tearing occurs when V-sync is turned off and the monitor refreshes partway through an update from the GPU, which is below Wayland's level. Wayland is focused on making sure that the final product of the graphics generation & composition process that gets placed into the GPU's frame buffer for output is "perfect", inasmuch as the graphics the GPU itself is rendering don't include tears [assuming the GPU/display doesn't introduce tears due to timing issues].

                    Of course, you might have noticed that I mentioned old-fashioned V-sync above. Well, traditional V-sync has its own problems: it eliminates tearing but replaces it with frame stutter when the physical refresh rate of the monitor doesn't line up with the rate at which the GPU is actually producing new frames. G-Sync & FreeSync fix both the tearing problem and the stutter problem, which goes beyond the capabilities of traditional V-sync.
                    Just a note about an aspect that people might notice when FreeSync and G-Sync become more common:
                    If the frame rate in a fullscreen game suddenly drops from 100 FPS to 20 FPS because the player enters a game area with high complexity, the player will notice this drop. The same is most likely true if the frame rate oscillates (e.g. 20-30-20-30), ramps up (20-30-40-50), or ramps down (60-50-40-30). FreeSync and G-Sync are definitely better than a fixed display refresh rate if the CPU+GPU cannot always render the scene in under 16.6 milliseconds (60 Hz), but they cannot eliminate frame stutter completely. The laws of our universe (physics) do not allow frame stutter to be completely eliminated when the CPU+GPU are not fast enough.

                    A comparison of a fixed refresh rate (classic V-sync) with a variable refresh rate (FreeSync, G-Sync):
                    With FreeSync and G-Sync the following rule can be executed: limit the framerate to a maximum of 50 Hz, and if the 20 millisecond deadline cannot be met then display the frame as soon as the CPU+GPU complete the frame's rendering.
                    Without FreeSync and G-Sync the above rule cannot be executed. Instead, the following rule can be executed: limit the framerate to a maximum of 50 Hz; if the 20 millisecond deadline cannot be met then display the frame after 2*20=40 ms; if the 40 ms deadline cannot be met then display the frame after 3*20=60 ms; if the 60 ms deadline cannot be met then display the frame after 4*20=80 ms, etc.
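
                    The two rules above can be sketched as a pair of small functions (a toy model of the 50 Hz cap from the example, not any real driver behavior):

                    ```python
                    import math

                    REFRESH_INTERVAL_MS = 20  # the 50 Hz cap from the example above

                    def display_time_adaptive(render_ms):
                        """FreeSync/G-Sync rule: show the frame as soon as rendering
                        finishes, but never sooner than the 50 Hz cap allows."""
                        return max(render_ms, REFRESH_INTERVAL_MS)

                    def display_time_fixed(render_ms):
                        """Fixed-refresh rule: a late frame has to wait for the next
                        20 ms tick (40 ms, 60 ms, 80 ms, ...)."""
                        return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

                    # A frame that takes 23 ms to render is shown at 23 ms with
                    # adaptive sync, but not until the 40 ms tick without it.
                    for render_ms in (12, 23, 41, 65):
                        print(render_ms, display_time_adaptive(render_ms),
                              display_time_fixed(render_ms))
                    ```

                    The difference between the two columns is exactly the extra stutter a fixed-refresh display adds whenever a frame misses its deadline by even a millisecond.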

                    Comment
