
It Looks Like AMD Will Support FreeSync With Their New Linux Display Stack


  • #21
    Originally posted by duby229

    The trekkies already got me with this one... It's not Vulkan, it's Vulcan!

    Just kidding. Don't let nvidia fanboys get under your skin; most of them lack basic common sense.
    A Vulcan-like person most likely goes through steps such as:
    1. https://en.wikipedia.org/wiki/Hilbert%27s_program
    2. https://en.wikipedia.org/wiki/G%C3%B...eness_theorems
    3. https://en.wikipedia.org/wiki/Halting_problem
    4. https://en.wikipedia.org/wiki/Computer_programming
    5. https://en.wikipedia.org/wiki/Problem_solving
    Last edited by Guest; 14 February 2016, 05:51 AM. Reason: Replace Vulkan with Vulcan



    • #22
      FreeSync sounds really great; I didn't know how interesting/important it was before this code drop and the exchanges in the forums. My biggest takeaway, however, is that the AMD guys seem to understand what they are doing and are ready to do whatever it takes to get this merged. I approve of that attitude. Hopefully together they will get this merged as soon as humanly possible without damaging the integrity of the code base.



      • #23
        Originally posted by haagch
        Intel doesn't care about PRIME. Try playing a game with Intel DRI3 using PRIME. It's completely unplayable. Intel knows that cross-device synchronization is missing in their driver; they just don't do anything about it.
        Thanks for the information. What about DRI2?



        • #24
          Originally posted by chuckula

          No, because FreeSync and G-sync address *physical* tearing while Wayland addresses *logical* tearing. Physical tearing occurs when V-sync is turned off and the monitor refreshes partway through an update from the GPU, which is below Wayland's level. Wayland is focused on making sure that the final product of the graphics generation & composition process that gets placed into the GPU's frame buffer for output is "perfect", inasmuch as the graphics the GPU itself is rendering don't include tears [assuming the GPU/display doesn't introduce tears due to timing issues].

          Of course, you might have noticed that I mentioned old-fashioned V-sync above. Traditional V-sync has its own problems: it eliminates tearing but replaces it with frame stutter whenever the monitor's physical refresh rate doesn't line up with the rate at which the GPU is actually producing new frames. G-sync & FreeSync fix both the tearing problem and the stutter problem, which goes beyond the capabilities of traditional V-sync.
          With FreeSync, I thought the refresh rate stayed static at 40 Hz for all frame rates below 40; anything above that is matched Hz-to-frame up to the monitor's max Hz. If a monitor's max is 144 Hz and the frame rate goes higher than that, v-sync kicks in. If you have v-sync on and the frame rate drops below 40, v-sync kicks in too. So you still have the potential for tearing or stutter.

          I thought G-sync handled it differently through its module: instead of toggling v-sync on/off, it draws extra frames and increases the refresh rate when the frame rate drops below 40, and matches like FreeSync above 40, which is claimed to eliminate tearing/stutter completely, unlike FreeSync.
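
          Roughly the behaviour I mean, as a toy sketch (the 40-144 Hz range, the v-sync fallback, and all names here are made up for illustration; this is not any real driver API):

          # Hypothetical model of the adaptive-sync behaviour described above.
          VRR_MIN_HZ = 40    # assumed lower bound of the panel's FreeSync range
          VRR_MAX_HZ = 144   # assumed upper bound

          def display_behaviour(fps: float, vsync_on: bool) -> str:
              """What the panel ends up doing for a given frame rate."""
              if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
                  # In range: the panel refreshes exactly when a frame arrives.
                  return f"adaptive: refresh matches {fps:g} fps"
              if vsync_on:
                  # Out of range: classic v-sync pins us to the nearest bound.
                  bound = VRR_MAX_HZ if fps > VRR_MAX_HZ else VRR_MIN_HZ
                  return f"v-sync at {bound} Hz (stutter possible)"
              return "v-sync off (tearing possible)"

          for fps in (30, 75, 160):
              print(fps, "fps ->", display_behaviour(fps, vsync_on=True))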



          • #25
            Thank you AMD.
            I really hope the code makes it into the kernel.



            • #26
              Originally posted by DDF420

              With FreeSync, I thought the refresh rate stayed static at 40 Hz for all frame rates below 40; anything above that is matched Hz-to-frame up to the monitor's max Hz. If a monitor's max is 144 Hz and the frame rate goes higher than that, v-sync kicks in. If you have v-sync on and the frame rate drops below 40, v-sync kicks in too. So you still have the potential for tearing or stutter.

              I thought G-sync handled it differently through its module: instead of toggling v-sync on/off, it draws extra frames and increases the refresh rate when the frame rate drops below 40, and matches like FreeSync above 40, which is claimed to eliminate tearing/stutter completely, unlike FreeSync.

              Since November, AMD has implemented frame doubling, so if the upper bound of your FreeSync range is at least 2.5 times the lower bound, the lower bound doesn't matter. Same with NVidia, though I don't know whether they double their frames in software or hardware. Once you are above the monitor's max refresh rate you either have v-sync off, which results in tearing (and somewhat lower latency/input lag), or you have it on, which results in a tear-free display (with an input latency of at most 16 ms at 60 Hz). So if you stay above the monitor's max refresh rate, both adaptive sync technologies don't differ from non-adaptive-sync approaches.
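
              To make the frame doubling point concrete, here is a toy sketch (the 48-120 Hz range and the function are invented for illustration; real drivers pick the multiplier internally):

              # Low-framerate compensation: repeat each frame until the panel
              # refresh lands back inside the variable refresh range.
              VRR_MIN_HZ = 48
              VRR_MAX_HZ = 120   # 120 / 48 = 2.5, the ratio mentioned above

              def panel_refresh_hz(fps: float) -> float:
                  if fps > VRR_MAX_HZ:
                      return VRR_MAX_HZ  # above the range, v-sync on/off takes over
                  multiplier = 1
                  while fps * multiplier < VRR_MIN_HZ:
                      multiplier += 1    # show the same frame one more time
                  return fps * multiplier

              for fps in (25, 30, 47, 60):
                  print(f"{fps} fps -> panel refreshes at {panel_refresh_hz(fps):g} Hz")

              With a max/min ratio of at least 2.5, the doubled (or tripled) refresh always lands inside the range, which is why the lower bound stops mattering.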



              • #27
                Originally posted by Namenlos

                Since November, AMD has implemented frame doubling, so if the upper bound of your FreeSync range is at least 2.5 times the lower bound, the lower bound doesn't matter. Same with NVidia, though I don't know whether they double their frames in software or hardware. Once you are above the monitor's max refresh rate you either have v-sync off, which results in tearing (and somewhat lower latency/input lag), or you have it on, which results in a tear-free display (with an input latency of at most 16 ms at 60 Hz). So if you stay above the monitor's max refresh rate, both adaptive sync technologies don't differ from non-adaptive-sync approaches.
                Thanks for the explanation. That's probably the best summary I've read. If I understand you right, what you're saying is that it's not the frame time people notice, it's the differences between back-to-back frame times. In other words, you won't notice a frame at 16 ms latency, but if it's followed by a frame at 8 ms latency and then a third frame at 16 ms latency, that difference will be noticeable?
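
                As a toy illustration of that reading (pure arithmetic, not a measurement of any real pipeline):

                def frame_deltas(frame_times_ms):
                    # Differences between consecutive frame times: the "stutter" signal.
                    return [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

                print(frame_deltas([16, 16, 16, 16]))  # [0, 0, 0] -> feels smooth
                print(frame_deltas([16, 8, 16, 8]))    # [8, 8, 8] -> feels like stutter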



                • #28
                  The problem is that without adaptive sync you have a fixed panel refresh rate, e.g. every 16.7 ms at 60 Hz. Now if you are below 60 fps you have to wait a whole frame, so for a short time you effectively run at 30 Hz/30 fps. You notice that, because without motion blur like in movies, 30 fps is not quite enough. Adaptive sync technologies fix that in a completely sane way, while 120 Hz only masks some of the problems. I don't think input lag is a real problem at 60 fps and above, at least for me. But of course more is better.
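
                  The arithmetic, as a quick sketch (illustrative numbers only):

                  import math

                  REFRESH_HZ = 60
                  SLOT_MS = 1000 / REFRESH_HZ   # 16.7 ms per scanout at 60 Hz

                  def effective_fps_with_vsync(render_ms: float) -> float:
                      # A frame that misses a refresh slot waits for the next one,
                      # so rates quantize to 60, 30, 20, ... fps.
                      slots = math.ceil(render_ms / SLOT_MS)
                      return REFRESH_HZ / slots

                  for render_ms in (10, 17, 25, 34):
                      print(f"{render_ms} ms/frame -> {effective_fps_with_vsync(render_ms):.1f} fps"
                            f" fixed, {1000 / render_ms:.1f} fps with adaptive sync")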

                  edit: You can run FreeSync (and probably G-sync too) with or without v-sync.
                  Last edited by Namenlos; 14 February 2016, 01:01 PM.



                  • #29
                    Originally posted by Zan Lynx
                    If the price tag was $100 more than a similar monitor, then you probably have it. :-P
                    I know this was a joke, but actually some bargain monitors like the Wasabi Mango UHD420 (one of the cheapest 42" UHD monitors on the market) are FreeSync capable with recent firmware.

                    Review at PCPer



                    • #30
                      Originally posted by chithanh
                      I know this was a joke, but actually some bargain monitors like the Wasabi Mango UHD420 (one of the cheapest 42" UHD monitors on the market) are FreeSync capable with recent firmware.

                      Review at PCPer
                      I just read a comment claiming that the FreeSync firmware has been taken down. Maybe they're going to tack $100 onto the price and sell it as a different model.

