
AMD Posts Latest Open-Source Linux Patches For FreeSync / Adaptive-Sync / VRR


  • #21
    After reading the patchset descriptions (and the reference links), I have quite a few questions:

    1. It seems that originally VRR was enabled just for an application that requested it (opt-in), while now it's enabled for everything by default, with a considerable blacklist (opt-out). Is that correct?
    2. Does this mean that any application that can go fullscreen (like Rhythmbox in party mode, or basically any picture viewer you can think of) will get VRR and that might induce luminosity flickering on certain panels for content that doesn't redraw often (i.e. most desktop apps)? I'm afraid that maintaining a blacklist of all such apps (not to mention you can put any app to fullscreen (and I don't mean maximized) on X11, if you so desire) will be a major PITA.
    3. Why are movie players included in the blacklist? One of the reference links specifically speaks about movie players being a great use case for VRR, in order to provide smooth 24 Hz playback.
    4. As a user, can I somehow figure out which apps currently have VRR active and which don't? And configure that easily (e.g. with an envvar)?

    Thanks.


    • #22
      Originally posted by kparal View Post
      VRR and that might induce luminosity flickering on certain panels for content that doesn't redraw often (i.e. most desktop apps)?
The monitor's EDID contains the range the panel is capable of refreshing in. Within this range there shouldn't be any flickering, and the driver shouldn't go below that range, i.e., request the panel to refresh at 20 Hz when it has a minimum rate of 30 Hz.
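The clamping idea can be sketched roughly like this. This is a hypothetical Python sketch of range clamping plus low-framerate compensation (frame multiplication below the panel minimum), not the actual amdgpu driver logic:

```python
def effective_refresh_hz(content_fps, panel_min_hz, panel_max_hz):
    """Pick a scanout rate within the panel's VRR range for a given content rate.

    Hypothetical sketch: panel_min_hz/panel_max_hz stand in for the range
    read from the EDID; the real driver logic is more involved.
    """
    if content_fps >= panel_max_hz:
        return panel_max_hz          # cap at the panel's maximum
    if content_fps >= panel_min_hz:
        return content_fps           # within range: match the content directly
    # Below the minimum: repeat each frame n times so the scanout rate
    # lands back inside the supported range (low framerate compensation).
    n = 2
    while content_fps * n < panel_min_hz:
        n += 1
    return min(content_fps * n, panel_max_hz)
```

For example, 24 fps content on a 30-144 Hz panel would be frame-doubled to a 48 Hz scanout rather than driven below the panel's minimum.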

Why are compositors blacklisted anyway? It seems weird to me, given that adaptive sync originated from eDP approaches to save power by dynamically lowering refresh rates or eliminating unnecessary refreshes with PSR etc. Couldn't this help AMD's APUs in mobile devices save power? Where's the problem in refreshing a static desktop, web page etc. at the actual minimal refresh rate?
      Last edited by juno; 12 October 2018, 09:20 AM.


      • #23
        Originally posted by juno View Post

The monitor's EDID contains the range the panel is capable of refreshing in. Within this range there shouldn't be any flickering, and the driver shouldn't go below that range, i.e., request the panel to refresh at 20 Hz when it has a minimum rate of 30 Hz.

Why are compositors blacklisted anyway? It seems weird to me, given that adaptive sync originated from eDP approaches to save power by dynamically lowering refresh rates or eliminating unnecessary refreshes with PSR etc. Couldn't this help AMD's APUs in mobile devices save power? Where's the problem in refreshing a static desktop, web page etc. at the actual minimal refresh rate?
I've seen the Windows desktop with VRR enabled, and there are quite a few noticeable changes in brightness as the FPS varies.
Certain desktop apps might even misbehave if VRR is enabled.


        • #24
          Originally posted by juno View Post
          Why are compositors blacklisted anyways?
There are a few topics discussing this on the Nvidia forum from back when gsync was implemented on Linux. There are a couple of limitations that have to do with X. The biggest problem that I remember had to do with the way X handles multiple monitors, which made it impossible to do. There were some more things about detecting when gsync could actually be active. This effectively means that it only works on fullscreen stuff. You have to realise that these things are hooked into the OpenGL/Vulkan code to determine the moment when a frame is displayed (annoyingly proven by the fact that it took a year for Nvidia to make gsync work for Vulkan games). I've been using gsync for a while now, and while I absolutely love it, you see some weird things happening occasionally. Little things like some games dropping gsync for a bit when a Steam achievement pops up, for instance. I wonder how Wayland will deal with Free/G-sync.


          • #25
            Originally posted by kparal View Post
            3. Why are movie players included in the blacklist? One of the reference links specifically speaks about movie players being a great use case for VRR, in order to provide smooth 24 Hz playback.
Adaptive sync for video playback can be advantageous, but video players must first be tested to confirm they work properly with adaptive sync. I suspect that in a lot of cases it won't work properly and you will end up with more stuttering and possibly brightness flickering. Blacklisting is probably a good idea to avoid problems; then, if any video player is known to work properly with adaptive sync, it can always be removed from the blacklist in the future.
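To illustrate why 24 fps playback is a natural VRR use case: on a fixed 60 Hz display, 24 fps frames occupy an uneven number of refresh periods (3:2 pulldown judder), while VRR can match the cadence exactly. A hypothetical little sketch that just counts refresh slots, not code from any real player or driver:

```python
import math

def refresh_slots_per_frame(content_fps, refresh_hz, frames):
    """How many fixed refresh periods each content frame stays on screen.

    Uneven counts mean judder: some frames are displayed longer than others.
    """
    flips = [math.ceil(i * refresh_hz / content_fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(flips, flips[1:])]

# 24 fps on a fixed 60 Hz display: frames alternate between 3 and 2
# refresh periods (50 ms vs 33.3 ms on screen), which is visible judder.
print(refresh_slots_per_frame(24, 60, 6))   # [3, 2, 3, 2, 3, 2]

# With VRR the display can instead run at 48 Hz (24 fps frame-doubled),
# so every frame is on screen for an identical ~41.7 ms.
```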

            Originally posted by kparal View Post
            4. As a user, can I somehow figure out which apps currently have VRR active and which don't? And configure that easily (e.g. with an envvar)?
The best and easiest way for me is to just open my monitor's OSD and look at the resolution/refresh status. The refresh rate will change in realtime according to the framerate of the application. Sadly, not a lot of monitors have this feature, which makes it a lot harder.
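On the software side, the amdgpu patches expose a "vrr_capable" connector property, which under X shows up in `xrandr --props` output. A hedged sketch that parses such output; the property name and the exact formatting here are assumptions and can vary by driver and kernel version:

```python
import re

def vrr_capable_outputs(props_text):
    """Parse `xrandr --props`-style text into {output_name: vrr_capable_flag}.

    Assumes the driver exposes a 'vrr_capable' connector property, as the
    amdgpu patches do; other drivers may name it differently or omit it.
    """
    result, current = {}, None
    for line in props_text.splitlines():
        m = re.match(r'^(\S+) (connected|disconnected)', line)
        if m:
            current = m.group(1)          # start of a new output section
        m = re.match(r'\s+vrr_capable:\s*(\d+)', line)
        if m and current:
            result[current] = m.group(1) == '1'
    return result

# Hypothetical sample output for two connected displays:
sample = """\
DP-1 connected primary 2560x1440+0+0
\tvrr_capable: 1
HDMI-A-1 connected 1920x1080+2560+0
\tvrr_capable: 0
"""
print(vrr_capable_outputs(sample))   # {'DP-1': True, 'HDMI-A-1': False}
```

This only tells you whether the sink advertises VRR, not whether a given application currently has it active; per-application status is exactly the visibility gap being discussed.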


            • #26
              Originally posted by Evil Penguin View Post

I've seen the Windows desktop with VRR enabled, and there are quite a few noticeable changes in brightness as the FPS varies.
Certain desktop apps might even misbehave if VRR is enabled.
Yes, certainly. I've seen this on Windows as well. At low frame rates, certain colours (especially bright ones) can cause brightness flickering, which is very annoying.

Also, if it were enabled on the desktop and stuck at a low framerate, the mouse pointer would be sluggish and stutter a lot.

Edit: Saving power by lowering the refresh rate might make sense on laptops, but on desktops it is not worth it because the savings are not big. My monitor sips about 40 W normally @ 120 Hz. At its lowest refresh rate of 35 Hz it saves just a few watts (<5 W IIRC), which is not really worth it considering the downsides.
              Last edited by Brisse; 12 October 2018, 10:12 AM.


              • #27
                Originally posted by Brisse View Post
Also, if it were enabled on the desktop and stuck at a low framerate, the mouse pointer would be sluggish and stutter a lot.
                Obviously, the refresh rate should be changed as soon as content (that includes the mouse pointer) changes. That's the essence of adaptive sync.

                Originally posted by Brisse View Post
Edit: Saving power by lowering the refresh rate might make sense on laptops, but on desktops it is not worth it because the savings are not big. My monitor sips about 40 W normally @ 120 Hz. At its lowest refresh rate of 35 Hz it saves just a few watts (<5 W IIRC), which is not really worth it considering the downsides.
                Yes, and I was talking about mobile PCs. 5W is a huge amount in a mobile device. Apart from that, the potential is more on the GPU side. Lower refresh rate enables lower power states, lower display controller clock, lower video memory clock. Take PSR, where you don't save anything on the display side because the refresh rate is kept up, yet the technology saves enough power in the GPU and signalling paths to justify its implementation.


                • #28
                  Originally posted by theriddick View Post
Good, now if AMD can also release a 1080 Ti beater (even by 5% is enough) GPU then we'll be smooth sailing
They are focusing right now on delivering Navi to Sony for the PS5 and MS for the Xbox Two. Navi will translate into peecee land as a Polaris replacement, i.e. a mid-range part. This Navi mid-range part in 2019 will likely perform on par with a GTX 1080. After those consoles launch, the Navi successor (Arcturus?) will be the Vega replacement in 2020/2021 that will be a true high-end gaming part.

Right now AMD is focused on getting marketshare and revenue, primarily through CPUs and semi-custom GPUs. High-end PC gaming GPUs are not part of that equation. Nvidia knows this, which is why they are price gouging consumers with $1200 gaming cards. Pretty much the same thing Intel was doing on the CPU side before Ryzen launched, when an i7 was $1000.

In reality though, an RX 580 will deliver good frame rates at 1080p for just about every game available on Linux today, so I wouldn't bother waiting if you're in the market for a new card. And if you're some hardcore gamer with a 4K 120 Hz display and all the AAA titles, you're probably not running Linux anyway.
                  Last edited by torsionbar28; 12 October 2018, 11:30 AM.


                  • #29
                    Originally posted by torsionbar28 View Post
                    They are focusing right now on delivering Navi to Sony for the PS5 and MS for the Xbox Two. Navi will translate into peecee land as a Polaris replacement i.e. mid-range part. This Navi mid-range part in 2019 will likely perform on par with a GTX 1080. After those consoles launch, the Navi successor (Arcturus?) will be the Vega replacement in 2020/2021 that will be a true high end gaming part.
That part doesn't make sense, as Polaris is nowhere near a 1080. And AMD is preparing yet another Polaris refresh, suggesting Polaris will stay for a while. Navi should extend the lineup above Polaris, not replace it. It should replace Vega10, which has HBM and is too big and too expensive to produce for only reaching GTX 1080 level in graphics workloads.
                    That being said, Vega isn't a high-end card, it's just mid-range. The GTX 1080 has been mid-range for at least 1.5 years. If Navi reaches this performance level in 2019, it will be more low-end than mid-range.

                    Originally posted by torsionbar28 View Post
                    In reality though, an Rx 580 will deliver good frame rates at 1080p for just about every game available on Linux today, so I wouldn't bother waiting if you're in the market for a new card. And if you're some hard core gamer with 4k 120hz display and all the AAA titles, you're probably not running Linux anyways.
You don't have to be a hardcore gamer to own peripheral devices that don't suck. And gamer or not, a 1080p display sucks. Apart from that, there are many very demanding games running on Linux, natively and easily accessible via WINE/Proton.

                    Anyway, I'm running Linux with a 144 Hz WQHD display and Polaris10. And I play games. Of course it works, as long as you're ok with lowering the quality settings. I'd still like to see AMD competing in high-end again and in the meantime I'm looking forward to Navi.
                    Last edited by juno; 12 October 2018, 01:02 PM.


                    • #30
I doubt it will take AMD until 2021 to catch up to the 1080 Ti, which was released in 2017; that's a four-year gap!

As soon as AMD releases its new Polaris and 7 nm GPUs, Nvidia will respond with a price cut to the 10 series, so AMD is going to be under pressure at all tiers of the game and any new GPU they release won't change that!
