FreeSync Support For RADV Vulkan Driver Blocked By Lack Of Config System

  • FreeSync Support For RADV Vulkan Driver Blocked By Lack Of Config System

    Phoronix: FreeSync Support For RADV Vulkan Driver Blocked By Lack Of Config System

    With the Linux 5.0 kernel, the AMDGPU driver landed its support for FreeSync / Variable Refresh Rate (VRR), and that has already been joined by FreeSync/VRR support within Mesa's RadeonSI OpenGL driver. But FreeSync support for the RADV Radeon Vulkan driver has been delayed -- a merge request is now open, but it's not expected to be merged at this point for lack of a configuration management system...

    http://www.phoronix.com/scan.php?pag...c-MR-Available
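    The missing piece is a per-application option mechanism along the lines of Mesa's drirc XML files. As a sketch of what a per-app FreeSync toggle could look like in that style (the `adaptive_sync` option name and the application entries here are illustrative, not what RADV actually shipped at the time):

    ```xml
    <!-- Sketch of a drirc-style per-application config; option and
         application names are illustrative, not from a shipped file. -->
    <driconf>
      <device driver="radv">
        <!-- Default: leave adaptive sync enabled for everything... -->
        <application name="Default">
          <option name="adaptive_sync" value="true" />
        </application>
        <!-- ...but blacklist apps known to misbehave with VRR. -->
        <application name="mpv" executable="mpv">
          <option name="adaptive_sync" value="false" />
        </application>
      </device>
    </driconf>
    ```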

  • #2
    This article raises so many questions. Like, why the heck does this tech need a white- or blacklist in the first place? What kind of ill effects does enabling FreeSync on a GTK4 app have? Clearly I don't understand what FreeSync is, because I had never thought apps could behave badly with FreeSync on.


    • #3
      Originally posted by remenic View Post
      This article raises so many questions. Like, why the heck does this tech need a white- or blacklist in the first place? What kind of ill effects does enabling freesync on a gtk4 app have? Clearly I don't understand what freesync is, because I had never thought apps could be behaving bad with freesync on.
      The problem is that FreeSync changes your monitor's refresh rate to match the fps that a program is running at.

      So if you were watching a 24fps video in your opengl/vulkan video player (like mpv), and FreeSync got enabled, your entire display would be refreshing at 24Hz.

      I have a 144Hz monitor and personally never want FreeSync enabled. Tearing is barely noticeable on 144Hz, and I enjoy lower input latency from having no compositing or Vsync whatsoever in games. FreeSync and Gsync still add input latency, just slightly less than regular Vsync.

      For things like my video player, there is nothing wrong with using normal Vsync. The latency difference is irrelevant.

      The only benefit I see to FreeSync/Gsync is playing video games with no tearing while having slightly less input lag than normal Vsync; that's it.
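      One caveat to the 24 Hz scenario above: panels only support VRR inside a window (often something like 48-144 Hz), and below that window a driver can use low-framerate compensation (LFC), repeating each frame an integer number of times so the panel stays in range. A rough Python sketch of that selection logic (the function and the window values are illustrative, not actual driver code):

```python
def vrr_refresh_hz(content_fps, vrr_min_hz=48, vrr_max_hz=144):
    """Illustrative sketch: pick the panel refresh rate for a content
    frame rate, given a VRR window of [vrr_min_hz, vrr_max_hz].

    Inside the window the panel simply tracks the content rate.
    Below it, low-framerate compensation (LFC) repeats each frame
    enough times to land back inside the window."""
    if content_fps >= vrr_max_hz:
        return vrr_max_hz              # capped at the panel maximum
    if content_fps >= vrr_min_hz:
        return content_fps             # panel follows content directly
    repeat = 2                         # LFC: smallest usable repeat factor
    while content_fps * repeat < vrr_min_hz:
        repeat += 1
    return min(content_fps * repeat, vrr_max_hz)

print(vrr_refresh_hz(24))   # 24 fps video -> 48 Hz (each frame shown twice)
print(vrr_refresh_hz(90))   # 90 fps game  -> 90 Hz
```

      So on a typical 48-144 Hz FreeSync panel, a 24 fps video would drive the panel at 48 Hz rather than 24 Hz, though the per-frame pacing is still tied to the content rate.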


      • #4
        Originally posted by remenic View Post
        This article raises so many questions. Like, why the heck does this tech need a white- or blacklist in the first place? What kind of ill effects does enabling freesync on a gtk4 app have? Clearly I don't understand what freesync is, because I had never thought apps could be behaving bad with freesync on.
        If FreeSync was enabled while doing desktop work and the refresh rate dropped to the minimum, everything would stutter heavily, including the mouse pointer. I remember this happened sometimes in the early FreeSync days on Windows. On the desktop you want to run the display at its full refresh rate to keep things smooth.

        Also, a lot of video players can't handle FreeSync properly and will cause strange issues, like the refresh rate fluctuating all over the place with stuttery video playback as a result. They can be made to handle FreeSync properly, which can benefit video playback, but they need to be tested first. So the sane approach is to blacklist all video players and then eventually remove them from the blacklist once they have been tested and are known to work properly.


        • #5
          because it is not the right thing for non-games (like browsers or media players)
          ???
          Why is it not the right thing?
          Wouldn't it be perfect to have 20 Hertz when you just read a web page?
          Wouldn't it be perfect to have 24/25 or 50 Hertz when playing a movie in your browser?

          Wouldn't it save a lot of power for notebooks and tablets?
          What's the role of the application? - How can it control the rate?
          At which point do browsers make use of RADV?

          I do not believe that we need to switch this feature off for any application. I am 100% sure that we can provide a switch for the application to turn it off instead.
          What we need is just something like Radeon Chill that runs for all applications, capturing pixel movement in real time to determine the rate at which to send refreshed images.
          Last edited by oooverclocker; 04-08-2019, 01:32 PM.


          • #6
            Originally posted by oooverclocker View Post
            Wouldn't it be perfect to have 20 Hertz when you just read a web page?
            No. Your mouse pointer would stutter and lag terribly, scrolling the page would be horrible and animations would look bad. Some cheap displays might even flicker.

            Originally posted by oooverclocker View Post
            Wouldn't it be perfect to have 24/25 or 50 Hertz when playing a movie in your browser?
            If the video is played in fullscreen then yes, assuming it would work properly, which probably isn't the case with any browser right now, so for now they need to be blacklisted.


            • #7
              The FreeSync range isn't down to something insanely dumb like 20 Hz. It'll more than likely pick the lowest rate within its supported range and use that.

              I'm no expert in the FreeSync implementation in the kernel or Mesa, so this might be coming from my ass a little bit. It sounds like the issue is between userland and desktop composition. Linux likes (at least I like) to composite everything through one app: it works well, there is only one refresh rate at any time, everything is butter smooth and nothing is broken.

              As soon as you give the option for apps or entire screens to have different display rates, all that goes out the window. Compositors will now have to be more careful, trying to guess the framerate apps "want" or should run at, and probably blocking compositing per monitor much more, to let the apps and Mesa dictate the frame rate instead of the desktop compositor. This all assumes the apps themselves can handle those changes in some way. Games? Easy, they're made for varying rates. Everything else? Who knows what it wants or how it "should" work? No one except the app itself, as the app is responsible for not breaking under hugely varied rates of being polled/called. So breakage can happen if developers didn't really consider their display code doing these exotic things.

              Microsoft Windows has the advantage of being able to add little flags and extra code and proprietary crap. We don't; we need it to work with what we have. We do have the advantage of open source and fixing it ourselves, but that is also the curse we face.



              Until you submit your patches, don't question why support is taking time and why it doesn't exist yet. Windows and some crappy implementations can force the desktop to do things and expect people to fix it later; we are better than that. We want support baked in correctly, working with everything as well as it can. We have to give those developers time to let userland work well with it, without hacks and assumptions that will break later.


              • #8
                No worries with FreeSync being turned off in Ubuntu mainline 5.1-rc3 and -rc4, as well as drm-tip.


                • #9
                  The only way this will ever work properly is if it is disabled for everything by default and you have to manually enable it for a certain program. The only thing it is even useful for is video games; otherwise regular Vsync is fine, as slightly worse input latency is not an issue for desktop apps.

                  There's no way you can allow every program rendering something in OpenGL/Vulkan to change your monitor's refresh rate based on whatever FPS it is running at.

                  Originally posted by oooverclocker View Post
                  Wouldn't it be perfect to have 20 Hertz when you just read a web page?
                  lol. Even browsing webpages or moving windows on your desktop looks better on 144Hz than 60Hz.

                  Wouldn't it save a lot of power for notebooks and tablets?
                  Benchmarks running a monitor at 60Hz and 144Hz show no significant difference in power usage. Refresh rate isn't a major power usage concern.

                  I do not believe that we need to switch this feature off for any application.
                  lol


                  • #10
                    Originally posted by Brisse View Post
                    No. Your mouse pointer would stutter and lag terribly, scrolling the page would be horrible and animations would look bad.
                    I think you haven't understood what I meant.

                    Let's say you have an n x m grid of color values, and between consecutive images you diff one value per square of x fields. When you move your mouse, enough pixels change that you notice a pixel that was white on your web page has suddenly turned black. So growing value gaps across many pixels mean the refresh rate should increase, and shrinking gaps mean it should decrease.
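                    As a concrete sketch of that idea in Python (everything here is illustrative: the sampling step, the change threshold, and the linear mapping into a rate range are made-up numbers, not from any driver):

```python
def pick_refresh_hz(prev, cur, min_hz=30, max_hz=144, threshold=8, step=16):
    """Illustrative sketch of the pixel-diff idea: sample one value per
    step x step tile of two grayscale frames (lists of rows), count how
    many samples changed by more than `threshold`, and scale that
    fraction linearly into the [min_hz, max_hz] range."""
    samples = [(prev[y][x], cur[y][x])
               for y in range(0, len(prev), step)
               for x in range(0, len(prev[0]), step)]
    changed = sum(abs(a - b) > threshold for a, b in samples)
    fraction = changed / len(samples)
    return round(min_hz + fraction * (max_hz - min_hz))

static = [[200] * 640 for _ in range(480)]
moved = [[50] * 640 for _ in range(240)] + [[200] * 640 for _ in range(240)]
print(pick_refresh_hz(static, static))   # nothing changed -> 30
print(pick_refresh_hz(static, moved))    # half the frame changed -> 87
```

                    A real implementation would of course need to run where the frames are already available (compositor or driver) rather than copying them around, which is part of why this is not trivial to bolt on.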

                    And indeed, I don't have the slightest clue what RadeonSI, RADV or even single windows have to do with that.

                    Edit: Except for providing an API call to fix a certain rate in fullscreen applications, or to deactivate VRR, of course.
                    Last edited by oooverclocker; 04-08-2019, 03:09 PM.
