Ubuntu's Mir May Be Ready For FreeSync / Adaptive-Sync

  • #11
    Testing, testing...I posted a response several minutes ago and it has apparently just disappeared.

    Comment


    • #12
      Originally posted by Zan Lynx View Post
      Testing, testing...I posted a response several minutes ago and it has apparently just disappeared.
      normal, this site now has censorship

      Comment


      • #13
        Originally posted by juno View Post
        Although I appreciate the foss efforts, it's sad how long we have to wait for features that have worked on windows for a long time.
        some simple fixes.

        Remember that Windows ships with a finalized driver model in each release, and its graphics stack hardly ever changes (especially if a change could break the old Win32 API layers that 70% - 90% of Windows software still uses), whereas until about 8 years ago Linux never really had a "graphics stack" at all, just a dumb shadow framebuffer and userspace libraries passing GLX + GL 1.2 commands.

        So what does this mean? On Windows you just add the code to the kernel/userspace side of the blob and it should work across a set of Windows releases; for the releases where it doesn't, the vendor can simply drop support or get creative with workarounds, depending on how much money those releases bring in. So it is relatively "easy".

        On Linux, the kernel and userspace drivers are written at the same time as the graphics stack itself and have to adapt to it almost in real time. That also means desktops, middleware libraries and applications have to absorb those features in real time too, which translates into an incredibly difficult set of tasks requiring careful coordination with 20 other teams just to get a feature into a workable state, and only then can optimization start. It is really, really difficult.

        The good news is that the Linux graphics stack is now approaching a globally "usable state", and once everything reaches feature parity the optimization process will begin. From then on, new hardware features should enter the stack a lot faster, since the work will only be about writing that feature, much like on Windows.

        Why has this taken so long on Linux? The time is actually comparable to Windows itself; the only difference is that you don't experience it there, because it's closed source: you just get it "magically" when Microsoft thinks it's ready enough (remember that Vista introduced WDDM 1.0, but it took Windows 7 and WDDM 1.1 to actually work properly). The problem is that Linux didn't have a stable previous-generation graphics stack, so you had no choice but to jump onto the new one as it was being developed, whereas Windows still had the NT model to fall back on while WDDM was worked on for years, so you just didn't notice the pain as much as on Linux.

        Comment


        • #14
          Originally posted by andre30correia View Post

          normal, this site now has censorship
          Uhhh no, not at all.

          Originally posted by Zan Lynx View Post
          Testing, testing...I posted a response several minutes ago and it has apparently just disappeared.
          It's just the moderation queue for the spam detector; it will appear shortly.
          Michael Larabel
          https://www.michaellarabel.com/

          Comment


          • #15
            Originally posted by Zan Lynx View Post
            Pretty sure that isn't how it works at all. Not with G-Sync anyway. Perhaps I just assumed Adaptive Sync was just as good. Maybe it isn't.
            FreeSync is more or less the same thing as G-Sync (they are different adaptive vsync implementations): they basically monitor the fps and tell the display controller to adjust the refresh frequency.

            Vsync is "adapt the video stream to the screen's refresh rate" (i.e. send doubled frames or clone stuff as needed), which is the other way around, and of course puts more stress on the GPU.

            If the display hardware has other problems with it the G-Sync hardware on the monitor smooths it away. The GPU / computer side never sees a problem.
            The "G-Sync hardware" on the monitor is mostly a DRM device to ensure that the feature can't be enabled on otherwise-compatible screens whose makers didn't pay Nvidia for the license. Sure, it has some minor additional gimmicks for edge cases, but the real meat is not there.

            G-Sync, AMD's FreeSync and VESA's Adaptive-Sync all require a screen that can change its refresh rate dynamically, because this "smooths it away" magic happens simply by adjusting the screen's refresh rate to follow the content.

            E.g. the GPU outputs 23.89 frames per second? No problem: a message is sent, the screen's refresh rate switches to 23.89 Hz, and everything is as smooth as possible: no dropped frames, no half-rendered stuff, no doubled frames.

            The differences between the various standards are mostly about how they actually make this refresh rate change happen, how precise they are, and so on. Different implementations of the same general concept, not different technologies.
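
            For the Linux side of "tell the display controller": recent kernels expose this through KMS properties, a read-only vrr_capable property on the connector and a VRR_ENABLED property on the CRTC. Below is a rough, hedged sketch using libdrm's legacy property call; a real compositor would do this as part of an atomic commit and needs DRM master, and error handling is mostly skipped. Build with something like cc sketch.c $(pkg-config --cflags --libs libdrm).

            Code:
            /* Sketch: check "vrr_capable" on connectors, flip "VRR_ENABLED" on a CRTC. */
            #include <fcntl.h>
            #include <stdint.h>
            #include <stdio.h>
            #include <string.h>
            #include <unistd.h>
            #include <xf86drm.h>
            #include <xf86drmMode.h>

            /* Find a property by name on a KMS object; returns its id (0 if absent). */
            static uint32_t find_prop(int fd, uint32_t obj, uint32_t type,
                                      const char *name, uint64_t *value)
            {
                drmModeObjectProperties *props = drmModeObjectGetProperties(fd, obj, type);
                uint32_t id = 0;
                if (!props)
                    return 0;
                for (uint32_t i = 0; i < props->count_props; i++) {
                    drmModePropertyRes *p = drmModeGetProperty(fd, props->props[i]);
                    if (p && strcmp(p->name, name) == 0) {
                        id = p->prop_id;
                        if (value)
                            *value = props->prop_values[i];
                    }
                    drmModeFreeProperty(p);
                }
                drmModeFreeObjectProperties(props);
                return id;
            }

            int main(void)
            {
                int fd = open("/dev/dri/card0", O_RDWR);  /* device node assumed */
                if (fd < 0) { perror("open"); return 1; }

                drmModeRes *res = drmModeGetResources(fd);
                if (!res) { fprintf(stderr, "no KMS resources\n"); return 1; }

                /* Does any connected screen advertise variable refresh support? */
                for (int i = 0; i < res->count_connectors; i++) {
                    uint64_t capable = 0;
                    if (find_prop(fd, res->connectors[i], DRM_MODE_OBJECT_CONNECTOR,
                                  "vrr_capable", &capable) && capable)
                        printf("connector %u is VRR capable\n", res->connectors[i]);
                }

                /* Enable VRR on the first CRTC that exposes the property. */
                for (int i = 0; i < res->count_crtcs; i++) {
                    uint32_t prop = find_prop(fd, res->crtcs[i], DRM_MODE_OBJECT_CRTC,
                                              "VRR_ENABLED", NULL);
                    if (prop) {
                        drmModeObjectSetProperty(fd, res->crtcs[i], DRM_MODE_OBJECT_CRTC,
                                                 prop, 1);
                        printf("requested VRR on CRTC %u\n", res->crtcs[i]);
                        break;
                    }
                }

                drmModeFreeResources(res);
                close(fd);
                return 0;
            }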

            Comment


            • #16
              Originally posted by andre30correia View Post
              normal, this site now has censorship
              Why wasn't this post censored lol.

              Comment


              • #17
                Originally posted by starshipeleven View Post
                G-Sync, AMD's FreeSync and VESA's Adaptive-Sync all require a screen that can change its refresh rate dynamically, because this "smooths it away" magic happens simply by adjusting the screen's refresh rate to follow the content.
                Quoted for truth. Unlike the old tube monitors of the 1990s, modern display hardware does not actually have screen blanking, flyback and scanlines. Sure, the concepts are still supported, because much of the standard signalling (and the software driving it) assumes those things, but the reality has been different for a while.

                In fact, the whole concept of frames is a little old-fashioned when it comes to modern video display technology (witness Vulkan's swapchains, or NVIDIA's EGLStreams driver; see the sketch at the end of this post). On a modern personal computer with multiple monitors and VR goggles, a single framerate on the rendering and display side makes little sense, and these newer driver technologies are trying to address that, giving you a better experience.

                The things holding back that better experience right now are (1) the assumption on the part of games and other applications that there is a single CRT display drawing a single frame at a time, and (2) legacy media formats that assume a fixed frame rate (e.g. 24 frames per second for filmed movies, 29.97 frames per second for NTSC television and DVDs). Progress is slow and steady and not in great demand, because most consumers are unaware of what they're missing, since they've never had it.

                Don't worry: by the time we start having consumer-level content that demands the new display technology, the full graphics stack will be in place to support it.
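
                As one concrete example of what those rendering chains look like from the application side: a Vulkan program asks the presentation engine which present modes the surface supports and picks one, rather than assuming a fixed-rate buffer flip. A minimal sketch, assuming the instance, physical device and surface have already been created elsewhere (link against -lvulkan):

                Code:
                /* Pick a swapchain present mode: MAILBOX (newest frame wins, no tearing,
                 * no blocking on the display) if available, otherwise FIFO (the classic
                 * vsync-style queue, the only mode the spec guarantees). */
                #include <stdint.h>
                #include <vulkan/vulkan.h>

                VkPresentModeKHR pick_present_mode(VkPhysicalDevice phys, VkSurfaceKHR surface)
                {
                    uint32_t count = 0;
                    vkGetPhysicalDeviceSurfacePresentModesKHR(phys, surface, &count, NULL);

                    VkPresentModeKHR modes[16];
                    if (count > 16)
                        count = 16;
                    vkGetPhysicalDeviceSurfacePresentModesKHR(phys, surface, &count, modes);

                    for (uint32_t i = 0; i < count; i++) {
                        if (modes[i] == VK_PRESENT_MODE_MAILBOX_KHR)
                            return VK_PRESENT_MODE_MAILBOX_KHR;
                    }
                    return VK_PRESENT_MODE_FIFO_KHR;
                }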

                Comment


                • #18
                  Originally posted by starshipeleven View Post
                  E.g. the GPU outputs 23.89 frames per second? No problem: a message is sent, the screen's refresh rate switches to 23.89 Hz, and everything is as smooth as possible: no dropped frames, no half-rendered stuff, no doubled frames.
                  Basically, you are right, of course. But most panels don't support refresh rates that low; they start flickering or showing other errors, which is why there are still doubled/tripled/... frames. It's no longer a real problem, though, since it causes no stutter: just output every frame twice, and you see 23.98 FPS while the screen refreshes at 47.95 Hz.
                  With G-Sync, the module inside the monitor does the doubling when the refresh rate would be too low. FreeSync on Windows uses what they call LFC, Low Framerate Compensation: the driver detects the situation and has the display engine output the same frame multiple times if necessary. I assume this could be a bit tricky to implement when frame times fluctuate a lot.
                  Last edited by juno; 30 August 2016, 08:24 AM.
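
                  The arithmetic behind that frame multiplication is easy to sketch. The following is illustrative only (not AMD's or NVIDIA's actual algorithm): repeat each frame just enough times that the effective scanout rate lands inside the panel's variable-refresh window. The function name and the 40-144 Hz panel are made up for the example.

                  Code:
                  /* Illustrative LFC-style arithmetic: choose a frame-repeat factor that keeps
                   * the effective refresh rate within the panel's supported range. */
                  #include <stdio.h>

                  static int repeat_factor(double content_fps, double panel_min_hz,
                                           double panel_max_hz)
                  {
                      int n = 1;
                      /* Scanning out the same frame n times means refreshing at content_fps * n Hz.
                       * Bump n until we are above the panel minimum (or can't go any higher). */
                      while (content_fps * n < panel_min_hz &&
                             content_fps * (n + 1) <= panel_max_hz)
                          n++;
                      return n;
                  }

                  int main(void)
                  {
                      double fps = 23.98, lo = 40.0, hi = 144.0;  /* example content and panel */
                      int n = repeat_factor(fps, lo, hi);
                      printf("repeat each frame %d times -> %.2f Hz scanout\n", n, fps * n);
                      /* Prints: repeat each frame 2 times -> 47.96 Hz scanout */
                      return 0;
                  }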

                  Comment


                  • #19
                    Originally posted by juno View Post
                    Basically, you are right, of course. But most panels don't support refresh rates that low; they start flickering or showing other errors, which is why there are still doubled/tripled/... frames. It's no longer a real problem, though, since it causes no stutter: just output every frame twice, and you see 23.98 FPS while the screen refreshes at 47.95 Hz.
                    That's because the technology isn't mature yet.

                    All adaptive sync systems are supposed to support anything from 2 Hz up to somewhere in the triple digits. If the hardware cannot do that, they resort to tricks like the above (still using the flexible refresh rate to their advantage), but that's a reaction to the limits the hardware declares: if (sometime in the future) you connect a screen that handles 10 Hz just fine, they will send 10 Hz.

                    I assume this could be a bit tricky to implement when frame times fluctuate a lot.
                    AFAIK framerate fluctuation does not affect the system (on any of the three): it's either way too fast to get caught unprepared, or there is some kind of micro-buffer so that it always knows the next X frames arrive at Y framerate.
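
                    Those declared limits come from the screen itself: the supported vertical refresh range is advertised in the monitor's EDID, in the Display Range Limits descriptor, and that is what the driver reads to decide when it has to fall back to frame doubling. A hedged sketch of pulling the range out of a raw 128-byte EDID base block; it is simplified (the offset flags that allow rates above 255 Hz are ignored) and the sysfs path is only an example.

                    Code:
                    /* Sketch: read min/max vertical refresh (Hz) from an EDID base block. */
                    #include <stdint.h>
                    #include <stdio.h>

                    /* The four 18-byte descriptors start at offset 54. A display descriptor has a
                     * zero pixel clock; tag 0xFD marks the Display Range Limits descriptor, with
                     * the minimum and maximum vertical rates in bytes 5 and 6. */
                    static int edid_refresh_range(const uint8_t edid[128], int *min_hz, int *max_hz)
                    {
                        for (int off = 54; off <= 108; off += 18) {
                            const uint8_t *d = &edid[off];
                            if (d[0] == 0 && d[1] == 0 && d[3] == 0xFD) {
                                *min_hz = d[5];
                                *max_hz = d[6];
                                return 1;
                            }
                        }
                        return 0;  /* no range-limits descriptor found */
                    }

                    int main(void)
                    {
                        uint8_t edid[128] = {0};
                        FILE *f = fopen("/sys/class/drm/card0-DP-1/edid", "rb");  /* example path */
                        if (!f) { perror("fopen"); return 1; }
                        size_t got = fread(edid, 1, sizeof edid, f);
                        fclose(f);
                        if (got != sizeof edid) { fprintf(stderr, "short EDID read\n"); return 1; }

                        int lo, hi;
                        if (edid_refresh_range(edid, &lo, &hi))
                            printf("panel refresh range: %d-%d Hz\n", lo, hi);
                        return 0;
                    }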

                    Comment


                    • #20
                      There's no "dynamic refresh rate" change. The LCD monitor waits for a new frame on DisplayPort. When it gets a new frame it displays it. There isn't anything strange about it.

                      What is actually artificial and unnatural for an LCD is a fixed refresh rate, which is an artifact left over from CRT displays. An LCD doesn't actually need repeated frames: if it doesn't get new information it just holds what it has. The need for repeating frames under 30 FPS is a strange artifact related to losing data sync on the cable and the monitor thinking it got disconnected if the GPU stops sending.

                      So there's no refresh rate change. There's simply the maximum data rate on the DisplayPort cable, which cannot be exceeded without corruption. And there's the data rate the electronics in the LCD monitor can accept, which might be lower than the cable's. And there's the rate at which the monitor can twist the LCD elements to a new image. Within those limits, the GPU can send a frame whenever it feels like it and the monitor will display it.

                      Laptop, tablet and phone displays don't use refresh rates either. When a tablet goes idle, about 50 milliseconds after the last image update, the display hardware just stops updating. It doesn't send new frames. It just keeps the LCD / OLED right where it was and the GPU goes idle. Intel integrated GPUs do this with laptop screens for power savings too.
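
                      That "hold the last image and stop sending" behaviour is essentially what a damage-driven redraw loop gives you, and it is what makes panel self-refresh worthwhile. A toy sketch of the idea; scene_damaged(), render_scene() and present_frame() are placeholders, not a real display API.

                      Code:
                      /* Toy loop: a frame only goes over the link when the scene actually changed;
                       * otherwise nothing is sent and the panel keeps showing its last image. */
                      #include <stdbool.h>
                      #include <stdio.h>
                      #include <unistd.h>

                      static bool scene_damaged(int tick)
                      {
                          return tick % 8 == 0;  /* pretend something changes every 8th tick */
                      }

                      static void render_scene(void) { /* draw into a buffer (placeholder) */ }

                      static void present_frame(int tick)
                      {
                          printf("tick %2d: frame sent\n", tick);  /* new data goes over the link */
                      }

                      int main(void)
                      {
                          for (int tick = 0; tick < 24; tick++) {
                              if (scene_damaged(tick)) {
                                  render_scene();
                                  present_frame(tick);
                              }
                              /* No damage: send nothing; the LCD/OLED holds its image and the
                               * GPU can stay idle (panel self-refresh). */
                              usleep(16000);  /* ~16 ms, stand-in for waiting on real events */
                          }
                          return 0;
                      }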

                      Comment
