AMD Finally Rolls Out New Linux Patches For Adaptive-Sync / VRR (FreeSync)

  • #51
    Originally posted by debianxfce View Post
    Win10 uses 1 GB of RAM after boot, while Debian Xfce uses 350 MB.
    That doesn't really mean anything.
    A good OS should use whatever it needs to work fast; using most of the available memory wouldn't be a bad thing.
    Memory management differs between operating systems. Switching your Windows computer to an SSD makes a big difference and everything feels much faster; doing the same on your Linux computer won't make as big a difference, since it uses memory in a different way.
    Using memory that's available is a good thing.

    Originally posted by boxie View Post
    It would make for a pretty nice ITX-sized workstation, 2700X+Vega 8 :P
    It would be enough to include a smaller GPU, like a tiny Vega 3, in a workstation APU.
    I do think an APU with more cores/threads and a smaller GPU would sell well for many workstations.



    • #52
      Originally posted by theriddick View Post
      Well Gsync does have larger headroom, and it's technically better
      No it doesn't, and no it isn't. There is just no need for an additional sync module inside the monitor adding $100-500 of cost. See the new UHD 144 Hz monitors, where this module even needs a pesky fan inside the monitor. That's actually an inferior solution.
      The display engine on the GPU side can duplicate frames just as well. AMD's LFC proved that years ago.

      They could still keep the G-Sync branding and certify specific monitors that meet higher quality standards with a G-Sync label, without denying their own customers access to the cheaper (but not worse) technology.

      But it's Nvidia after all.



      • #53
        Michael, typo:

        "as well as within Mesa, the X.Org DDX driver, and Mesa"

        What has become of tildearrow?



        • #54
          Originally posted by Nille_kungen View Post
          It would be enough to include a smaller GPU, like a tiny Vega 3, in a workstation APU.
          I do think an APU with more cores/threads and a smaller GPU would sell well for many workstations.
          Vega 3 might work for the lower end, but some workstation apps require a decent amount of GPU power (think CAD), or the ability to run a 4/6-screen accelerated desktop, maybe with some VM action too.

          Ideally with two-zone SR-IOV for graphics.



          • #55
            Originally posted by juno View Post

            No it doesn't, and no it isn't.
            Google is your friend; sometimes it pays to research something before opening your mouth.




            • #56
              Originally posted by theriddick View Post
              Google is your friend; sometimes it pays to research something before opening your mouth.
              Do it.

              And stop spreading FUD in the meantime.


              And, btw: reading "G-Sync vs. FreeSync" articles on the web with arbitrary "benchmarks" and false conclusions is not research.
              Last edited by juno; 13 September 2018, 07:41 AM.



              • #57
                It's well known that Gsync has a lower limit for frame insertion (smoothing) than freesync without flickering. Gsync is able to insert more frames to the display to offset low fps, and it also has lower latency than freesync (1.0, btw; 2.0 is a premium method like Gsync).

                There are articles that go into detail testing this out. You won't find a research paper from a university comparing Gsync and freesync, but you can believe whatever you like...



                • #58
                  The problem is that you (and many articles on the web) are comparing completely arbitrary stuff.

                  "Monitor A vs. Monitor B" is not "G-Sync vs. Adaptive Sync"

                  Many articles compare G-Sync and FreeSync monitors with completely different panels and electronics. The monitors might look the same from the outside but contain completely different technology on the inside. That causes differences in input lag/latency, minimum/maximum refresh rates, etc. Again, that's not G-Sync vs. Adaptive Sync; that's monitor vs. monitor. And some people tend to "test" expensive G-Sync monitors against cheap FreeSync monitors. By choosing the test equipment carefully, you could bias the result very strongly in either direction. And you can find articles on the web reflecting both directions.

                  "Monitor A with GTX1080 vs. Monitor B vs. RX580" is not "G-Sync vs. Adaptive Sync"

                  Sure, you can do that comparison and measure the latency from input to the panel. But wilfully or accidentally ignoring the fact that the 1080 takes several milliseconds less to render the frame makes the whole comparison invalid. By choosing the test equipment carefully, you could bias the result very strongly in either direction.

                  "G-Sync vs. FreeSync" before 2016 is not "G-Sync vs. Adaptive Sync"

                  There are still some three-year-old articles online claiming that G-Sync is the superior solution because it works at lower minimum refresh rates. That's wrong. The minimum refresh rate that can be achieved without flickering is limited by the display panel; usually that's above 30 Hz for both G-Sync and FreeSync monitors, and G-Sync monitors can't display 20 Hz either. That's what they have the G-Sync module for: it duplicates (or multiplies, with a larger multiplier) the incoming frames, bringing the panel back into its functional refresh rate range. E.g. 20 FPS gets displayed at 40 Hz with each frame shown twice. AMD introduced this on the GPU side with LFC in late 2015, and I'm quoting myself:

                  Originally posted by juno View Post
                  The display engine on the GPU side can duplicate frames just as well. AMD's LFC proved that years ago.
                  Originally posted by theriddick View Post
                  It's well known that Gsync has a lower limit for frame insertion (smoothing) than freesync without flickering. Gsync is able to insert more frames to the display to offset low fps, and it also has lower latency than freesync (1.0, btw; 2.0 is a premium method like Gsync).
                  And that's where you should do your research. LFC has been a thing for almost three years now. And it isn't limited to FreeSync 2.0 either.

                  G-Sync just shifts that responsibility from the GPU/display engine to the monitor/G-Sync module. But that doesn't make it technically better in any way. Maybe the duplication algorithm or the timing prediction could still be improved, but if that's true, it's true for both solutions. Right now, both solutions just work, without any noticeable difference and within measuring inaccuracy, provided of course that both monitors' panels have comparable minimum and maximum refresh rates to start with. Again, "expensive vs. cheap" is not "G-Sync vs. Adaptive Sync". Obviously a panel with a VRR range of 60-144 Hz won't perform as well as one with a VRR range of 30-144 Hz.
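
                  To illustrate the multiplier logic, here is a rough, hypothetical sketch in Python. It is not AMD's LFC code or Nvidia's module firmware; the function name, panel ranges and frame rates are made-up examples. It just shows why 20 FPS ends up being presented at 40 Hz on a panel that can't go below ~30 Hz:

                  # Hypothetical sketch of low-framerate compensation:
                  # repeat each frame an integer number of times so the effective
                  # refresh rate lands back inside the panel's VRR window.
                  def lfc(fps, panel_min_hz, panel_max_hz):
                      """Return (multiplier, effective_hz) for a given content frame rate."""
                      if fps >= panel_min_hz:
                          return 1, fps  # the panel can track the frame rate directly
                      m = 1
                      # raise the multiplier until fps * m falls inside [panel_min_hz, panel_max_hz]
                      while fps * m < panel_min_hz and fps * (m + 1) <= panel_max_hz:
                          m += 1
                      return m, fps * m

                  # Example from above: 20 FPS on a 30-144 Hz panel
                  print(lfc(20, 30, 144))  # (2, 40) -> each frame shown twice at 40 Hz
                  print(lfc(10, 30, 144))  # (3, 30)
                  # Note: this only works when the panel's maximum refresh rate is at
                  # least roughly twice its minimum, which is why very narrow VRR
                  # ranges (e.g. 48-60 Hz) can't be rescued this way.

                  Whether that multiplication happens in the GPU's display engine or in a G-Sync module inside the monitor, the arithmetic is the same.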

                  As I already said: there are expensive, good monitors with FreeSync on the market, and there are cheap, crappy monitors with FreeSync on the market. But there aren't cheap or crappy monitors with G-Sync on the market. That's the difference, and it isn't a technical one. As I said previously, Nvidia could still do that and only label high-quality monitors as G-Sync compatible while at the same time supporting the industry standard. Because the point is that even a cheap, crappy FreeSync monitor is still way better than a cheap, crappy monitor without Adaptive Sync at all.

                  But this much is true for you, of course:

                  Originally posted by theriddick View Post
                  you can believe whatever you like...
                  Last edited by juno; 13 September 2018, 08:38 AM.



                  • #59
                    However you want to paint it, mate, at the end of the day GSYNC monitors DO have lower latency and better frame-smoothing thresholds.

                    I, however, prefer freesync due to affordability and options.



                    • #60
                      No. They. Don't.

