
Thread: AMD Shows Off An External FreeSync Monitor In Action

  1. #11
    Join Date
    Apr 2013
    Posts
    24

    Default

    Unlike NVIDIA's G-Sync, no specialized hardware is required to support FreeSync. This week at Computex, AMD was showing off the first FreeSync capable monitor
    Really dude? That's a contradiction right there: you need no specialized hardware except *cough* a FreeSync-capable monitor with DisplayPort 1.2a or newer, plus a card with the same specs (and I assume an AMD card, unless Nvidia decides to support it too)

    Standalone monitors, as they ship today, typically do not support this technology, meaning you DO need specialized hardware. You just don't need that expensive G-Sync chip. I hope Nvidia responds to this by supporting FreeSync as well.

    But judging by the differences, G-Sync is a much more complicated implementation of this, so I'm sure it has benefits that FreeSync does not (not that I care; I prefer FreeSync because it costs me hardly anything extra to get it)
    Last edited by rabcor; 06-06-2014 at 02:16 PM.

  2. #12
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,877

    Default

    Quote Originally Posted by rabcor View Post
    Really dude? That's a contradiction right there: you need no specialized hardware except *cough* a FreeSync-capable monitor with DisplayPort 1.2a or newer, plus a card with the same specs (and I assume an AMD card, unless Nvidia decides to support it too)
    "No specialized hardware" means "no hardware whose sole purpose is to make this work." DisplayPort 1.2a-capable monitors and cards would have found their way into customers' hands eventually.

  3. #13
    Join Date
    Jan 2013
    Posts
    43

    Default

    Quote Originally Posted by Calinou View Post
    Will it work in windowed/“fake fullscreen” mode in games? I'm sadly not sure at all.

    Do games or layers like SDL need specific code to support this?
    I think this will be implemented so that applications don't even have to know it's there. It's basically just adaptive vsync, except that the PC/GPU triggers the start of the monitor's redraw.

    Quote Originally Posted by Ericg View Post
    "No specialized hardware" means "no hardware whose sole purpose is to make this work." DisplayPort 1.2a-capable monitors and cards would have found their way into customers' hands eventually.
    I read somewhere that AMD will support it on all GCN 1.0 and higher GPUs. On the monitor side you need new hardware of course.
    I wonder if this will help with Oculus, too.

  4. #14

    Default

    40-60Hz?

    I think G-sync was meant to go with 120-144Hz monitors, which help with mouse lag.

  5. #15
    Join Date
    Jul 2013
    Posts
    374

    Default

    Quote Originally Posted by A Laggy Grunt View Post
    40-60Hz?

    I think G-sync was meant to go with 120-144Hz monitors, which help with mouse lag.
    In the video, the guy mentions that the spec allows for variance from 2-240 Hz, and the monitor picks a slice out of that and reports it to the GPU as what it supports (for example, 40-60 Hz). The monitor in the demo is an already-released monitor with a special firmware upgrade, so that 20 Hz window is probably the best it could do.
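    To put that negotiation in numbers, here is a minimal Python sketch. Only the 2-240 Hz spec range and the 40-60 Hz demo window come from the video; the function and its names are illustrative, not a real API.

```python
# Hypothetical sketch of the range negotiation described above. The
# spec allows a wide range, but each panel reports only the slice it
# supports, and the GPU clamps its frame pacing to that window.

SPEC_RANGE_HZ = (2, 240)      # what the DisplayPort adaptive-sync spec allows
MONITOR_RANGE_HZ = (40, 60)   # the slice the demo panel reports to the GPU

def clamp_rate(desired_hz, supported=MONITOR_RANGE_HZ):
    """Clamp the GPU's desired refresh rate to what the panel supports."""
    lo, hi = supported
    return max(lo, min(desired_hz, hi))

print(clamp_rate(48))   # 48 -> inside the window, used as-is
print(clamp_rate(30))   # 40 -> below the panel's floor, clamped up
print(clamp_rate(144))  # 60 -> above the panel's ceiling, clamped down
```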

  6. #16
    Join Date
    Feb 2013
    Posts
    59

    Arrow Feature for power nazis..

    Correct me if I'm wrong.. but the ONLY advantage this adaptive-synchronization thing offers (compared to normal vertical sync) is that it can reduce your monitor's power usage a little by dynamically lowering the frame rate at times when you are viewing low-frame-rate stuff.. Right?..

    The last time this was posted on Phoronix, a lot of people were cheering about finally having tear-free games.. but that's stupid, because v-sync has already done that for eons..

    I guess some people just don't understand what they are talking about sometimes.. Oh well.. At least I do..

  7. #17
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,877

    Default

    Quote Originally Posted by Baconmon View Post
    Correct me if I'm wrong.. but the ONLY advantage this adaptive-synchronization thing offers (compared to normal vertical sync) is that it can reduce your monitor's power usage a little by dynamically lowering the frame rate at times when you are viewing low-frame-rate stuff.. Right?..

    The last time this was posted on Phoronix, a lot of people were cheering about finally having tear-free games.. but that's stupid, because v-sync has already done that for eons..

    I guess some people just don't understand what they are talking about sometimes.. Oh well.. At least I do..
    Lol... Troll is ALMOST successful. Next time, tone down the elitism. For the record, for everyone else: this is not a power-usage-only feature. This IS about fixing tearing and stuttering.

  8. #18
    Join Date
    Apr 2014
    Posts
    110

    Default

    Quote Originally Posted by Kivada View Post
    For this to work you will need both a monitor and GPU that support DisplayPort 1.2a, this is the version that makes it part of the core VESA standard. Turns out it has been in the VESA spec for years, but was an optional extension that nobody was using.
    Might be worth calling around to find out which companies have this extension and offer a firmware update. Might save a bit of cash.

  9. #19

    Default

    Quote Originally Posted by Baconmon View Post
    Correct me if I'm wrong.. but the ONLY advantage this adaptive-synchronization thing offers (compared to normal vertical sync) is that it can reduce your monitor's power usage a little by dynamically lowering the frame rate at times when you are viewing low-frame-rate stuff.. Right?..

    The last time this was posted on Phoronix, a lot of people were cheering about finally having tear-free games.. but that's stupid, because v-sync has already done that for eons..

    I guess some people just don't understand what they are talking about sometimes.. Oh well.. At least I do..
    V-sync introduces GUI latency unless you've got a monster video card, because if a frame is a few milliseconds too late, it has to be either discarded or delayed to the next refresh interval.

    Adaptive sync lets the video card say: "This time, we're doing 59 Hz rather than 60 Hz so I can still immediately display this frame I just finished."
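    That 59 Hz example can be put in numbers. A hypothetical Python sketch follows; the 60 Hz and 59 Hz figures are from the post above, and the rest is illustrative.

```python
import math

# Hypothetical numbers for the "59 Hz instead of 60 Hz" scenario: a
# frame finishes 1/59 s after the last scan-out, i.e. just past the
# 60 Hz deadline.
REFRESH = 1 / 60          # fixed 60 Hz refresh interval
frame_ready = 1 / 59      # this frame took slightly too long

# Fixed v-sync: the frame must wait for the next whole refresh tick.
vsync_shown = math.ceil(frame_ready / REFRESH) * REFRESH

# Adaptive sync: the GPU stretches this one interval to 1/59 s and
# scans out immediately (1/59 s is within a 40-60 Hz panel's range).
adaptive_shown = frame_ready

print(f"v-sync:   {vsync_shown * 1000:.2f} ms")     # 33.33 ms
print(f"adaptive: {adaptive_shown * 1000:.2f} ms")  # 16.95 ms
```

    Missing the deadline by less than a third of a millisecond costs a full extra refresh interval under fixed v-sync; adaptive sync makes that cliff disappear.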

  10. #20
    Join Date
    Sep 2008
    Posts
    121

    Default

    Quote Originally Posted by Baconmon View Post
    Correct me if I'm wrong.. but the ONLY advantage this adaptive-synchronization thing offers (compared to normal vertical sync) is that it can reduce your monitor's power usage a little by dynamically lowering the frame rate at times when you are viewing low-frame-rate stuff.. Right?..

    The last time this was posted on Phoronix, a lot of people were cheering about finally having tear-free games.. but that's stupid, because v-sync has already done that for eons..

    I guess some people just don't understand what they are talking about sometimes.. Oh well.. At least I do..

    It is meant to fix stuttering.

    If a new frame isn't made available to the monitor in time for the next refresh, the monitor keeps the old image on screen for two refresh intervals before eventually jumping forward.
    This technology instead tells the monitor to wait slightly more than 1/60 of a second because the new frame isn't rendered yet.
    This wouldn't be much of a problem if you had a really high refresh rate or a really fast graphics card.
    This is just common sense, and everyone is surprised it wasn't already like this.
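    The "held for two refreshes" behavior can be sketched numerically. A hypothetical Python example follows; the frame times are made up for illustration.

```python
import math

REFRESH = 1 / 60  # fixed 60 Hz refresh interval

def on_screen_refreshes(render_times, refresh=REFRESH):
    """Under fixed v-sync, how many whole refresh ticks each frame
    stays on screen: a frame that misses its deadline is held for two."""
    return [max(1, math.ceil(t / refresh)) for t in render_times]

# Made-up frame times: the third frame takes 1/50 s and misses 1/60 s.
times = [1 / 60, 1 / 60, 1 / 50, 1 / 60]
print(on_screen_refreshes(times))  # [1, 1, 2, 1] -> a visible hitch
# With adaptive sync the panel would simply wait 1/50 s for that frame,
# so every frame is shown exactly once and the hitch disappears.
```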

    https://www.youtube.com/watch?v=3PJjhBUSuHk

    Power savings are probably extremely minor.
    Last edited by Micket; 06-07-2014 at 07:01 AM.
