Radeon FreeSync 2 Rolled Out With HDR & More


  • #11
    Originally posted by pal666 View Post
    last time i checked it required 2.5x, did it change with this announcement?
Indeed, AMD said 2.5x, but in theory 2x is enough, and in practice that's also the case. For example, the Samsung C34F791 has a refresh range of 48 to 100 Hz and it is compatible with LFC.
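To put numbers on that, here's a rough Python sketch of how LFC works in principle. The thresholds and the frame-repeat logic are simplified illustrations, not AMD's actual driver code.

```python
# Toy illustration of Low Framerate Compensation (LFC); not AMD's driver code.

def lfc_supported(min_hz, max_hz, required_ratio=2.0):
    """2x is the theoretical minimum ratio; AMD reportedly wants ~2.5x in practice."""
    return max_hz / min_hz >= required_ratio

def lfc_refresh(fps, min_hz, max_hz):
    """Keep the panel inside its range by repeating each frame when fps drops too low."""
    if fps >= min_hz:
        return min(fps, max_hz), 1          # native variable refresh, no repeats
    repeats = 2
    while fps * repeats < min_hz:           # show each frame N times until back in range
        repeats += 1
    return min(fps * repeats, max_hz), repeats

# Samsung C34F791: 48-100 Hz, a ratio of only ~2.08x
print(lfc_supported(48, 100))        # True under the 2x rule
print(lfc_supported(48, 100, 2.5))   # False under the stricter 2.5x rule
print(lfc_refresh(40, 48, 100))      # (80, 2): 40 fps shown at 80 Hz, each frame twice
```

So a 48-100 Hz panel only clears the 2x bar, which is why it's interesting that it still gets LFC.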



    • #12
      Originally posted by pal666 View Post
For an obvious reason: the software driver will ask the hardware video card to do it, which is much more powerful than the monitor and will not add 30 ms of latency like monitors that need to transform their input.
I didn't know HW-accelerated tone mapping was a feature of Polaris/Vega, thanks for updating me.



      • #13
        Originally posted by pal666 View Post
        last time i checked it required 2.5x, did it change with this announcement?
2x is the theoretical minimum; for reliable detection and/or execution, AMD (and probably Nvidia too) needs a 2.5x span. To my knowledge, this hasn't changed recently.



        • #14
          Originally posted by bug77 View Post
I didn't know HW-accelerated tone mapping was a feature of Polaris/Vega, thanks for updating me.
          I didn't know HW acceleration of tone mapping was something newsworthy for any GPU made in the last decade. http://http.download.nvidia.com/deve...eagues_HDR.pdf



          • #15
            Originally posted by starshipeleven View Post
            I didn't know HW acceleration of tone mapping was something newsworthy for any GPU made in the last decade. http://http.download.nvidia.com/deve...eagues_HDR.pdf
            Is this really the same thing? I thought the "old" HDR was more of a fake HDR, as opposed to what we're getting these days. Then again, tone mapping is tone mapping...
Regardless, setting the video card to render HDR when you have a non-HDR monitor is the wrong thing to do, so I'm really not sure what problem AMD's solution fixes.



            • #16
              Originally posted by bug77 View Post
              Is this really the same thing? I thought the "old" HDR was more of a fake HDR, as opposed to what we're getting these days. Then again, tone mapping is tone mapping...
              Afaik tone mapping didn't change. Rendered on CPU it would have total crap framerate even with good CPUs, so it's always on GPU.
              I didn't check the rest of the PDF as it's irrelevant for my point.

Regardless, setting the video card to render HDR when you have a non-HDR monitor is the wrong thing to do, so I'm really not sure what problem AMD's solution fixes.
As already said by others:
1. It sets a minimum quality level for screens supporting FreeSync 2.
2. FreeSync 2 reads HDR info from the screen itself and makes sure the game renders frames ready for that screen, so the display controller does not have to do any tone mapping on its own. This decreases latency and should increase quality, as the GPU is usually better than the dedicated microcontroller in the screen (see the sketch below).
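A minimal sketch of what point 2 amounts to, assuming the panel reports something like a peak and minimum luminance: the GPU maps the scene straight into that native range so the monitor's own tone-mapping pass (and its latency) can be skipped. The metadata values and the extended-Reinhard curve here are illustrative assumptions, not the actual FreeSync 2 API.

```python
# Sketch of GPU-side tone mapping to a display's native range.
# The metadata values and the curve are illustrative assumptions,
# not the real FreeSync 2 API.

def tone_map_to_display(scene_nits, peak_nits, min_nits, scene_white_nits=4000.0):
    """Extended Reinhard: scene_white_nits maps exactly to the panel peak,
    dark values pass through almost unchanged, brighter ones are clamped."""
    x = scene_nits / peak_nits
    x_white = scene_white_nits / peak_nits
    mapped = x * (1.0 + x / (x_white * x_white)) / (1.0 + x)
    out = mapped * peak_nits
    return max(min_nits, min(out, peak_nits))

# Hypothetical FreeSync 2 panel reporting 600 nits peak and a 0.1 nit black level
peak, black = 600.0, 0.1
for scene in (10.0, 300.0, 1200.0, 10000.0):
    print(f"{scene:7.1f} nits in scene -> {tone_map_to_display(scene, peak, black):6.1f} nits on panel")
```

Since the frame already matches what the panel can show, the scaler in the monitor has nothing left to remap, which is where the latency saving is supposed to come from.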



              • #17
                Originally posted by starshipeleven View Post
                Afaik tone mapping didn't change. Rendered on CPU it would have total crap framerate even with good CPUs, so it's always on GPU.
                I didn't check the rest of the PDF as it's irrelevant for my point.
                Ok, I think we cleared this up.

                Originally posted by starshipeleven View Post
As already said by others:
1. It sets a minimum quality level for screens supporting FreeSync 2.
2. FreeSync 2 reads HDR info from the screen itself and makes sure the game renders frames ready for that screen, so the display controller does not have to do any tone mapping on its own. This decreases latency and should increase quality, as the GPU is usually better than the dedicated microcontroller in the screen.

                http://arstechnica.com/gadgets/2017/...-release-date/
I was strictly talking about this "reduce latency" thing. You'd have to actually configure your system the wrong way (set the GPU to HDR, plug in a non-HDR monitor) for this part of FreeSync 2 to kick in. Or at least that's my impression. I also don't have numbers on the latency introduced by the monitor doing the tone mapping, so I'm not sure what savings we're looking at.



                • #18
                  Originally posted by robin4002 View Post
Not with FreeSync 2:

LFC requires at least a 2x ratio between the lower and the higher refresh rate.

If a monitor can do 60 to 120 Hz, then with LFC, if you run at 40 fps the refresh will be 80 Hz, making the game as smooth as if the screen were running at 40 Hz.
Hasn't G-Sync had this since its inception?



                  • #19
                    Originally posted by bug77 View Post

                    Is this really the same thing? I thought the "old" HDR was more of a fake HDR, as opposed to what we're getting these days. Then again, tone mapping is tone mapping...
Regardless, setting the video card to render HDR when you have a non-HDR monitor is the wrong thing to do, so I'm really not sure what problem AMD's solution fixes.
Display HDR is best for 4K/HDR video playback. For games, HDR is often better when computed directly on the GPU using shaders (simply because, when it's computed by the game engine, you have more knowledge, like the z-buffer for instance).

                    Edit:
Just to be clearer: HDR shaders have existed for more than a decade now. What's new is just HDR displays, which basically do what a shader would do: take high dynamic range images and flatten them to the display's color space.

The difference with a game engine shader is that the game engine can give more information. For instance, if you're inside a house looking through a window, the dynamic range can be huge, but what matters ultimately depends on where you're looking. If you're looking at the window, it means you want to see through it, so you'll select a high "standard range" that lets you see what's behind the window, but the surrounding wall will appear dark. On the other hand, if you're looking at the wall next to the window, you want to see the details of the wall, so you'll select a low "standard range" that lets you see those details, but the window will appear very bright.
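Here's a tiny sketch of that engine-side choice: derive the exposure from the luminance of whatever the player is focused on, then tone-map with it. The gaze regions, the 0.18 key, and the Reinhard curve are illustrative assumptions, not any particular engine's code.

```python
import math

# Toy gaze-dependent auto-exposure; purely illustrative.

def exposure_for_region(region_nits, key=0.18):
    """Set exposure from the geometric mean luminance of the region in focus."""
    log_avg = math.exp(sum(math.log(max(l, 1e-4)) for l in region_nits) / len(region_nits))
    return key / log_avg

def tone_map(nits, exposure):
    """Simple Reinhard curve after applying the chosen exposure."""
    v = nits * exposure
    return v / (1.0 + v)

window = [5000.0, 8000.0, 6500.0]   # bright sky seen through the window
wall   = [15.0, 20.0, 12.0]         # dim interior wall

for region, name in ((window, "looking at the window"), (wall, "looking at the wall")):
    e = exposure_for_region(region)
    print(f"{name}: wall pixel -> {tone_map(15.0, e):.3f}, window pixel -> {tone_map(6500.0, e):.3f}")
```

With the window in focus, the wall comes out near black; with the wall in focus, the window blows out to nearly white, which is exactly the trade-off described above.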
                    Last edited by Creak; 04 January 2017, 11:32 AM.



                    • #20
                      Originally posted by Creak View Post

Display HDR is best for 4K/HDR video playback. For games, HDR is often better when computed directly on the GPU using shaders (simply because, when it's computed by the game engine, you have more knowledge, like the z-buffer for instance).

                      Edit:
Just to be clearer: HDR shaders have existed for more than a decade now. What's new is just HDR displays, which basically do what a shader would do: take high dynamic range images and flatten them to the display's color space.

The difference with a game engine shader is that the game engine can give more information. For instance, if you're inside a house looking through a window, the dynamic range can be huge, but what matters ultimately depends on where you're looking. If you're looking at the window, it means you want to see through it, so you'll select a high "standard range" that lets you see what's behind the window, but the surrounding wall will appear dark. On the other hand, if you're looking at the wall next to the window, you want to see the details of the wall, so you'll select a low "standard range" that lets you see those details, but the window will appear very bright.
Your understanding of what exactly HDR is and how it works is so far off that I don't even know where to begin.

