Radeon FreeSync 2 Rolled Out With HDR & More
Originally posted by pal666:
For an obvious reason: the software drivers will ask the video card to do it, and the GPU is much more powerful than the monitor and will not add ~30 ms of latency the way monitors that have to transform their input on their own do.
Originally posted by bug77:
I didn't know HW-accelerated tone mapping was a feature of Polaris/Vega, thanks for updating me.
Originally posted by starshipeleven:
I didn't know HW acceleration of tone mapping was something newsworthy for any GPU made in the last decade. http://http.download.nvidia.com/deve...eagues_HDR.pdf
Regardless, setting the video card to render HDR when you have a non-HDR monitor is the wrong thing to do, so I'm really not sure what problem AMD's solution fixes.
Originally posted by bug77:
Is this really the same thing? I thought the "old" HDR was more of a fake HDR, as opposed to what we're getting these days. Then again, tone mapping is tone mapping...
I didn't check the rest of the PDF as it's irrelevant for my point.
Regardless, setting the video card to render HDR when you have a non-HDR monitor is the wrong thing to do, so I'm really not sure what problem AMD's solution fixes.
1. It sets a minimum quality level for screens supporting FreeSync 2.
2. FreeSync 2 reads the HDR info from the screen itself and makes sure the game renders frames ready for that screen (sketched below), so the display controller does not have to do any tone mapping on its own. This decreases latency and should increase quality, since the GPU is usually better at this than the dedicated microcontroller in the screen.
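As a rough illustration of point 2, a minimal sketch in plain C — not the actual FreeSync 2 / AGS API; the struct, function name, and luminance figures are just assumed stand-ins for the HDR metadata the display would report:

#include <stdio.h>

/* Hypothetical capabilities as reported by the monitor's HDR metadata. */
struct display_caps {
    float min_nits;   /* e.g. 0.1 nit  */
    float max_nits;   /* e.g. 600 nits */
};

/* Tone-map a scene-referred luminance (in nits) into the display's native
   range on the GPU side, so the monitor's own controller has nothing left
   to remap afterwards. */
static float tonemap_to_display(float scene_nits, const struct display_caps *d)
{
    float l = scene_nits / 100.0f;   /* scene relative to a 100-nit reference white */
    float c = l / (1.0f + l);        /* simple Reinhard-style compression into 0..1 */
    return d->min_nits + c * (d->max_nits - d->min_nits);
}

int main(void)
{
    struct display_caps caps = { 0.1f, 600.0f };
    printf("1000 nits in the scene -> %.1f nits on this panel\n",
           tonemap_to_display(1000.0f, &caps));
    return 0;
}

On a plain SDR/HDR10 path that last remap would instead happen in the monitor's scaler, which is the step the posts above blame for the extra latency.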
Originally posted by starshipeleven:
AFAIK tone mapping didn't change. Rendered on the CPU it would have a total crap framerate even with good CPUs, so it's always done on the GPU.
Originally posted by starshipeleven:
already said by others:
http://arstechnica.com/gadgets/2017/...-release-date/
Originally posted by robin4002:
Not with FreeSync 2:
LFC requires at least a 2x ratio between the lowest and highest refresh rates.
If a monitor can do 60 to 120 Hz, then with LFC, running at 40 fps makes the panel refresh at 80 Hz (each frame is shown twice), so the game is as smooth as if the screen were running at 40 Hz.
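To make the arithmetic concrete, a small sketch in C — not driver code, just the smallest frame-repetition factor that lands the effective refresh rate inside the panel's variable-refresh window:

#include <stdio.h>

/* Pick how many times each frame must be repeated so that the effective
   refresh rate stays within [min_hz, max_hz]. */
static int lfc_multiplier(double fps, double min_hz, double max_hz)
{
    int m = 1;
    /* This search is only guaranteed to succeed when max_hz >= 2 * min_hz,
       which is why LFC requires that ratio. */
    while (fps * m < min_hz && fps * (m + 1) <= max_hz)
        m++;
    return m;
}

int main(void)
{
    double fps = 40.0, min_hz = 60.0, max_hz = 120.0;
    int m = lfc_multiplier(fps, min_hz, max_hz);
    printf("%.0f fps -> each frame shown %dx, panel runs at %.0f Hz\n",
           fps, m, fps * m);   /* 40 fps -> each frame shown 2x, panel runs at 80 Hz */
    return 0;
}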
Originally posted by bug77:
Is this really the same thing? I thought the "old" HDR was more of a fake HDR, as opposed to what we're getting these days. Then again, tone mapping is tone mapping...
Regardless, setting the video card to render HDR when you have a non-HDR monitor is the wrong thing to do, so I'm really not sure what problem AMD's solution fixes.
Edit:
Just to be clearer, HDR shaders have existed for more than a decade now. What's new is just HDR displays, which basically do what a shader would do: take high-dynamic-range images and flatten them to the display's color space.
The difference with a game engine shader is that the game engine can give more information. For instance, if you're inside a house looking through a window, the dynamic range can be huge, but the right mapping ultimately depends on where you're looking. If you're looking at the window, it means you want to see through it, so you'll select a high "standard range" that lets you see what's behind the window, but the wall around it will appear dark. On the other hand, if you're looking at the wall next to the window, you want to see the details of the wall, so you'll select a low "standard range" that lets you see those details, but the window will appear very bright.
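A minimal sketch of the idea Creak describes, in C with made-up luminance values: the engine keys the exposure to whatever the player is looking at, then compresses everything else around that choice.

#include <stdio.h>

/* Scene-referred luminances in arbitrary linear units (made-up values). */
#define LUM_WINDOW  5000.0f   /* bright daylight seen through the window */
#define LUM_WALL       2.0f   /* dim interior wall next to the window    */

/* Key the exposure to the luminance at the point being looked at,
   so that point ends up around mid-grey. */
static float exposure_for(float looked_at_lum)
{
    return 0.18f / looked_at_lum;
}

/* Reinhard operator: exposed values are compressed into 0..1 for the display. */
static float tonemap(float lum, float exposure)
{
    float v = lum * exposure;
    return v / (1.0f + v);
}

int main(void)
{
    float at_window = exposure_for(LUM_WINDOW);
    float at_wall   = exposure_for(LUM_WALL);

    /* Looking at the window: the outside stays readable, the wall goes near black. */
    printf("looking at window: window=%.3f wall=%.5f\n",
           tonemap(LUM_WINDOW, at_window), tonemap(LUM_WALL, at_window));

    /* Looking at the wall: the wall's detail comes up, the window saturates toward white. */
    printf("looking at wall:   window=%.3f wall=%.3f\n",
           tonemap(LUM_WINDOW, at_wall), tonemap(LUM_WALL, at_wall));
    return 0;
}

Only the engine knows where the camera is pointed and what is actually in the scene (the z-buffer Creak mentions below), which is why the engine-side shader can make this call better than the display's own controller.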
Originally posted by Creak:
Display HDR is best for 4K/HDR video playback. For games, HDR is often better when computed directly on the GPU using shaders (simply because when it's computed by the game engine, there is more information available, like the z-buffer for instance).