AVX2 Tuning Paying Off Big Time For Dav1d 10b/12b Video Decode
Originally posted by Danny3: Wow, cool and everything, but why the fuck do we still have stone-age framerates?
I don't give a fuck that the CPU can decode at 100-600 FPS when the video is still 24 FPS!
I'm really tired of blurry vision!
The way I see it, the main benefit to highly efficient decoders is being able to use cheap and low-power CPUs.
Originally posted by cl333r: And how do we know which dav1d version Firefox and Chrome use? Does it ship with the browser, or do they try to use the system-wide library installed by the OS user?
Originally posted by cl333r: I thought it was only a problem with movies looking like a soap opera at 60 FPS?
Why would anyone complain about a documentary or a show at 60 FPS (other than bandwidth)?
It varies by content, person, and (especially) the display. A lot of the TVs that do 24-to-60 fps conversion suffer from it: the fake-120 Hz sets. The problem is that their algorithms interpolate extra frames and then downsample to 60, which is a big cause of the soap opera effect. It's a math thing: 24 doesn't divide evenly into 60, so the TV goes up to 120 and back down, hence terms like "fake 120 Hz". That's why it's less of an issue on actual 120 Hz and 240 Hz screens (simply 5x and 10x the frames for 24 fps content), or with 30/60 fps content on those kinds of displays, since the algorithm has far less guesswork to do.
My dad's old TV, for example, made almost everything look like it had that effect. It had a horrible fake-120 Hz mode, but even with as much post-processing as possible disabled and fake 120 Hz off, the picture was still too sharp, and we could never dial in settings that looked good for everything: games and TV shows (30/60 fps) as well as movies (24 fps).
Some people simply aren't susceptible to it. I call those people "happy gaming at 30 fps". Once I notice it, I can't not see it, unfortunately.
There's also some psychology involved that probably only poorer people and people in their 30s and up will notice. We grew up without HD content and without high-FPS displays, so sometimes simply seeing content in its full glory is jarring: faces and random objects have more detail than you'd normally notice in real life. My assumption is that younger and richer people have been around better-quality displays enough to be less susceptible to low-to-high quality transitions.
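The 24-into-60 arithmetic above can be sketched in a few lines. This is an illustrative Python sketch, not anything a real TV runs (the function name and the rounding choice are mine): it counts how many display refreshes each source frame would occupy if the panel simply repeated frames, which is uneven on 60 Hz (the classic 3:2 pulldown judder that interpolation tries to hide) but perfectly even on a true 120 Hz panel.

```python
def pulldown_pattern(content_fps: int, display_hz: int) -> list[int]:
    """Refreshes each source frame occupies over one second of video.

    Real TVs interpolate rather than just repeat frames, but the repeat
    pattern shows why 24 fps is awkward on a 60 Hz display.
    """
    pattern = []
    shown = 0
    for i in range(1, content_fps + 1):
        # Nearest display refresh at which source frame i should end.
        target = int(i * display_hz / content_fps + 0.5)
        pattern.append(target - shown)
        shown = target
    return pattern

# 24 fps on 60 Hz: frames alternate between 3 and 2 refreshes (3:2 pulldown),
# so on-screen frame durations are uneven -- judder, or interpolation to hide it.
print(pulldown_pattern(24, 60)[:6])    # [3, 2, 3, 2, 3, 2]

# 24 fps on a true 120 Hz panel: every frame gets exactly 5 refreshes.
print(set(pulldown_pattern(24, 120)))  # {5}
```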
Originally posted by skeevy420:
TLDR: Normally it's when a video's FPS and the display's refresh rate don't mathematically line up that you get that effect. 60 fps content on a 60 Hz display looks beautiful; if it doesn't, something probably needs tweaking, or there's something psychological at play.
[...]
Last edited by MadeUpName; 17 May 2021, 10:36 PM.
Originally posted by Toggleton: If by "turning on AV1 support in their browsers" you mean on YouTube: YouTube is 8-bit only so far, and that (dav1d 8-bit decoding) should be nearly fully done (not many big improvements left).
Just enable it, watch a YT video that was encoded in AV1, and see what the performance/CPU usage looks like.
You can also set it so you only get AV1 for 480p or lower: https://www.youtube.com/account_playback
There was no change for 8-bit in 0.9, so you can look at the 0.8.2 results: https://openbenchmarking.org/test/pts/dav1d
I don't think H.265 decoding/playback would be noticeably faster than AV1 decoding with dav1d (8-bit).
If you're talking about x265, that's the encoding side. I have no up-to-date numbers comparing x265 against the AV1 encoders, but AV1 encoding has gotten quite a lot faster since AV1 was released.
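For anyone wanting to check the "H.265 vs dav1d" decode claim on their own machine, one rough approach is ffmpeg's `-benchmark` flag with the decoder forced explicitly. This is a sketch, not a rigorous benchmark: it assumes an ffmpeg build with libdav1d and an HEVC decoder enabled (`ffmpeg -decoders` will tell you), and the file names are placeholders for clips of comparable resolution and bitrate.

```shell
# Decode-only timing: output is discarded, ffmpeg prints a "bench:" line
# with user/real time at the end of each run. File names are examples.
ffmpeg -benchmark -threads 0 -c:v libdav1d -i clip_av1.mkv  -f null -
ffmpeg -benchmark -threads 0 -c:v hevc     -i clip_hevc.mkv -f null -
```

Run each a few times and compare the reported times; single runs are noisy, and disk caching favors whichever clip you decode second.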
Originally posted by linuxgeex: Benchmark results on a Haswell-era CPU would have been particularly helpful for the folks who are avoiding turning on AV1 support in their browsers and/or sticking with x265 because "AV1 is too slow".