AVX2 Tuning Paying Off Big Time For Dav1d 10b/12b Video Decode


  • #11
    And how do we know which dav1d version Firefox and Chrome use? Does it ship with the browser, or does it try to use the system-wide library installed by the user or the OS?



    • #12
      Originally posted by Danny3 View Post
      Wow, cool and everything, but why the fuck we still have stone age framerate ?
      I don't give a fuck that the CPU can decode at 100-600 FPS when the video is still 24 FPS !
      I'm really tired of blurry vision !
      Although I'd rather videos be at 60FPS, I'm not too upset that it isn't more common. At 4K you need a lot of bandwidth to stream it, and for anyone who isn't streaming, it takes up a lot of disk space (especially if it's lossless).

      The way I see it, the main benefit to highly efficient decoders is being able to use cheap and low-power CPUs.



      • #13
        Originally posted by cl333r View Post
        And how do we know which Dav1d version FF and Chrome use? Does it ship with the browser or it tries to use the system wide library installed by the OS user?
        In the case of FF, it looks like dav1d is vendored and updated from time to time directly in the third_party subdirectory:

        https://hg.mozilla.org/mozilla-centr...rty/dav1d/NEWS



        • #14
          Originally posted by cl333r View Post
          I thought it's only a problem with movies looking like soap opera at 60 FPS?
          Why would anyone complain about a documentary or show at 60 FPS (other than bandwidth)?
          TLDR: Normally it's when a video's FPS and the display's refresh rate don't line up mathematically that you get that effect. 60fps content on a 60Hz display looks beautiful. If it doesn't, something probably needs tweaking, or there's something psychological at play.

          It varies by content, person, and (especially) the display. A lot of TVs that do 24-to-60fps conversion suffer from it; so-called fake 120Hz TVs. The problem is that their algorithms interpolate extra frames and then downsample to 60, which is a big cause of the soap opera effect. It's a math thing: 24 doesn't divide evenly into 60, so the TV goes up to 120 and back down, hence terms like "fake 120Hz". That's why it's less of an issue on actual 120Hz and 240Hz screens (simply 5x and 10x more frames with 24fps content), or with 30/60fps content on those displays, since there's far less guesswork for the algorithm.

          My Dad's old TV, for example, made almost everything look like it had that effect. It had a horrible fake-120Hz mode, but even with as much post-processing as possible disabled and fake 120Hz off, that TV still gave too sharp a picture, and we could never dial in settings that looked good for everything: games and TV shows (30/60fps) as well as movies (24fps).

          Some people simply aren't susceptible to it. I call those people "happy gaming at 30 fps". Once I notice it I can't not see it, unfortunately.

          There's also some psychology involved that probably only poorer people and people in their 30s and up will notice. We grew up without HD content, without high-FPS displays, etc. Sometimes simply seeing content in its full glory is jarring: faces and random objects have more detail than you'd normally notice in real life. My assumption is that younger and richer people are around better-quality displays enough to be less susceptible to low-to-high-quality transitions.
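The cadence math in the post above can be sketched in a few lines (a hypothetical illustration, not something from the thread): snapping each source frame's presentation time to the display's refresh grid shows why 24fps judders on a 60Hz panel but divides evenly into 120Hz.

```python
def pulldown_pattern(fps, hz, frames=8):
    """How many display refreshes each source frame occupies when
    frame presentation times are snapped to the refresh grid."""
    counts = []
    for i in range(frames):
        start = int(i * hz / fps)        # first refresh showing frame i
        end = int((i + 1) * hz / fps)    # first refresh showing frame i+1
        counts.append(end - start)
    return counts

# 24fps on 60Hz: uneven 2:3 cadence, the judder those TVs try to hide
print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3, 2, 3]
# 24fps on 120Hz: every frame held exactly 5 refreshes, perfectly even
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5]
```

The uneven 2/3 alternation is exactly the "24 doesn't go into 60" problem; motion-interpolating TVs exist to paper over it.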



          • #15
            Originally posted by skeevy420 View Post

            TLDR: Normally it's when a video's FPS and the display's refresh rate don't line up mathematically that you get that effect. [...]
            For sports and nature shows, yes, higher frame rates can look better. But there is something psychological about 24fps for drama and a number of other formats, and it has nothing to do with the "poor" people you're pissing on, or with age. It has to do with the fact that when the image gets too realistic, it kills the ability to suspend disbelief. We spend a lot of money and effort making videos less realistic, through reduced depth of field, non-natural lighting, filters, and haze, for that very reason. What you are missing is that the 120fps you so love is mostly 24fps with each frame shown 5 times.
            Last edited by MadeUpName; 17 May 2021, 10:36 PM.



            • #16
              ~/.config/mpv/mpv.conf

              # sync video timing to the display's refresh rate, resampling audio to match
              video-sync=display-resample
              # enable temporal interpolation (smooth motion across refreshes)
              interpolation=yes
              # temporal scaler: box filter with a kaiser window, clamping disabled
              tscale=box
              tscale-window=kaiser
              tscale-clamp=0.0
              Problem solved!
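For a quick test before committing those settings to mpv.conf, the same options can be passed on the command line (the file name here is hypothetical):

```shell
# one-off test of the same interpolation settings
mpv --video-sync=display-resample --interpolation=yes \
    --tscale=box --tscale-window=kaiser --tscale-clamp=0.0 video.mkv
```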



              • #17
                Originally posted by Toggleton View Post
                If by "turning on AV1 support in their browsers" you mean on YouTube: YouTube is so far 8-bit only, and that (dav1d decoding) should be nearly fully done (not many big improvements left).
                Just enable it, watch a YT video that got encoded in AV1, and look at how the performance/CPU usage is.
                And you can set it so you only get it for 480p or lower: https://www.youtube.com/account_playback
                There was no change for 8-bit in 0.9, so you can look at the results of 0.8.2: https://openbenchmarking.org/test/pts/dav1d

                I don't think decoding/playback of h265 would be noticeably faster than AV1 decoding with dav1d (8-bit).

                If you talk about x265, you are talking about the encoding side. I have no up-to-date numbers comparing x265 vs the AV1 encoders, but the AV1 encoders have gotten quite a lot faster since AV1 was released.
                Software decoding CPU use is pretty high compared to hardware-accelerated video decode, no matter which way you cut it.
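A rough way to measure the pure software-decode cost being discussed (assuming your ffmpeg was built with libdav1d, and `input.mkv` is a hypothetical AV1 file):

```shell
# decode-only benchmark: discard output, print utime/maxrss when done
ffmpeg -hide_banner -benchmark -c:v libdav1d -i input.mkv -f null -
```

Comparing the reported user time against the clip's duration gives a feel for how much CPU headroom software AV1 decode leaves versus a hardware decoder.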



                • #18
                  Originally posted by linuxgeex View Post
                  Benchmark results on a Haswell-era CPU machine would have been particularly helpful for the folks who are avoiding turning on AV1 support in their browsers and/or sticking with x265 because "AV1 is too slow".
                  x265 is an encoder; the subject here is a decoder. if you meant h265, it's abandoned; youtube users could be sticking to h264 instead



                  • #19
                    Originally posted by skeevy420 View Post
                    Normally it is when a video's FPS and display's frame rate don't mathematically line up you get that effect.
                    freesync is a solution to this problem



                    • #20
                      Originally posted by pal666 View Post
                      freesync is a solution to this problem
                      Yep. All it has to do is double the number of frames shown and it's good to go. No down-conversion to cause crappy interpolation.
