
dav1d 1.0 AV1 Video Decoder Nears Release With AVX-512 Acceleration


  • #31
    Originally posted by hotaru View Post

    my experience with AV1 on older hardware (which is limited to desktops) has been that it works well enough that you probably wouldn't notice a difference between it and VP9, but software-decoded AV1 uses a lot more electricity than hardware-decoded VP9. the older systems I've used can only handle AV1 decoding "just fine" as long as you don't think about the environmental impact.
    the damage to your electricity bill would be far greater than any environmental impact you would inflict. it would be negligible. in fact the "electricity costs" would be reduced by the sheer amount of internet traffic that ditching VP9 for AV1 would save. your shitty desktop from '05 will work harder, but the thousands of terabytes in network bandwidth and electrical usage savings that result from this would be far greater. not only does your comment come off as trying to gather social points, but it was completely wrong too.

    we aren't talking about some small amount of reduction. older tests showed a 25% bandwidth efficiency gain over VP9, so that could be even higher now.
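A quick back-of-the-envelope sketch of what that 25% figure means at streaming scale. The bitrate and watch-hours below are made-up illustrative numbers, not measurements from the thread:

```python
# What a 25% bitrate reduction means at scale.
# All inputs are illustrative assumptions, not measured figures.

vp9_bitrate_mbps = 5.0      # assumed 1080p VP9 streaming bitrate
av1_savings = 0.25          # efficiency gain claimed in the post
hours_watched = 1_000_000   # assumed watch-hours served per day

def daily_terabytes(bitrate_mbps: float, hours: float) -> float:
    """Convert a sustained bitrate over the given watch-hours to terabytes."""
    megabits = bitrate_mbps * 3600 * hours
    return megabits / 8 / 1_000_000   # Mb -> MB -> TB

vp9_tb = daily_terabytes(vp9_bitrate_mbps, hours_watched)
av1_tb = daily_terabytes(vp9_bitrate_mbps * (1 - av1_savings), hours_watched)
print(f"VP9: {vp9_tb:,.0f} TB/day, AV1: {av1_tb:,.0f} TB/day, "
      f"saved: {vp9_tb - av1_tb:,.0f} TB/day")
```

Even at these modest assumed numbers the saved traffic runs to hundreds of terabytes per day; real streaming platforms serve orders of magnitude more.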

    Comment


    • #32
      Originally posted by Quackdoc View Post
      the damage to your electricity bill would be far greater than any environmental impact you would inflict.
      I don't care about the insignificant "damage" to my electricity bill or the insignificant environmental impact of a single person's use of AV1. what matters is the environmental impact of millions of people using AV1 instead of VP9 on older hardware.

      your shitty desktop from '05 will work harder, but the thousands of terabytes in network bandwidth and electrical usage savings that result from this would be far greater.
      again, the problem is that there are millions of such shitty desktops, not just one. burning 400% more electricity for a 30% decrease in network bandwidth is not good.

      Comment


      • #33
        Originally posted by hotaru View Post
        again, the problem is that there are millions of such shitty desktops, not just one. burning 400% more electricity for a 30% decrease in network bandwidth is not good.
        then you don't understand just how much can be saved when migrating to AV1.

        here are some of the environmental savings off the top of my head.

        1. Storage servers. 25% less storage usage. not only does this mean fewer drives consuming electricity, it means that the computers that need to manage the thousands of gigabytes a second of routing get a 25% lighter load.

        both storage drives and computers produce less heat and take up less space. the manufacturing of storage itself is a significant savings to the environment. when you factor in the redundancy that many companies have, cutting down on storage usage would be a massive boon to the ecosystem. then of course the electricity usage is greatly cut down too, since you have to cool the massive data centers.

        2. network servers that route the traffic wind up using less electricity too. that's a massive load reduction considering how much of the world's traffic right now is spent on streaming platforms. these benefit from the same gain in cooling.

        all of these dwarf the detriments of "millions of devices" burning extra electricity.
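The storage argument in point 1 can be sketched numerically. The library size and replication factor below are purely illustrative assumptions, not figures from any real platform:

```python
# Sketch of the storage argument: a 25% smaller encode, multiplied
# through redundant copies. Library size and replication factor are
# illustrative assumptions.

library_pb = 100.0   # assumed video library size in petabytes (VP9)
replication = 3      # assumed copies kept for redundancy
av1_savings = 0.25   # efficiency gain claimed earlier in the thread

raw_saved = library_pb * av1_savings    # savings on a single copy
total_saved = raw_saved * replication   # savings across all replicas
print(f"{total_saved:.0f} PB less storage to buy, power, and cool")
```

The point of the multiplication is that every petabyte shaved off the master encode is shaved off every replica too, so redundancy amplifies rather than dilutes the savings.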

        EDIT: as an addendum, for me on my celeron n3050, AV1 actually performs better than VP9 software decode. higher FPS, and when locked to the same FPS, less CPU time. so it depends on the device anyways.

        EDIT2: I should state that this is with an optimized build of dav1d, I dunno how they fare on equal terms on my device.

        EDIT3: wiped stuff, downloaded ffmpeg from the arch repo, they perform very similarly actually.
        Last edited by Quackdoc; 03 March 2022, 04:31 PM.

        Comment


        • #34
          Originally posted by Quackdoc View Post
          EDIT: as an addendum, for me on my celeron n3050, AV1 actually performs better than VP9 software decode. higher FPS, and when locked to the same FPS, less CPU time. so it depends on the device anyways.
          so how much power does your n3050 use to decode 720p AV1 at 30 fps?

          how much power does it use to decode H.264 using the n3050's hardware decoder at the same resolution and framerate?

          that difference will easily dwarf the insignificant amount of power that it takes to send the video over a network.

          Intel has had hardware VP9 decoding since Skylake, and AMD since Raven Ridge, so the advantage of using the available decoding hardware will be even bigger on newer hardware.

          Comment


          • #35
            Originally posted by hotaru View Post

            so how much power does your n3050 use to decode 720p AV1 at 30 fps?

            how much power does it use to decode H.264 using the n3050's hardware decoder at the same resolution and framerate?

            that difference will easily dwarf the insignificant amount of power that it takes to send the video over a network.

            Intel has had hardware VP9 decoding since Skylake, and AMD since Raven Ridge, so the advantage of using the available decoding hardware will be even bigger on newer hardware.
            I go from about 6.8 W with VAAPI AVC to 7.1-7.2 W with 1080p dav1d. for reference, I draw about 8-9 W when doing multiple decodes at the same time to max it out.

            Comment


            • #36
              Originally posted by Quackdoc View Post

              I go from about 6.8 W with VAAPI AVC to 7.1-7.2 W with 1080p dav1d. for reference, I draw about 8-9 W when doing multiple decodes at the same time to max it out.
              and how much power does it take to transfer that video over a network, compared to idle? probably at least an order of magnitude less than the difference between software and hardware decoding.
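The disagreement here hinges on one number: how much energy the network actually spends per gigabyte, which neither side has measured. A sketch, using the roughly 0.3-0.4 W decode delta reported above and assumed values for everything else, shows how the answer flips depending on that one figure:

```python
# The dispute comes down to one parameter: network energy per GB.
# Everything here is an illustrative assumption except the ~0.3-0.4 W
# software-vs-hardware decode delta reported in the thread.

decode_delta_w = 0.35   # dav1d software decode minus VAAPI hw decode (from thread)
bitrate_mbps = 5.0      # assumed stream bitrate
av1_savings = 0.25      # assumed bandwidth saved by switching to AV1

gb_per_hour = bitrate_mbps * 3600 / 8 / 1000   # GB transferred per hour of video
gb_saved = gb_per_hour * av1_savings           # GB not sent, per hour

for net_wh_per_gb in (0.1, 1.0, 10.0):         # assumed network energy figures
    net_saved_wh = gb_saved * net_wh_per_gb    # Wh saved on the network side per hour
    extra_decode_wh = decode_delta_w           # Wh burned by software decode per hour
    winner = ("network savings win" if net_saved_wh > extra_decode_wh
              else "decode cost wins")
    print(f"{net_wh_per_gb:>5} Wh/GB: net {net_saved_wh:.2f} Wh "
          f"vs decode {extra_decode_wh:.2f} Wh -> {winner}")
```

Under these assumptions the crossover sits below 1 Wh/GB: if the network's marginal cost per gigabyte is tiny, the extra decode power dominates; if it is large, the bandwidth savings dominate. Which side of that line reality falls on is exactly what the two posters are arguing about.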

              Comment


              • #37
                Originally posted by hotaru View Post

                and how much power does it take to transfer that video over a network, compared to idle? probably at least an order of magnitude less than the difference between software and hardware decoding.
                in isolation? sure, but when you are transferring terabytes of data, it adds up, and that's just in isolation. when you add other factors such as CDNs, the electricity usage skyrockets.

                Comment
