Intel Arc Graphics Running On Fully Open-Source Linux Driver


  • #71
    Originally posted by arQon View Post
    Because roughly 0% of devices can play AV1 efficiently, and an unknown but significantly non-zero number of devices can't play it at all (at an acceptable framerate / resolution) and never will. Just because you and I can play it on our desktops, or I can play it on the HTPC at a mere 20x the power draw of h265, doesn't change that.
    > dav1d is efficient enough that most android boxes and TVs can handle at least 1080p30 if not 4k30.
    So... how useful is that if your content is *in* 4k30, or 4k60, which iPhones already record at. If you have a Pixel 10 or whatever that's capable of recording in AV1, are you taking videos of your kids in peasant 1080p, or glorious 4k? Does your wife's phone support AV1 playback? At all? How about 4k60 AV1? How about your parents' phones? And so on.
    If all you care about is torrents, and you only watch those on a machine that *can* handle AV1 playback, great. Likewise if you're ripping all your BluRays for personal use, and you only watch them on your home theater setup: go for it, and enjoy. AV1 is the "right" format for those scenarios, but there are, you know, *other* scenarios, and other people, and those people not only vastly outnumber you, they also use their video in a different way.
    > its efficient enough that I would only consider AV1 a no go for low end battery operated devices
    Good for you. Again though, this is redefining words, only now with your explicit selection bias added to it. 20x is more likely an under-estimate than not, but let's be *really* generous and call it 10x instead - even if that's still somehow "efficient enough" in *your* mind, it's certainly going to be a minority opinion for anyone whose device dies 25 minutes into a movie, when they could have watched all LOTR twice on the same device in h265 with HW decode.
    *That's* why it's a *in general* bad choice, and why it will remain one until AV1 has mainstream HW adoption.
    I know you follow this stuff, so you can probably answer this off the top of your head and will have a link if not: how's that coming along? What percentage of phones / tablets / etc have ANY support for AV1 at all? How many have it in HW? What's the matrix for 720p/1080p/4k and 30/60. etc etc. What will those numbers look like next year? (You can cheat and assume Pixel(N+1) sells the same as Pixel(N) etc).
    In the meantime, h265 is on *everything*, with HW support. What's the energy delta between 4k60HEVC (which we know the ?7w? Pi can play) and 4k60AV1 on it - same movie, same rez, etc.
    Different people have the luxury of being able to make different choices, but those questions are the ones you need to answer to be able to say that *your* choice is no longer inferior. One day it will be, but today is not that day.
    I saw this "glorious 4K" stuff just yesterday: people with their Google Pixel 6 shot 4K video at a theme park and discovered that it eats so much data that their mobile bandwidth collapsed and uploading took forever.
    And the people receiving the video had the same bandwidth problem: it took forever to download.

    AV1 has a technological answer to all of these problems, and it is a server-side/back-end one: if you make an 8K video in AV1, the server can use that one file to send 8K to one customer, 5K to another, 4K to a third, and likewise 2K, 1K, 0.5K or even 480p, all from the same file...

    People receiving the video can start watching almost instantly without downloading the 8K data; instead they get whatever their bandwidth can handle, maybe only 480p or 2K...

    H.264/H.265 and so on are a failure from this perspective: if you have a 4K file you can only send 4K, or you have to re-encode it at a smaller resolution, which gives you a new file you can then send... AV1 does this without re-encoding to a lower resolution.
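The one-file, many-resolutions idea above can be sketched roughly like this (a hypothetical illustration of a layered/scalable bitstream; the layer labels and bitrates are made up for illustration, not real AV1 parameters):

```python
# Sketch: with a scalable (layered) bitstream, the server picks how many
# layers to forward per client instead of re-encoding one file per
# resolution. Labels and bitrates below are invented for illustration.

LAYERS = [  # (label, cumulative kbps needed to receive up to this layer)
    ("480p", 1_000),
    ("1080p", 4_000),
    ("4K", 16_000),
    ("8K", 50_000),
]

def pick_resolution(client_kbps: int) -> str:
    """Return the highest layer whose cumulative bitrate fits the client."""
    best = LAYERS[0][0]  # every client at least gets the base layer
    for label, kbps in LAYERS:
        if kbps <= client_kbps:
            best = label
    return best

print(pick_resolution(5_000))   # a 5 Mbps client gets "1080p"
print(pick_resolution(60_000))  # a 60 Mbps client gets "8K"
```

The point of the sketch is only the selection logic: one source file, per-client output, no re-encode.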

    "Because roughly 0% of devices can play AV1 efficiently"

    If you have a smart TV this does not matter at all, as long as the ARM SoC is fast enough to decode it in real time.

    Efficiency only matters on battery-powered devices like smartphones. But on those devices, low-resolution 480p AV1 mostly gives you all the quality you need for such small screens; a higher resolution would not give you better visual quality because the display is so small.

    And the devices that need efficiency because of battery life are mostly the same devices that have the bandwidth problem...

    Just think of this scenario: someone makes a video and wants to stream it in real time to many people...

    With H.264/H.265 that is not real-time streaming: instead you record the video, then you upload it, and other people download it...

    But if you stream in real time, AV1 is superior. Why? One viewer has a bandwidth problem and automatically gets a lower-resolution stream from exactly the same source file; another viewer has plenty of bandwidth and gets a higher-resolution version out of the same file. And if you use a P2P architecture instead of client-server and your bandwidth hits a limit, AV1 is your life saver, because everyone then automatically gets a lower resolution out of the same source and the same file.

    H.264/H.265 were made in a way that lets you stream your video... but AV1 was made for streaming in the first place...

    With H.264/H.265, if you want to stream 8K, 5K, 4K, 2K, 1K and 480p, you need six files and six encodes.
    With AV1 you only need the 8K file... which also saves a lot of space on the server.

    Companies like Google or Apple could stop paying patent fees for their smartphones, and if someone needs H.264, they can encode it on the server side.

    Phantom circuit Sequence Reducer Dyslexia



    • #72
      Originally posted by arQon View Post
      Except still only for the imaginary world where power consumption doesn't matter, and people deliberately downgrade footage so that they can play it back.
      You're outright not interested in assessing the pros and cons of each format, or taking anything resembling an objective view: you're just on a crusade to promote AV1, and anyone who doesn't agree that it's perfect and the only format anyone ever needs is wrong, the end.
      Cellphone manufacturers must be kicking themselves over the billions they wasted on dedicated video blocks. Along with AMD and nvidia and Intel. If only they'd known that "nobody" cares about efficiency, and that "everything" can play video in software.
      Let's see what reddit's advice for AV1 is... "For (mpv) to work, be ready to dedicate *2-8 GB of RAM* to buffering frames". Really? 8GB of RAM? That seems like a LOT of memory for what I'm sure you'd like to pretend must be a *really* old CPU, doesn't it. Must be 4k60 though, right? Huh, it's 1080p. Go figure. 2+GB of buffering needed, to give AV1 enough of a head start that it can decode the last frame just as you watch it. For 1080p. Without which it "drops 1019 of 1900 frames". 1900/30=... oof: that's 2+GB of buffering for a *1 minute* clip.
      (https://www.reddit.com/r/AV1/comment..._av1_playback/)
      Well, maybe reddit's just wrong - let's try somewhere else:
      "Playing AV1 video is very CPU demanding" (their emphasis) - https://linuxreviews.org/AV1
      "Latest libdav1d 0.9.0 finally makes 10/12bit videos playable on Desktop via AVX2 optimizations. (snip) ... now plays without stutters and with low CPU usage on my Ryzen 3600". Ryzen 3600. 12 threads of Zen2 at ~4GHz, and it only "NOW" - as in, 2021 - plays "without stutters".
      So, what, now it's an x86 problem? np - I've got an overclocked Pi4 here, let's see how that does with a nice simple 720p clip like https://linuxreviews.org/static/vide...Challenge.webm ...
      Ouch. Yeah, wow, that is JANKY AF.
      It seems that, unsurprisingly, reality does not agree with you on this. If anything, reality questions your basic credibility as a source, given the discrepancy between your observations and everyone else's. I know LG makes some ferociously expensive TVs, but are you really trying to claim flawless 4k30 playback on a 4-year-old device, when *7GHz* of Pi cores can't even manage 720p without so much stuttering it makes you seasick? That's a 9x difference in resolution, and while the scaling certainly isn't linear, your suggestion that (a) that TV has 30+GHz worth of CPU in it, and (b) everyone but the poorest of the poor can buy one with the loose change in the back of the couch, are both utterly absurd.
      I've made a point of not giving you grief for being a shill, because it feels more like you're just misguided about this rather than actively trying to be dishonest - but you are *very* misguided indeed if I can prove you wrong with just one of the devices in *this room*, let alone the rest of the house, or even "just" my street.
      I get it. The MPEG patent trolls and licensing fees make them not just easy to hate but deserving of it, and I'm with you that the day we move on to open codecs can't come soon enough. But it can't do so purely on the back of unfulfilled wishes and hopes and dreams. Which, after wasting more time than I really have for this topic, leaves us right back where we came in, so this time I'll put it more simply and more concretely: this claim of AV1 being remotely suitable for even "many" people, let alone "most", is provably wrong; and trivially so; and all the selection bias in the world doesn't change that.
      *None* of this is even remotely surprising though, because everyone's been here before with h264->h265. There were fanboys on forums screaming about how files would be half the size; announcing that they would re-rip all their DVDs and BluRays and re-encode their torrented pr0n; and that playback isn't a problem because the gaming rig Mommy bought them for Xmas works fine; and the encoding time doesn't matter either because the latest version increases the framerate from 0.4fps to 0.8 fps - nearly realtime! - as long as you don't mind a bit of extra blockiness; and so on.
      Only time will make a difference - and more specifically, despite your denial, HW support. The "average" device is not going to have the same performance as an i5-6600 any time soon, and that's what you need for just 1080p AV1 in pure software, even assuming a majority would somehow be willing to accept that level of resolution loss after being talked into buying a "4"K TV, or making similar choices for their recording capabilities. That means at least 3 years, probably closer to 5 to reach parity with current h265 support, and possibly even longer given the rampant global inflation right now and the fears of recession in the majority of the developed world.
      I get that your worldview is still too narrow to accept that, but I've done my part. On the bright side though, 5 years from now when AV1 finally passes the tipping point just in time to be replaced by AV2, at least you'll be able to say "You know, I was an early adopter" and have everyone marvel at your foresight.
      Your 720p example (https://linuxreviews.org/static/vide...Challenge.webm) does not produce any significant CPU load on my system, a Threadripper 1920X...

      https://ibb.co/sKFsVKM

      And there are 4K sample videos too.

      Why not use this source: https://www.elecard.com/videos
      "3840x2160 AV1, WebM 16000 kbps"
      https://www.elecard.com/storage/vide...3840x2160.webm

      result: https://ibb.co/Lv8mmGP

      You can see that even the 4K video does not produce a significant CPU workload on my system, an AMD 1920X...

      Of course I should test it on a much slower system, but keep in mind my system is already five years old.

      Even my previous system, which is now nearly eight years old, would have no problem playing the 4K AV1 WebM video.

      That means we are already talking about ten-year-old systems before AV1 becomes unplayable.





      • #73
        Originally posted by qarium View Post
        Your 100% number clearly only reflects the past.

        Oh? In that case, can you make a list of all the devices with HW AV1 support and their market share? I'll even wait, since it'll only take you about 3 minutes. :P

        > but the end-user/consumer also profits from it, because in the past he spent a lot of money on patents for the hardware he bought...

        hahaha. That was a good one.
        So, phones are suddenly going to be $5 cheaper huh? Well, in about 5-6 years when they can finally start to think about actually dropping the HEVC decoders, at least.
        Even if you're willing to imagine HW manufacturers don't just keep the money they'll save - which, as you know, they absolutely will - how much does it matter on a $1000+ phone anyway?

        > it does save money for Netflix and YouTube, right, but it also saves money for people who avoid hardware that pays patent fees to MPEG LA...

        You... might want to do the math on that.

        Let's see... the license pack for the Pi3 is £2.40 for MPEG2, or $2.78. So yeah, let's call it $5. (The actual cost is $2.25).
        A simple estimate of 10% of 400% CPU on the Pi for the HW block for HEVC, vs more than 400% of the CPU for AV1. (We'll ignore the fact that the Pi can't even play 720p, so instead of a 7W Pi I'd have to use my 200W+ desktop instead). That means AV1 costs an additional 97.5% of your kWh price, to be generous about it.

        Electricity currently costs 39c/kWh in Germany, and 27c in the UK. Again, let's be generous and go with the UK price. At an additional cost of 24.375c per kWh, on the 7W Pi, which I think it's fair to say is *by far* the lowest-power "PC" out there, it takes 20.5 hours of video playback for AV1 to be more expensive.

        So, yeah: "it saves users money too" is basically just hopelessly wrong, unless they never use the device for video at all.
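The break-even arithmetic above can be written as a small function, so anyone can plug in their own license fee, power delta and electricity price (the inputs below are illustrative estimates like the ones in the post, not measurements):

```python
# Sketch of the break-even arithmetic: hours of software AV1 playback at
# which the extra electricity cost equals a one-time HW-codec license fee.
# All inputs are illustrative estimates, not measured values.

def breakeven_hours(license_usd: float,
                    extra_watts: float,
                    usd_per_kwh: float) -> float:
    """Playback hours at which the extra energy cost equals the fee."""
    extra_cost_per_hour_usd = (extra_watts / 1000.0) * usd_per_kwh
    return license_usd / extra_cost_per_hour_usd

# e.g. a $5 fee vs. a 200 W desktop doing the decode at $0.27/kWh:
print(round(breakeven_hours(5.0, 200.0, 0.27), 1))  # prints 92.6
```

The conclusion is sensitive to the power-delta assumption: on a 7 W device the break-even stretches into thousands of hours, on a 200 W desktop it is reached within about a hundred.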
        Last edited by arQon; 06 September 2022, 04:35 PM.



        • #74
          Originally posted by qarium View Post
          If you have a smart TV this does not matter at all, as long as the ARM SoC is fast enough to decode it in real time.
          Which, as we've already proven, it is not. Even for just 720p, you need 7+GHz of 4-core CPU.

          > not result in a relevant cpu load on my threadripper1920x

          This is among the most ridiculously stupid comments I have ever seen.



          • #75
            Originally posted by arQon View Post
            Which, as we've already proven, it is not. Even for just 720p, you need 7+GHz of 4-core CPU.
            > not result in a relevant cpu load on my threadripper1920x
            This is among the most ridiculously stupid comments I have ever seen.
            No, it's not stupid: it's the system I'm sitting in front of, and from it we can estimate that 4K AV1 will even run on ten-year-old systems.
            An AMD FX-8320 from ten years ago will play the 4K AV1 video.

            https://cpu.userbenchmark.com/Compar...320/3934vs1983

            Your Raspberry Pi 4 example may well be true, but of course the Raspberry Pi 4 is slower than an AMD FX-8320 from ten years ago.




            • #76
              Originally posted by arQon View Post
              Oh? In that case, can you make a list of all the devices with HW AV1 support and their market share? I'll even wait, since it'll only take you about 3 minutes. :P
              I never said anything like that. You do not need HW AV1 support if your CPU is fast enough.

              And no one cares if the Raspberry Pi 4 is too slow... you can buy something like the Rockchip RK3588: its CPU is fast enough, and it also has AV1 hardware decode support.
              https://www.cnx-software.com/2021/12...c-coming-soon/

              Originally posted by arQon View Post
              > but the end-user/consumer also profits from it, because in the past he spent a lot of money on patents for the hardware he bought...
              hahaha. That was a good one.
              So, phones are suddenly going to be $5 cheaper huh? Well, in about 5-6 years when they can finally start to think about actually dropping the HEVC decoders, at least.
              Even if you're willing to imagine HW manufacturers don't just keep the money they'll save - which, as you know, they absolutely will - how much does it matter on a $1000+ phone anyway?
              > it does save money for Netflix and YouTube, right, but it also saves money for people who avoid hardware that pays patent fees to MPEG LA...
              You... might want to do the math on that.
              Let's see... the license pack for the Pi3 is £2.40 for MPEG2, or $2.78. So yeah, let's call it $5. (The actual cost is $2.25).
              A simple estimate of 10% of 400% CPU on the Pi for the HW block for HEVC, vs more than 400% of the CPU for AV1. (We'll ignore the fact that the Pi can't even play 720p, so instead of a 7W Pi I'd have to use my 200W+ desktop instead). That means AV1 costs an additional 97.5% of your kWh price, to be generous about it.
              Electricity currently costs 39c/kWh in Germany, and 27c in the UK. Again, let's be generous and go with the UK price. At an additional cost of 24.375c per kWh, on the 7W Pi, which I think it's fair to say is *by far* the lowest-power "PC" out there, it takes 20.5 hours of video playback for AV1 to be more expensive.
              So, yeah: "it saves users money too" is basically just hopelessly wrong, unless they never use the device for video at all.
              Some people, myself included, do not want to pay any money to MPEG LA, not even 10 cents...

              You are free to pay money to MPEG LA... I have no problem with that, but I will not.

              And even ten-year-old hardware is fast enough to play 4K AV1 videos... of course the Raspberry Pi 4 is too slow, but you are free to buy a Rockchip RK3588.



              • #77
                Originally posted by arQon View Post
                Oh? In that case,
                To make my opinion clearer for you: I have no problem paying my electricity bill, but I do have a problem paying MPEG LA a penny. I do not want to pay a single penny...

                I see the MPEG LA cartel as the enemy of the free world.

                Soon I will buy a new GPU with AV1 decode hardware for my system, and then things move even further in the direction I want.




                • #78
                  Originally posted by qarium View Post
                  To make my opinion clearer for you:
                  You're entitled to your opinion. What you *said*, however, was not "your opinion", it was this:

                  > but it also saves money for people who avoid hardware that pays patent fees to MPEG LA

                  which is a falsehood stated as a fact - otherwise known as a lie. Since English isn't your first language I'm sure it was just problems expressing something - which is completely understandable - but it's still false, so it needs to be corrected because otherwise someone else who knows nothing about the topic will be misled.

                  I'm guessing translation issues are also why you went off on these rather bizarre tangents, as well as the main cause of the numerous other incorrect statements in your posts, nearly all of which were already shown to be untrue before you wrote them.



                  • #79
                    Originally posted by arQon View Post
                    You're entitled to your opinion. What you *said*, however, was not "your opinion", it was this:
                    > but it also saves money for people who avoid hardware that pays patent fees to MPEG LA
                    which is a falsehood stated as a fact - otherwise known as a lie. Since English isn't your first language I'm sure it was just problems expressing something - which is completely understandable - but it's still false, so it needs to be corrected because otherwise someone else who knows nothing about the topic will be misled.
                    I'm guessing translation issues are also why you went off on these rather bizarre tangents, as well as the main cause of the numerous other incorrect statements in your posts, nearly all of which were already shown to be untrue before you wrote them.
                    The whole argument that AV1 consumes more power when decoded on the CPU is an argument about the past, because upcoming systems like the Rockchip RK3588 have AV1 hardware decode.

                    I really don't know why you focus on the past instead of the future.



                    • #80
                      Probably because there are people stupid enough to lie about how CPU-intensive AV1 decode is, when it can be tested in about 30 seconds. Also because there are people stupid enough to make comments like "It doesn't matter that it's slow as long as the CPU can decode it" in response to proof that the CPU can't. Either or both of those.

