Intel Arc Graphics Running On Fully Open-Source Linux Driver


  • #61
    Originally posted by Venemo View Post

    A driver is a piece of software that runs on your CPU, a firmware is a piece of software that runs on the device. These two things cooperate to make the HW draw pretty pixels on your screen.

    If you disagree, that's fine, then I guess you can say that I just work on a "not-a-driver".
    Semantics aside, if the entire open source driver stack can't function without closed source firmware, that seems notable.

    If a vendor were to move a huge chunk of the work done from the driver into a closed source firmware, then some people might start to question just how open source that "open source" driver stack really is.

    With regard to firmware and "it's been this way for a long time": the Intel i915 kernel driver would function fine without the closed-source GuC up until ADL-P (Alder Lake-P), so that is pretty recent.
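    If you want to see where your own machine stands, here is a minimal sketch (nothing from the driver itself) that just reads the i915.enable_guc module parameter. The bitmask interpretation in the comments is an assumption based on recent kernels and may differ on yours.

    #!/usr/bin/env python3
    """Rough check of whether the local i915 driver is set up to use the
    (closed-source) GuC firmware. Sketch only: reads the i915.enable_guc
    module parameter; the bitmask meaning below is an assumption."""

    from pathlib import Path

    PARAM = Path("/sys/module/i915/parameters/enable_guc")

    def main() -> None:
        if not PARAM.exists():
            print("i915 not loaded, or no enable_guc parameter on this kernel")
            return
        value = int(PARAM.read_text().strip())
        if value == -1:
            print("enable_guc = -1 (platform default; newer parts such as ADL-P use GuC)")
        elif value == 0:
            print("enable_guc = 0 (GuC/HuC disabled)")
        else:
            submission = bool(value & 1)  # assumed: bit 0 = GuC submission
            huc_load = bool(value & 2)    # assumed: bit 1 = HuC firmware loading
            print(f"enable_guc = {value} (GuC submission: {submission}, HuC load: {huc_load})")

    if __name__ == "__main__":
        main()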



    • #62
      Originally posted by qarium View Post
      So why not just use AV1 like any other SANE person?
      If your version of "sane" means "use a codec that has no HW support on, to within rounding error, 100% of devices", I'm glad I don't use your dictionary.

      AV1 is almost certainly the future, but it's an exceptionally bad choice *right now* absolutely everywhere. The only reason to use it is if you're in a position where *somebody else's* CPU can be traded for *your* bandwidth, i.e. as a money-saving exercise for streaming companies. Since you aren't Netflix or YouTube, that reason doesn't apply to you.



      • #63
        Originally posted by arQon View Post

        If your version of "sane" means "use a codec that has no HW support on, to within rounding error, 100% of devices" I'm glad I don't use your dictionary ... AV1 is almost certainly the future, but it's an exceptionally bad choice *right now* absolutely everywhere.
        Why is it a bad choice? dav1d is efficient enough that most Android boxes and TVs can handle at least 1080p30, if not 4k30. It's efficient enough that I would only consider AV1 a no-go for low-end, battery-operated devices.



        • #64
          Originally posted by Quackdoc View Post
          Why is it a bad choice?
          Because roughly 0% of devices can play AV1 efficiently, and an unknown but significantly non-zero number of devices can't play it at all (at an acceptable framerate / resolution) and never will. Just because you and I can play it on our desktops, or I can play it on the HTPC at a mere 20x the power draw of h265, doesn't change that.

          > dav1d is efficient enough that most Android boxes and TVs can handle at least 1080p30, if not 4k30.

          So... how useful is that if your content is *in* 4k30, or 4k60, which iPhones already record at? If you have a Pixel 10 or whatever that's capable of recording in AV1, are you taking videos of your kids in peasant 1080p, or glorious 4k? Does your wife's phone support AV1 playback? At all? How about 4k60 AV1? How about your parents' phones? And so on.

          If all you care about is torrents, and you only watch those on a machine that *can* handle AV1 playback, great. Likewise if you're ripping all your BluRays for personal use, and you only watch them on your home theater setup: go for it, and enjoy. AV1 is the "right" format for those scenarios, but there are, you know, *other* scenarios, and other people, and those people not only vastly outnumber you, they also use their video in a different way.

          > It's efficient enough that I would only consider AV1 a no-go for low-end, battery-operated devices.

          Good for you. Again though, this is redefining words, only now with your explicit selection bias added to it. 20x is more likely an under-estimate than not, but let's be *really* generous and call it 10x instead - even if that's still somehow "efficient enough" in *your* mind, it's certainly going to be a minority opinion for anyone whose device dies 25 minutes into a movie, when they could have watched all LOTR twice on the same device in h265 with HW decode.
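
          To make that battery argument concrete, here's a tiny back-of-the-envelope sketch. Every number in it (battery capacity, baseline draw, the cost of a dedicated HEVC block) is an illustrative assumption, not a measurement; the point is only how a 10x-20x decode-power multiplier eats into playback time.

          BATTERY_WH = 15.0        # assumed battery capacity of a phone/tablet (Wh)
          BASELINE_W = 1.0         # assumed screen + SoC baseline while playing video (W)
          HEVC_HW_DECODE_W = 0.3   # assumed extra draw of a dedicated HEVC decode block (W)

          def playback_hours(decode_w: float) -> float:
              """Hours of playback if decoding adds `decode_w` watts on top of the baseline."""
              return BATTERY_WH / (BASELINE_W + decode_w)

          for label, multiplier in [("HEVC, HW decode", 1),
                                    ("AV1 swdec (10x)", 10),
                                    ("AV1 swdec (20x)", 20)]:
              print(f"{label:16s}: ~{playback_hours(HEVC_HW_DECODE_W * multiplier):.1f} h")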

          *That's* why it's a bad choice *in general*, and why it will remain one until AV1 has mainstream HW adoption.
          I know you follow this stuff, so you can probably answer this off the top of your head and will have a link if not: how's that coming along? What percentage of phones / tablets / etc have ANY support for AV1 at all? How many have it in HW? What's the matrix for 720p/1080p/4k and 30/60? And so on. What will those numbers look like next year? (You can cheat and assume Pixel(N+1) sells the same as Pixel(N), etc.)

          In the meantime, h265 is on *everything*, with HW support. What's the energy delta between 4k60 HEVC (which we know the ~7W Pi can play) and 4k60 AV1 on it - same movie, same rez, etc.?

          Different people have the luxury of being able to make different choices, but those questions are the ones you need to answer to be able to say that *your* choice is no longer inferior. One day it will be, but today is not that day.
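
          One way to start answering that energy-delta question without a wattmeter is to time pure software decode of the same clip in both codecs; wall-clock decode time is only a rough proxy for CPU cost, not power, but it puts a number on the gap. A minimal sketch, assuming ffmpeg is on PATH and using placeholder file names for whatever matched HEVC/AV1 test clips you have:

          import subprocess
          import time

          CLIPS = {
              "HEVC": "sample_hevc.mkv",  # placeholder: same movie, same rez
              "AV1":  "sample_av1.mkv",   # placeholder: same movie, same rez
          }

          def decode_seconds(path: str) -> float:
              """Software-decode the whole file as fast as possible, discard the output."""
              start = time.perf_counter()
              subprocess.run(
                  ["ffmpeg", "-hide_banner", "-loglevel", "error",
                   "-i", path, "-an", "-f", "null", "-"],  # null muxer: decode only
                  check=True,
              )
              return time.perf_counter() - start

          results = {name: decode_seconds(path) for name, path in CLIPS.items()}
          for name, secs in results.items():
              print(f"{name}: {secs:.1f} s to decode")
          print(f"AV1 / HEVC wall-time ratio: {results['AV1'] / results['HEVC']:.1f}x")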



          • #65
            Originally posted by arQon View Post

            Because roughly 0% of devices can play AV1 efficiently, and an unknown but significantly non-zero number of devices can't play it at all (at an acceptable framerate / resolution) and never will ... those questions are the ones you need to answer to be able to say that *your* choice is no longer inferior. One day it will be, but today is not that day.
            My laptop running a Celeron N3050 - the second-weakest laptop chip of 2015, in what was at the time literally the cheapest machine you could possibly get running Windows (Lenovo N22) - can play 1080p30 AV1 video drawing about 15 watts from the wall, versus swdec which draws about 10 watts or so. My desktop Ryzen 5 2600 doesn't show a noticeable wattage increase over HEVC hwdec (my RX 580, mind you, probably has issues, so that could well contribute). My phones (ZTE Axon 7, S9+, Huawei P20 Pro) can each do AV1 1080p60 swdec for 4+ hours. My sister's 2018 Android TV (some LG model) can do AV1 software decode at 4k30.

            NO. The number of people for whom AV1 is unsuitable is far lower than you claim. The majority of people won't be encoding 4k60.
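
            For anyone who wants to sanity-check wall/battery figures like these on their own machine, here's a small sketch. It assumes a laptop that exposes /sys/class/power_supply/BAT0/power_now in microwatts (many do; some expose current/voltage instead) and that you're running on battery: start it, play the clip in another window, and read off the average.

            import time
            from pathlib import Path

            POWER_NOW = Path("/sys/class/power_supply/BAT0/power_now")  # assumed sysfs path
            SAMPLES = 30
            INTERVAL_S = 2.0

            def sample_watts() -> float:
                """Instantaneous battery discharge rate, converted from microwatts."""
                return int(POWER_NOW.read_text()) / 1_000_000

            readings = []
            for _ in range(SAMPLES):
                readings.append(sample_watts())
                time.sleep(INTERVAL_S)
            print(f"average draw over {SAMPLES * INTERVAL_S:.0f}s: "
                  f"{sum(readings) / len(readings):.1f} W")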



            • #66
              Originally posted by Quackdoc View Post
              NO. The number of people for whom AV1 is unsuitable is far lower than you claim.
              Except still only for the imaginary world where power consumption doesn't matter, and people deliberately downgrade footage so that they can play it back.

              You're outright not interested in assessing the pros and cons of each format, or taking anything resembling an objective view: you're just on a crusade to promote AV1, and anyone who doesn't agree that it's perfect and the only format anyone ever needs is wrong, the end.

              Cellphone manufacturers must be kicking themselves over the billions they wasted on dedicated video blocks. Along with AMD and nvidia and Intel. If only they'd known that "nobody" cares about efficiency, and that "everything" can play video in software.

              Let's see what reddit's advice for AV1 is... "For (mpv) to work, be ready to dedicate *2-8 GB of RAM* to buffering frames". Really? 8GB of RAM? That seems like a LOT of memory for what I'm sure you'd like to pretend must be a *really* old CPU, doesn't it. Must be 4k60 though, right? Huh, it's 1080p. Go figure. 2+GB of buffering needed, to give AV1 enough of a head start that it can decode the last frame just as you watch it. For 1080p. Without which it "drops 1019 of 1900 frames". 1900/30=... oof: that's 2+GB of buffering for a *1 minute* clip.
              (https://www.reddit.com/r/AV1/comment..._av1_playback/)

              Well, maybe reddit's just wrong - let's try somewhere else:
              "Playing AV1 video is very CPU demanding" (their emphasis) - https://linuxreviews.org/AV1
              "Latest libdav1d 0.9.0 finally makes 10/12bit videos playable on Desktop via AVX2 optimizations. (snip) ... now plays without stutters and with low CPU usage on my Ryzen 3600". Ryzen 3600. 12 threads of Zen2 at ~4GHz, and it only "NOW" - as in, 2021 - plays "without stutters".

              So, what, now it's an x86 problem? np - I've got an overclocked Pi4 here, let's see how that does with a nice simple 720p clip like https://linuxreviews.org/static/vide...Challenge.webm ...
              Ouch. Yeah, wow, that is JANKY AF.

              It seems that, unsurprisingly, reality does not agree with you on this. If anything, reality questions your basic credibility as a source, given the discrepancy between your observations and everyone else's. I know LG makes some ferociously expensive TVs, but are you really trying to claim flawless 4k30 playback on a 4-year-old device, when *7GHz* of Pi cores can't even manage 720p without so much stuttering it makes you seasick? That's a 9x difference in resolution, and while the scaling certainly isn't linear, your suggestions that (a) that TV has 30+GHz worth of CPU in it, and (b) everyone but the poorest of the poor can buy one with the loose change in the back of the couch, are both utterly absurd.

              I've made a point of not giving you grief for being a shill, because it feels more like you're just misguided about this rather than actively trying to be dishonest - but you are *very* misguided indeed if I can prove you wrong with just one of the devices in *this room*, let alone the rest of the house, or even "just" my street.

              I get it. The MPEG patent trolls and licensing fees make them not just easy to hate but deserving of it, and I'm with you that the day we move on to open codecs can't come soon enough. But it can't do so purely on the back of unfulfilled wishes and hopes and dreams. Which, after wasting more time than I really have for this topic, leaves us right back where we came in, so this time I'll put it more simply and more concretely: this claim of AV1 being remotely suitable for even "many" people, let alone "most", is provably wrong; and trivially so; and all the selection bias in the world doesn't change that.

              *None* of this is even remotely surprising though, because everyone's been here before with h264->h265. There were fanboys on forums screaming about how files would be half the size; announcing that they would re-rip all their DVDs and BluRays and re-encode their torrented pr0n; and that playback isn't a problem because the gaming rig Mommy bought them for Xmas works fine; and the encoding time doesn't matter either because the latest version increases the framerate from 0.4fps to 0.8 fps - nearly realtime! - as long as you don't mind a bit of extra blockiness; and so on.

              Only time will make a difference - and more specifically, despite your denial, HW support. The "average" device is not going to have the same performance as an i5-6600 any time soon, and that's what you need for just 1080p AV1 in pure software, even assuming a majority would somehow be willing to accept that level of resolution loss after being talked into buying a "4"K TV, or making similar choices for their recording capabilities. That means at least 3 years, probably closer to 5 to reach parity with current h265 support, and possibly even longer given the rampant global inflation right now and the fears of recession in the majority of the developed world.

              I get that your worldview is still too narrow to accept that, but I've done my part. On the bright side though, 5 years from now when AV1 finally passes the tipping point just in time to be replaced by AV2, at least you'll be able to say "You know, I was an early adopter" and have everyone marvel at your foresight.



              • #67
                Originally posted by arQon View Post

                Except still only for the imaginary world where power consumption doesn't matter, and people deliberately downgrade footage so that they can play it back.

                You're outright not interested in assessing the pros and cons of each format, or taking anything resembling an objective view: you're just on a crusade to promote AV1, and anyone who doesn't agree that it's perfect and the only format anyone ever needs is wrong, the end.
                I won't read any of this, since you clearly didn't read what I had stated. If 5 watts of difference on a laptop, no perceptible difference on my desktop, efficiency good enough to run on an ARM computer (which typically runs at 10-20 W even maxed out), and multiple hours of playback on a phone somehow mean I'm saying power consumption doesn't matter when I clearly stated otherwise, I don't know what to tell you. You either have reading comprehension issues or are deliberately choosing to ignore what I said, which I am electing to do with the rest of your clearly correct post.



                • #68
                  Michael, can we get some OpenCL benchmarks on the A380, or some other Intel dGPU, in the near future?
                  Last edited by coder; 31 August 2022, 11:11 AM.



                  • #69
                    Originally posted by Quackdoc View Post
                    I won't read any of this since you clearly didn't read what I had stated ... or are deliberately choosing to ignore what I said
                    The second part. Except I'm not "ignoring" it, I'm saying that neither I, nor any reference I can easily find online, believes your claim. AV1 decode is far more CPU-intensive than even *software* HEVC decode - so much so that even just 720p decode is not viable on a quad-core 1.8GHz ARM system - and it's obviously also massively more power-hungry than dedicated HW decode of HEVC.

                    Since I have the resources at hand to trivially test this, I don't have to go on "belief" to decide my position - because I *have* tested it, so I have proof of the outcome. Like all actual science involving actual tests rather than imagination or anecdotes, it's trivially reproducible by anyone with, say, a Pi4.
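
                    If you want to reproduce that test yourself, here's a minimal sketch of the same check: software-decode a clip flat-out and compare wall time against its duration; if decoding takes longer than the clip runs, realtime playback is off the table on that box. It assumes ffmpeg/ffprobe are on PATH, and the file name is a placeholder for whatever test clip you use.

                    import subprocess
                    import time

                    CLIP = "720p_av1_test.webm"  # placeholder test clip

                    def clip_duration(path: str) -> float:
                        """Container duration in seconds, via ffprobe."""
                        out = subprocess.run(
                            ["ffprobe", "-v", "error", "-show_entries", "format=duration",
                             "-of", "default=noprint_wrappers=1:nokey=1", path],
                            capture_output=True, text=True, check=True,
                        ).stdout
                        return float(out.strip())

                    def decode_time(path: str) -> float:
                        """Wall time to software-decode the whole clip, discarding frames."""
                        start = time.perf_counter()
                        subprocess.run(
                            ["ffmpeg", "-hide_banner", "-loglevel", "error",
                             "-i", path, "-an", "-f", "null", "-"],
                            check=True,
                        )
                        return time.perf_counter() - start

                    duration, decoded = clip_duration(CLIP), decode_time(CLIP)
                    verdict = "keeps up" if decoded < duration else "cannot keep up"
                    print(f"clip: {duration:.1f}s, decode: {decoded:.1f}s -> {verdict} in software")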

                    It's not that I don't *want* you to be right, it's simply that you aren't.
                    Last edited by arQon; 03 September 2022, 10:45 PM.



                    • #70
                      Originally posted by arQon View Post
                      If your version of "sane" means "use a codec that has no HW support on, to within rounding error, 100% of devices", I'm glad I don't use your dictionary.
                      Your 100% number clearly only reflects the past, and because of that it is not an argument about the present, and not an argument about the future.

                      Originally posted by arQon View Post
                      AV1 is almost certainly the future, but it's an exceptionally bad choice *right now* absolutely everywhere. The only reason to use it is if you're in a position where *somebody else's* CPU can be traded for *your* bandwidth, i.e. as a money-saving exercise for streaming companies. Since you aren't Netflix or YouTube, that reason doesn't apply to you.
                      Believe me, it is about much more than just saving money on bandwidth for streaming companies like Netflix/YouTube at other people's expense ... that is only the factor that made the development of AV1 possible in the first place.

                      But the end user/consumer also profits from it, because in the past they spent a lot of money on patent licensing for the hardware they bought...

                      It does save money for Netflix and YouTube, right, but it also saves money for people, who can avoid hardware that pays patent fees to MPEG LA...



