
Intel Arc Graphics Running On Fully Open-Source Linux Driver


  • #81
    Originally posted by arQon View Post

    Which, as we've already proven, it is not. Even for just 720p, you need 7+GHz of 4-core CPU.
    I don't know if you're using old data or what, but this is Phoronix and we can just look up on openbenchmarking.org that state-of-the-art software decoders can do 4K on 4 GHz of 4-core CPU.



    • #82
      Originally posted by arQon View Post
      Probably because there are people stupid enough to lie about how CPU-intensive AV1 decode is, when it can be tested in about 30 seconds. Also because there are people stupid enough to make comments like "It doesn't matter that it's slow as long as the CPU can decode it" in response to proof that the CPU can't. Either or both of those.
      But I did test the 720p and the 4K video, and from that test we can say for sure that even a 10-year-old CPU like the FX-8350 can decode AV1 in real time.

      Your example showing the Pi 4 is too slow is correct, but only because the Pi 4 is much slower than an FX-8350.

      My mother only has an FX-4300; that CPU is maybe too slow, but you can upgrade it to an FX-8350 for around 30€, so it's not a big deal.
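The real-time claim above is easy to sanity-check yourself. A minimal sketch using ffmpeg's benchmark mode (assuming an ffmpeg build with an AV1 decoder such as libdav1d, and a local sample clip; `sample-av1.mkv` is a hypothetical filename):

```
# Decode only, discard the output, and print timing stats; if the reported
# speed is >= 1x, the CPU is decoding the clip in real time.
ffmpeg -benchmark -i sample-av1.mkv -f null -
```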


      • #83
        Originally posted by yump View Post

        I don't know if you're using old data or what, but this is Phoronix and we can just look up on openbenchmarking.org that state-of-the-art software decoders can do 4K on 4 GHz of 4-core CPU.
        Right, let's say these are still too slow:
        AMD PRO A12-9800 R7 12 COMPUTE CORES 4C
        AMD Ryzen 3 3200U

        But everything faster than this group should handle it just fine:

        AMD Ryzen 7 4800U
        Intel Core i5-1135G7
        Intel Core i5-1145G7
        Intel Core i7-8565U
        Intel Core i7-1065G7
        Intel Core i7-8550U

        And this is already old hardware; the release date of the i7-8550U is 2017.

        And according to UserBenchmark ("Based on 497,552 user benchmarks for the AMD FX-8350 and the Intel Core i7-8550U, we rank them both on effective speed and value for money against the best 1,442 CPUs."), the FX-8350 is 8% faster than the 8550U, and that CPU is from 2012. This means anything less than 10 years old can play it.


        • #84
          Originally posted by qarium View Post

          But I did test the 720p and the 4K video, and from that test we can say for sure that even a 10-year-old CPU like the FX-8350 can decode AV1 in real time.

          Your example showing the Pi 4 is too slow is correct, but only because the Pi 4 is much slower than an FX-8350.

          My mother only has an FX-4300; that CPU is maybe too slow, but you can upgrade it to an FX-8350 for around 30€, so it's not a big deal.
          At the very least, my Ryzen 2600 can do Netflix Chimera at 4K with grain synthesis when limited to 3 threads inside of mpv.
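For anyone wanting to reproduce the thread-limited test above, mpv exposes the decoder thread count as an option; a minimal mpv.conf sketch (option names per the mpv manual, the value taken from the post above):

```
# Force software decoding and cap the decoder at 3 threads
hwdec=no
vd-lavc-threads=3
```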



          • #85
            Originally posted by Quackdoc View Post
            At the very least, my Ryzen 2600 can do Netflix Chimera at 4K with grain synthesis when limited to 3 threads inside of mpv.
            Sounds good... this means everything up to 10 years old can do it. The exception is the Pi 3/4, because that hardware is so low-powered that it is too slow.


            • #86
              Originally posted by kylew77 View Post
              I bought a Data Center Optane SSD yesterday for the 7 µs response time. Get them while you can!
              Ugh. I finally broke down and bought a P5800X. Not sure if I'll use it or just keep it as an investment. There's one 400 GB model left, if anyone is interested*.

              https://www.provantage.com/intel-ssd...1~7ITE93AQ.htm

              Last year, I made about $1k selling a Radeon VII I bought in 2019. Perhaps this will turn out to have a similar return on investment. If not, I'll at least have the fastest boot drive we're likely to see for the next several years. And even if someone eventually makes something faster, I doubt it'll have the quality & reliability of Intel's Data Center products.

              I still remember reading the classified ads in the back of a car magazine my aunt brought me from Germany, back in '89 or 1990. I remember seeing sooo many Ferrari F40s for sale, which I thought was dumb because it seemed obvious to me that the Testarossa was the coolest Ferrari. Now, if you can find an F40 in good condition, they routinely sell for a few $Mil. I think this SSD is something like that. Computer hardware normally gets faster every year, but nobody is making an SSD that can rival Optane's endurance or QD1 IOPS in the foreseeable future.

              The one negative I've found about Optane is that its cold-storage data retention might not be as good as some NAND flash products' (which, in turn, aren't as good as HDDs'). I was hoping it'd be better, but the little evidence I found seems to run contrary to that. Not sure, but just something to beware of. I guess another downside is they burn a fair amount of power: this model draws about 4 W at idle and up to 14 W when active.

              * It's entirely possible further batches will hit the market, in the coming months. If you buy one now, don't blame me if more stock shows up and maybe they sell for even less.

              Oh, and I forgot that Mouser still has them, but for $1546. Also, you can still find larger capacity models, if you want to spend that kind of $$$$. For instance, Newegg has 800 GB for $2042. They have 3rd party sellers with the 400 GB model, but those seem kinda sketchy, based on their rating. And I think the Beach Audio one is sourced from the same distributor that Provantage uses, because their inventory has moved in tandem.
              Last edited by coder; 22 September 2022, 11:22 PM.



              • #87
                Originally posted by yump View Post
                I don't know if you're using old data or what
                It was right there in the post. That's the result from a Pi4 (an overclocked one, at that), which is a lot more representative of a phone CPU (you know, those things that 90-something % of video is consumed on) than an HEDT desktop is. This really shouldn't be this hard to understand, and it's hard to imagine that anyone pretending to isn't doing so solely because they're determined to push a false narrative.



                • #88
                  Originally posted by arQon View Post

                  It was right there in the post. That's the result from a Pi4 (an overclocked one, at that), which is a lot more representative of a phone CPU (you know, those things that 90-something % of video is consumed on) than an HEDT desktop is. This really shouldn't be this hard to understand, and it's hard to imagine that anyone pretending to isn't doing so solely because they're determined to push a false narrative.
                  Is this a joke? An overclocked Pi 4 doesn't get remotely close. I watch 1080p30 content on my phones all the time, an S9+ and a Huawei P20 Pro. I can decode 4K Summer at 20 fps on my crappy P20 Pro with who knows what running in the background.

                  And congratulations for using a 4K benchmark to show that you can't watch 720p content, instead of the 1080p Summer benchmark, which shows 60 fps decode, or Chimera, which shows 45 fps decode, both of which mean 1080p30 is playable.
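The playability argument here is just a throughput comparison: a stream is watchable in real time when the benchmark's decode rate meets or exceeds the content's frame rate. A small sketch of that arithmetic, using the figures quoted in the thread (the 24 fps content rate for the 4K case is an assumption):

```python
def playable(decode_fps: float, content_fps: float) -> bool:
    """Real-time playback needs the decoder to produce frames at least
    as fast as the content consumes them."""
    return decode_fps >= content_fps

# Figures quoted in the thread:
print(playable(60.0, 30.0))  # 1080p Summer decode vs. 1080p30 content -> True
print(playable(45.0, 30.0))  # 1080p Chimera decode vs. 1080p30 content -> True
print(playable(20.0, 24.0))  # 4K Summer on the P20 Pro vs. 24 fps (assumed) -> False
```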



                  • #89
                    Originally posted by arQon View Post

                    It was right there in the post. That's the result from a Pi4 (an overclocked one, at that), which is a lot more representative of a phone CPU (you know, those things that 90-something % of video is consumed on) than an HEDT desktop is. This really shouldn't be this hard to understand, and it's hard to imagine that anyone pretending to isn't doing so solely because they're determined to push a false narrative.
                    The Pi4 is not representative of a phone CPU. The raspberry pi is built to a target price of $35, and it's built to be supplied for many years after initial release. Pi 4 is 4x A72. Even low-end phones are 2+6 at least, and mid-range phones are 4+4. The big cores on a phone are 3.5 (Kryo 460) to 5 (A78) microarchitecture generations newer than A72.

                    ... And if you look at mere 1080p instead of 4K, the Pi manages 63 FPS. (Chimera 1080p is "only" 45 FPS, but Chimera is 10-bit color depth.) I specifically chose to respond to your bogus 720p claim with a 4K benchmark -- nine times the pixels -- so that you could not possibly come back and quibble about it, and you still came back and quibbled about it!
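The "nine times the pixels" figure above is straightforward to verify from the standard resolutions:

```python
pixels_4k = 3840 * 2160      # 4K UHD
pixels_1080p = 1920 * 1080   # Full HD
pixels_720p = 1280 * 720     # HD

print(pixels_4k / pixels_720p)   # 9.0 -> 4K is nine times the pixels of 720p
print(pixels_4k / pixels_1080p)  # 4.0 -> and four times 1080p
```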



                    • #90
                      Originally posted by Quackdoc View Post
                      And congratulations for using a 4K benchmark to show that you can't watch 720p content.
                      Which is, erm, exactly what I didn't do.
                      If you actually had a valid point, you wouldn't need to lie so blatantly to support it - hence the comment about wilful dishonesty for the sake of pushing a false narrative.
