Intel Arc Graphics A580 On Linux: Open-Source Graphics For Under $200

  • flower
    Senior Member
    • Aug 2018
    • 424

    #11
    Originally posted by phoenix_rizzen View Post
    Where's the encoding, decoding, transcoding benchmarks? That's where these cards should shine, no? With all the extra hardware codec blocks and QuickSync support, these should be transcoding beasts. Would be interesting to see how these compare to CPU encoders (with and without QuickSync) and GPUs from Nvidia and AMD for H.264, H.265, AV1, VP8, VP9, and so on.

    Would an A380 be good enough for a Plex/Jellyfin/whatever media server, or would you need an A580? How many simultaneous transcoding streams do they support at 1080p? At 2160p? At 5320p? Where's the resolution cut-over for faster-than-real-time transcoding? Which codec works best for each resolution?

    Thankfully, all my end-points support H.264 and H.265 natively, so I don't have to transcode on my Plex server, otherwise my lowly GeForce 730 GPU wouldn't cut it. Would be nice to replace it with something that supports hardware transcode, just in case something needs it in the future. Don't want to spend $1000 on a high-end gaming GPU to get good-enough transcoding performance, though.
I use an A380 as a media encoder for Emby (all the Arc models have exactly the same encoders, so there is no difference between them) and yes, it is a beast.

I'm not sure how best to compare it, but it took me around 5 hours to transcode a video to x265 on the CPU, and with the GPU it takes around 20 minutes.
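For reference, a minimal sketch of the kind of GPU transcode described above, built around ffmpeg's Quick Sync (QSV) path. The file names and quality value are placeholders, and the flags assume an ffmpeg build with QSV support:

```python
# Build (but don't execute) an ffmpeg QSV transcode command.
# Run it with subprocess.run(cmd, check=True) once ffmpeg with QSV
# support and a real input file are available.
def qsv_hevc_cmd(src: str, dst: str, quality: int = 24) -> list[str]:
    return [
        "ffmpeg",
        "-hwaccel", "qsv",               # decode on the Arc media engine
        "-i", src,
        "-c:v", "hevc_qsv",              # Quick Sync HEVC (H.265) encoder
        "-global_quality", str(quality), # ICQ-style quality target
        "-c:a", "copy",                  # pass audio through untouched
        dst,
    ]

print(" ".join(qsv_hevc_cmd("input.mp4", "output.mkv")))
```

On an A380 or A580 the same command should behave identically, since the media engine is shared across the lineup.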


    • darkbasic
      Senior Member
      • Nov 2009
      • 3083

      #12
It would be a best buy if the Xe driver were going to support its media engine; otherwise it's a no-go on non-x86 platforms.
      ## VGA ##
      AMD: X1950XTX, HD3870, HD5870
      Intel: GMA45, HD3000 (Core i5 2500K)


      • aeternum
        Junior Member
        • Oct 2023
        • 1

        #13
I don't have any Intel Arc GPU so I'm not sure about this, but there are guidelines on both the Intel and ASRock websites on reducing idle power draw:
        https://www.intel.com/content/www/us/en/support/articles/000092564/graphics.html
        https://www.asrock.com/support/faq.us.asp?id=528


        • stormcrow
          Senior Member
          • Jul 2017
          • 1511

          #14
Michael Something about the hashcat numbers isn't sitting right for me. I'm not saying they're necessarily wrong, but I've seen numbers like these before, where the results were exact whole numbers or rounded-off 1/3 errors (yet these are whole numbers, not floats, or am I missing the decimal point?), and they turned out to be number-handling errors. It suggests there are some arithmetic or type errors going on behind the scenes, either in PTS or in Hashcat's results.

          Let me rephrase... I think I'm being unclear... kinda have a sinus headache while writing this:

So the hashcat results are all whole numbers, but the first two sets end in 000, 333, or 667. That suggests there's a division by 3 somewhere in the result calculation, yet these are all whole-number/integer results. Large whole numbers, which suggests a type-conversion error (or perhaps an overflow). You're possibly losing a decimal point somewhere, or doing an integer division that produces something that merely looks like a 1/3 result. That would mean PTS is taking the results of three runs and averaging them, or Hashcat itself runs a 3-round benchmark and averages it, although my money is on the bug being in PTS... assuming it's not a weird arithmetic fluke.
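For what it's worth, the 000/333/667 endings do fall out naturally from averaging three integer results; a quick sketch with made-up sample numbers:

```python
# The fractional part of the mean of three integers can only be
# .0, .333..., or .666..., decided by the sum's remainder mod 3.
runs = [100, 100, 101]   # made-up per-run benchmark results
mean = sum(runs) / 3
print(mean)              # 100.33333333333333
print(int(mean))         # 100: converting to int silently drops the tail
print(sum(runs) % 3)     # 1 -> the .333 case (0 -> .000, 2 -> .667)
```

So whole numbers ending in those digit patterns are exactly what a three-run average looks like after the decimal point has been scaled up or truncated away somewhere in the pipeline.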
          Last edited by stormcrow; 17 October 2023, 08:51 PM.


          • Quackdoc
            Senior Member
            • Oct 2020
            • 4987

            #15
I struggle to see the worth of this card at its current pricing, but generally at this point I would wait for the next-generation cards anyway.


            • avis
              Senior Member
              • Dec 2022
              • 2179

              #16
              Originally posted by Etherman View Post
              Huh? 🤔
              I don't know what to say man. Your comment is so rich I'm just speechless.


              • QwertyChouskie
                Senior Member
                • Nov 2017
                • 637

                #17
                Originally posted by avis View Post

                I don't know what to say man. Your comment is so rich I'm just speechless.
Your math didn't make sense here. It should be 60 (watts) times 8 (hours per day) times 20 (days in a month) times 12 (months in a year), divided by 1000 (so we have kWh instead of Wh). That's 115.2 kWh, which at $0.50 per kWh comes out to just under 60 bucks a year.
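Spelling out the arithmetic above (the 60 W, 8 h/day, 20 days/month, and $0.50/kWh figures are taken from the discussion, not measured):

```python
watts = 60            # assumed extra power draw
hours_per_day = 8
days_per_month = 20
months_per_year = 12
price_per_kwh = 0.50  # dollars

# Wh per year, divided by 1000 to get kWh
kwh_per_year = watts * hours_per_day * days_per_month * months_per_year / 1000
cost_per_year = kwh_per_year * price_per_kwh

print(kwh_per_year)               # 115.2
print(f"${cost_per_year:.2f}")    # $57.60
```

115.2 kWh at $0.50/kWh is $57.60, i.e. just under 60 bucks a year as stated.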


                • WereCatf
                  Phoronix Member
                  • Jun 2022
                  • 70

                  #18
                  Originally posted by flower View Post

                  i am not sure how to compare it but i took me around 5 hours to transcode a video to x265 and with the gpu it is around 20min.
                  x265 is an encoder, not a codec: you do not encode to x265, you encode with x265. The codec is H.265, regardless of whether you use x265 or QuickSync or NVENC or whatever.


                  • Etherman
                    Senior Member
                    • Dec 2017
                    • 289

                    #19
                    Originally posted by avis View Post

                    I don't know what to say man. Your comment is so rich I'm just speechless.
I made the interesting, self-explanatory part bold.
You put too many days in a year, and you turned the 58 400 hours into Wh.
                    Last edited by Etherman; 17 October 2023, 11:45 PM.


                    • sarmad
                      Senior Member
                      • Jul 2013
                      • 1222

                      #20
                      Originally posted by avis View Post

                      You must have missed parts of the article talking about idle power consumption and efficiency. No, I will never want to see this in a laptop.
No, I didn't miss that part, but I assume this will be some sort of hybrid laptop, so the Arc GPU can remain off while you're not gaming.

