Intel Arc Graphics A580 On Linux: Open-Source Graphics For Under $200

    Phoronix: Intel Arc Graphics A580 On Linux: Open-Source Graphics For Under $200

    Last week Intel announced the Arc Graphics A580 as a new mid-range DG2/Alchemist graphics card option that slots in between the entry-level Arc Graphics A380 and the higher-end Arc Graphics A750/A770. With the Arc Graphics A580 coming in at under $200, it's quite an interesting graphics card for those after open-source Linux driver support and/or wanting to experiment with Intel's growing oneAPI software ecosystem and its excellent open-source GPU compute support.

  • #2
    Like the A750/A770, the Arc Graphics A580 continues to have a relatively high idle power consumption of around 39 Watts.
    That's just bad, and it's even worse considering that a lot of Linux desktop work is GPU accelerated nowadays, including e.g. web browsing and even GTK applications (not sure about Qt), so even while you're not gaming, your supposedly idle power consumption will be higher still. I can't say how much higher since Michael hasn't provided those numbers, but my ages-old 1660 Ti idles at around 8W, and while I'm simply scrolling a static web page in Firefox it jumps to around 25W, i.e. an extra 17W. Taking everything into consideration, this GPU may very well average 60W, and even just 20W of extra draw for 8 hours a day works out to 8h * 20W * 365 = 58.4 kWh a year, which is a considerable sum of money in many countries of the world. My refrigerator is probably more power efficient, and I open it at least a couple of times every day.
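    For what it's worth, here is that arithmetic as a quick sketch; the 20W delta, the 8 hours of use per day, and the electricity price are assumptions, not measured numbers:

    ```python
    # Back-of-the-envelope yearly cost of ~20W of extra idle draw.
    # All inputs are assumptions: adjust the hours, the delta and the
    # tariff (here a hypothetical 0.30 per kWh) for your own situation.
    extra_watts = 20        # extra idle draw vs. a more efficient card
    hours_per_day = 8
    days_per_year = 365
    price_per_kwh = 0.30    # hypothetical electricity price

    kwh_per_year = extra_watts * hours_per_day * days_per_year / 1000
    print(f"{kwh_per_year:.1f} kWh/year, ~{kwh_per_year * price_per_kwh:.2f} per year")
    # -> 58.4 kWh/year, ~17.52 per year
    ```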


    • #3
      Where are the encoding, decoding, and transcoding benchmarks? That's where these cards should shine, no? With all the extra hardware codec blocks and QuickSync support, these should be transcoding beasts. Would be interesting to see how these compare to CPU encoders (with and without QuickSync) and GPUs from Nvidia and AMD for H.264, H.265, AV1, VP8, VP9, and so on.

      Would an A380 be good enough for a Plex/Jellyfin/whatever media server, or would you need an A580? How many simultaneous transcoding streams do they support at 1080p? At 2160p? At 4320p? Where's the resolution cut-over for faster-than-real-time transcoding? Which codec works best for each resolution? (A rough way to probe this on your own hardware is sketched at the end of this post.)

      Thankfully, all my end-points support H.264 and H.265 natively, so I don't have to transcode on my Plex server; otherwise my lowly GeForce GT 730 wouldn't cut it. It would be nice to replace it with something that supports hardware transcoding, just in case something needs it in the future. I don't want to spend $1000 on a high-end gaming GPU just to get good-enough transcoding performance, though.
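      For anyone who wants a rough number on their own box before buying, here's a minimal sketch (mine, not from the article) that times a single hardware transcode through ffmpeg's VA-API path; it assumes an ffmpeg build with VA-API support, the Arc card at /dev/dri/renderD128, and a local test clip named input.mkv. Launch several copies in parallel to find where simultaneous 1080p streams stop being faster than real time.

      ```python
      #!/usr/bin/env python3
      # Time one hardware transcode via ffmpeg's VA-API path (hypothetical setup:
      # VA-API-enabled ffmpeg, Arc GPU at /dev/dri/renderD128, local input.mkv).
      import subprocess
      import time

      cmd = [
          "ffmpeg", "-hide_banner", "-y",
          "-hwaccel", "vaapi",
          "-hwaccel_device", "/dev/dri/renderD128",
          "-hwaccel_output_format", "vaapi",
          "-i", "input.mkv",
          "-c:v", "hevc_vaapi",        # also try h264_vaapi / av1_vaapi if exposed
          "-f", "null", "-",           # encode but discard the output container
      ]

      start = time.time()
      subprocess.run(cmd, check=True)
      print(f"transcode finished in {time.time() - start:.1f} s")
      ```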


      • #4
        Great. Now please put that in some decent laptops.


        • #5
          Originally posted by sarmad:
          Great. Now please put that in some decent laptops.
          You must have missed the parts of the article talking about idle power consumption and efficiency. No, I would never want to see this in a laptop.


          • #6
            Originally posted by avis:

            That's just bad [...] my ages-old 1660 Ti idles at around 8W, and while I'm simply scrolling a static web page in Firefox it jumps to around 25W, i.e. an extra 17W.
            The Arc A770 can idle at around 4W when ASPM L1 is enabled in the motherboard BIOS and the monitor refresh rate is below 100 Hz.
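            If anyone wants to verify that ASPM is actually active on their setup, a quick sketch along these lines works; the PCI address is a placeholder, look yours up with lspci:

            ```python
            # Print the kernel's ASPM policy and, if exposed, the per-device L1 state.
            from pathlib import Path

            policy_file = Path("/sys/module/pcie_aspm/parameters/policy")
            if policy_file.exists():
                # active policy is shown in [brackets], e.g. "default [powersave] performance"
                print("kernel ASPM policy:", policy_file.read_text().strip())

            # 0000:03:00.0 is a placeholder -- find the GPU's address with lspci
            # and substitute it here.
            l1 = Path("/sys/bus/pci/devices/0000:03:00.0/link/l1_aspm")
            if l1.exists():
                print("L1 ASPM enabled:", l1.read_text().strip())
            else:
                print("no per-device L1 ASPM control exposed at that address")
            ```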
            Last edited by RejectModernity; 17 October 2023, 02:49 PM.


            • #7
              Do these cards support VP9 hardware decoding using VA-API on Linux?


              • #8
                Originally posted by Random_Jerk:
                Do these cards support VP9 hardware decoding using VA-API on Linux?
                Yes, they do.
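                One way to confirm it on your own system (a minimal sketch, assuming vainfo from libva-utils and the intel-media-driver are installed) is to look for the VP9 decode entrypoints the driver exposes:

                ```python
                # List the VP9 profiles/entrypoints reported by the VA-API driver.
                # Expect VAProfileVP9Profile0 with VAEntrypointVLD for hardware decode.
                import subprocess

                out = subprocess.run(["vainfo"], capture_output=True, text=True)
                vp9 = [line.strip() for line in out.stdout.splitlines() if "VP9" in line]
                print("\n".join(vp9) if vp9 else "no VP9 entrypoints reported")
                ```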


                • #9
                  Originally posted by Random_Jerk:
                  Do these cards support VP9 hardware decoding using VA-API on Linux?
                  Absolutely.


                  • #10
                    Originally posted by avis:

                    That's just bad [...] My refrigerator is probably more power efficient, and I open it at least a couple of times every day.
                    Huh? 🤔
