Intel Arc Graphics A750 + A770 Linux Gaming Performance


  • #11
    Among the few unimplemented features is the Vulkan sparse support needed for some VKD3D-Proton games like DIRT 5, Deathloop, Assassin's Creed: Valhalla, Forza Horizon 4/5, and other modern titles.
    This seems like a rather important shortcoming to me, as it effectively prevents many of the newer games from running on Linux.
    I was hoping that Intel would have fixed it in time for the Arc launch, but since that has not happened, I would prefer it if someone with significant reach called them out on it.

    Comment


    • #12
      Michael, since you already calculated performance per dollar for all individual tests, could you also add a geometric mean of performance per dollar next time you do such an analysis?
      I know I can calculate it myself, but this would be a major convenience improvement.
      Thanks in advance.
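      In case it helps anyone doing this by hand in the meantime, here is a minimal sketch of the calculation in Python; the performance-per-dollar values below are made up, not taken from the article:

      ```python
      from math import prod

      # Hypothetical performance-per-dollar values from individual tests
      # (e.g. average FPS divided by the card's price) -- not real data.
      perf_per_dollar = [0.21, 0.34, 0.18, 0.27]

      # Geometric mean: the n-th root of the product of the n values.
      geo_mean = prod(perf_per_dollar) ** (1 / len(perf_per_dollar))
      print(f"Geometric mean of perf-per-dollar: {geo_mean:.3f}")
      ```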

      Comment


      • #13
        I wouldn't call these a great buy. Right now the 6600 XT is faster than the A770, and is usually around $300 new. On eBay the 6600 XT can be found used for $250 and sometimes even less. The RTX 3060 is just a little below $300 off eBay used. Maybe if the A750 was $200 and the A770 was $250 I'd be all for it, but at nearly $300 you have far better choices.

        Comment


        • #14
          Power efficiency is low according to Windows reviews.

          The A770 draws about as much as a 3070, while the A750 draws as much as a 3060 Ti under load, which means the task energy is generally higher, even in games that favour it, like Forza Horizon 5.
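          To spell out what I mean by task energy, here is a rough sketch; all of the numbers are hypothetical, not measurements:

          ```python
          # Task energy = average power draw x time to finish the task.
          # A card that draws more but isn't proportionally faster uses more
          # energy per task. Numbers below are made up, not measured.
          def task_energy_wh(avg_power_w: float, runtime_s: float) -> float:
              return avg_power_w * runtime_s / 3600.0  # watt-seconds -> watt-hours

          print(task_energy_wh(220, 100))  # ~6.1 Wh for a 220 W card needing 100 s
          print(task_energy_wh(130, 150))  # ~5.4 Wh for a 130 W card needing 150 s
          ```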

          Comment


          • #15
            Michael, good tests, but where are the 1080p tests, especially for the Arc A750?

            Comment


            • #16
              Originally posted by Dukenukemx View Post
              I wouldn't call these a great buy. Right now the 6600 XT is faster than the A770, and is usually around $300 new. On eBay the 6600 XT can be found used for $250 and sometimes even less. The RTX 3060 is just a little below $300 off eBay used. Maybe if the A750 was $200 and the A770 was $250 I'd be all for it, but at nearly $300 you have far better choices.
              16 GB would be great for compute and 3D rendering, e.g. with Blender. Competitors like the 6600 XT have less VRAM. And personally, I wouldn't purchase an 8 GB VRAM GPU in 2022 anyway, even just for gaming, unless forced to. I learned my lesson with the 4 GB RX 480 I got in 2016, when the 8 GB version was available.

              Originally posted by brucethemoose View Post
              Power efficiency is low according to Windows reviews.

              The A770 draws about as much as a 3070, while the A750 draws as much as a 3060 Ti under load, which means the task energy is generally higher, even in games that favour it, like Forza Horizon 5.
              Power efficiency under load is probably not too big a problem, and is likely dependent on the default clock settings. More concerning is the idle/desktop consumption, which is where the GPU will spend most of its time in most use cases, even professional ones.
              Last edited by Solid State Brain; 05 October 2022, 10:39 AM.

              Comment


              • #17
                Michael, can you confirm SR-IOV support? That feature alone will dictate whether I buy a 16 GB A770 or pass on this gen.
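                For anyone who gets a card before that's officially confirmed, one rough way to check on Linux is to look for the sriov_totalvfs attribute that PCI devices with SR-IOV expose in sysfs; this only reflects what the installed kernel/driver actually enables, so it's a sanity check rather than a definitive answer:

                ```python
                from pathlib import Path

                # List PCI devices that advertise SR-IOV virtual functions via sysfs.
                # Only devices whose driver enables the capability expose this file.
                for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
                    vfs = dev / "sriov_totalvfs"
                    if vfs.exists():
                        print(f"{dev.name}: up to {vfs.read_text().strip()} VFs")
                ```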

                Comment


                • #18
                  Originally posted by Dukenukemx View Post
                  I wouldn't call these a great buy. Right now the 6600 XT is faster than the A770, and is usually around $300 new. On eBay the 6600 XT can be found used for $250 and sometimes even less. The RTX 3060 is just a little below $300 off eBay used. Maybe if the A750 was $200 and the A770 was $250 I'd be all for it, but at nearly $300 you have far better choices.
                  The A700 series uses a larger die than the RX 6600 XT (406 mm² vs 237 mm²).

                  So Intel probably has a higher cost floor... but yeah, they should undercut AMD/Nvidia even more if they want to gain market share.


                  *Hopefully* this big die means Arc has more driver optimization headroom, like the AMD HD 7950/7970 (which were slower than the smaller-die GTX 680 at launch, but much faster years later).

                  Comment


                  • #19
                    This is not a great product from Intel for my needs, but I am happy to have another competitor making video cards for a Linux nerd like me.
                    I cannot consider purchasing one of these cards until H1'2023, when Linux distributions should have more mature out-of-the-box support.
                    I am more excited to see what Intel can do with generations 2 and 3 of its video cards.

                    Hopefully Intel knows they need to greatly improve their out-of-the-box support in Linux distributions. Not everyone who uses Linux is a greybeard-level Linux user.

                    Comment


                    • #20
                      Interested in Blender benchmarks. OpenData doesn't really have anything for these cards currently, so I'd love to know how they stack up against NVIDIA.

                      Comment
