Intel Arc Graphics A750 + A770 Linux Gaming Performance


  • #41
    We need more Direct3D 9 and Direct3D 11 benchmarks via DXVK. Also, some Gallium Nine benchmarks would be interesting. The Windows drivers are supposed to be terrible for Direct3D 9 and less than ideal for Direct3D 11, so comparisons against Windows would be extremely interesting (a quick way to launch a game through DXVK with its fps HUD is sketched after this comment).

    Comment
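
    A minimal sketch of how such D3D9/D3D11 numbers could be gathered on Linux: launch the game's Windows binary through Wine with DXVK installed in the prefix and DXVK's built-in HUD enabled. The prefix path and game.exe below are placeholders, and this assumes DXVK is already set up in that prefix.

    Code:
    # Minimal sketch: run a D3D9/D3D11 title through Wine+DXVK with the fps/frametime HUD.
    # The prefix path and game.exe are placeholders; assumes DXVK is installed in the prefix.
    import os
    import subprocess

    env = dict(os.environ)
    env["WINEPREFIX"] = os.path.expanduser("~/games/dxvk-prefix")  # placeholder prefix
    env["DXVK_HUD"] = "fps,frametimes"                             # DXVK's built-in overlay

    subprocess.run(["wine", "game.exe"], env=env, check=True)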


    • #42
      Originally posted by ryao View Post
      We need more Direct3D 9 and Direct3D 11 benchmarks via DXVK. Also, some Gallium 9 benchmarks would be interesting. The Windows drivers are supposed to be terrible for Direct3D 9 and less than ideal for Direct3D 11, so comparisons against Windows would be extremely interesting.
      They're using a translation layer to DX12 on Windows, which is similar to DXVK. I don't recall seeing any major performance differences between DXVK and d912pxy or dgVoodoo2's DX11/DX12 translation layers. The upside of using a DX12 translation layer is Windows 11 AutoHDR, which of course isn't supported when using Vulkan. It's the main reason I use dgVoodoo2 for many of my old DX9 games.

      dgVoodoo2 - Glide/DirectX implementation on D3D11/12: https://github.com/dege-diosg/dgVoodoo2

      Comment


      • #43
        Originally posted by WannaBeOCer View Post

        They're using a translation layer to DX12 on Windows, which is similar to DXVK. I don't recall seeing any major performance differences between DXVK and d912pxy or dgVoodoo2's DX11/DX12 translation layers. The upside of using a DX12 translation layer is Windows 11 AutoHDR, which of course isn't supported when using Vulkan. It's the main reason I use dgVoodoo2 for many of my old DX9 games.

        https://github.com/dege-diosg/dgVoodoo2
        There are reports that performance in D3D9 games is 1/3 to 1/2 that of the 3060. I would not expect performance to be that far behind the 3060 in D3D9 unless the translation layer is not very performant. DXVK is supposed to give at least 80% of native performance, and in many cases it can equal or even exceed native performance, especially for D3D9 games. I suspect it would do much better here (rough numbers are sketched after this comment).
        Last edited by ryao; 05 October 2022, 05:53 PM.

        Comment
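
        To put those ratios side by side, a back-of-the-envelope sketch. The 120 fps baseline is purely illustrative, not a measured figure; it only shows what "1/3 to 1/2 of a 3060" versus "at least 80% of native" would mean for the same title.

        Code:
        # Illustrative arithmetic only: the 120 fps baseline is made up, not benchmark data.
        rtx3060_native_fps = 120.0

        # Reported range for Arc's D3D9-on-D3D12 translation path: 1/3 to 1/2 of the 3060.
        layer_low = rtx3060_native_fps / 3.0
        layer_high = rtx3060_native_fps / 2.0

        # DXVK is commonly cited as retaining at least ~80% of native performance.
        dxvk_floor = rtx3060_native_fps * 0.80

        print(f"D3D12 translation layer: {layer_low:.0f}-{layer_high:.0f} fps")
        print(f"80% DXVK floor on the same baseline: {dxvk_floor:.0f} fps")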


        • #44
          My Vega 64 is similar in performance to the Intel Arc A770... and the Vega 64 is on eBay for 250€.

          And as for the compute stack, the Vega 64 is supported via ROCm/HIP in Blender 3.3.

          A plus for the Intel A770 is its 16 GB of VRAM and ray-tracing support.

          But it is funny that Intel calls the A770 high-end when in reality it is a low-end GPU.

          Comment


          • #45
            Originally posted by schmidtbag View Post
            I'm curious how it performs with and without ReBAR.
            Yeah, me too. I have an old crusty machine and it doesn't do ReBAR. Someday I'll upgrade, but not now. I might think about slotting in a graphics adapter, though. The old 750 Ti is getting a bit long in the tooth these days. I do have a PCIe 3.0 x16 slot, look out! Maybe I'll just get that CPU upgrade I always wanted? (There's a quick ReBAR check sketched after this comment.)

            Comment
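
            For anyone wondering whether their box exposes Resizable BAR at all, a minimal sketch that just filters lspci -vv output for the Resizable BAR capability string. It assumes pciutils is installed, and some systems only print the full capability dump when run as root.

            Code:
            # Minimal sketch: report PCI devices advertising the Resizable BAR capability.
            # Assumes the pciutils `lspci` tool is installed; may need root for full output.
            import subprocess

            def rebar_lines() -> list[str]:
                out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
                return [line.strip() for line in out.splitlines() if "Resizable BAR" in line]

            hits = rebar_lines()
            if hits:
                print("Resizable BAR capability reported:")
                for line in hits:
                    print("  " + line)
            else:
                print("No Resizable BAR capability found (or run lspci as root).")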


            • #46
              Originally posted by qarium View Post
              My Vega 64 is similar in performance to the Intel Arc A770... and the Vega 64 is on eBay for 250€.

              And as for the compute stack, the Vega 64 is supported via ROCm/HIP in Blender 3.3.

              A plus for the Intel A770 is its 16 GB of VRAM and ray-tracing support.

              But it is funny that Intel calls the A770 high-end when in reality it is a low-end GPU.
              I don't understand why you keep talking about your RX Vega 64… it's about 20-30% slower than the A770 in modern titles. oneAPI is also better, and compared to ROCm there are more workshops to help you learn it quickly (there's a Blender device-selection sketch after this comment).

              From the linked Phoronix review: "With the amazing-looking Intel Arc A770, the blue team is making a push to offer a capable mid-range graphics card product at affordable pricing. Intel is including a lot of modern tech like AV1 video encode, hardware-accelerated ray tracing units and more on their newest release."



              Comment
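
              Since both posts above bring up Blender 3.3's GPU backends, here is a minimal sketch for Blender's Python console that selects a Cycles compute backend: "HIP" for an AMD card such as a Vega 64, or "ONEAPI" for an Intel Arc. It assumes a Blender 3.3 build with the corresponding backend compiled in and working drivers; it says nothing about which backend is faster.

              Code:
              # Minimal sketch for Blender 3.3's Python console: pick a Cycles GPU backend.
              # "HIP" targets AMD cards (e.g. a Vega 64), "ONEAPI" targets Intel Arc.
              import bpy

              prefs = bpy.context.preferences.addons["cycles"].preferences
              prefs.compute_device_type = "HIP"   # or "ONEAPI" for an Arc A750/A770
              prefs.get_devices()                 # refresh the detected device list

              for device in prefs.devices:
                  device.use = device.type in {"HIP", "ONEAPI"}
                  print(device.name, device.type, "enabled" if device.use else "disabled")

              bpy.context.scene.cycles.device = "GPU"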


              • #47
                these results are... wild I guess lol

                Comment


                • #48
                  Originally posted by brucethemoose View Post
                  Power efficiency is low according to Windows reviews.

                  The A770 draws about as much as a 3070, while the A750 draws as much as a 3060 Ti under load. That means the task energy is generally higher, even in games that favour it, like Forza Horizon 5.
                  Yeah, that's pretty bad. Cooler cases and more efficient heatsinks aren't free, either, and they might also produce higher noise levels (a quick energy-per-frame calculation is sketched after this comment).

                  Comment
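
                  "Task energy" here boils down to energy per frame: board power divided by frame rate. A tiny sketch with made-up placeholder numbers, just to show the calculation rather than any real review data.

                  Code:
                  # Illustrative only: energy per frame is board power divided by frame rate.
                  # The wattage/fps pairs below are placeholders, not measured results.
                  def joules_per_frame(board_power_w: float, fps: float) -> float:
                      return board_power_w / fps

                  cards = {
                      "hypothetical card A (225 W @ 90 fps)": (225.0, 90.0),
                      "hypothetical card B (200 W @ 100 fps)": (200.0, 100.0),
                  }

                  for name, (watts, fps) in cards.items():
                      print(f"{name}: {joules_per_frame(watts, fps):.2f} J per frame")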


                  • #49
                    Originally posted by brucethemoose View Post
                    Power efficiency is low according to Windows reviews.

                    The A770 draws about as much as a 3070, while the A750 draws as much as a 3060 Ti under load. That means the task energy is generally higher, even in games that favour it, like Forza Horizon 5.
                    Ars Technica has several benchmarks where the Arcs did abysmally compared to AMD and Nvidia as well. Many of them are older games, and in some cases, while the average or peak FPS was acceptable, the 1% low FPS was so bad as to make the games unplayable, causing stutters and hiccups - and don't even bother with ray tracing in most games on the Arc, despite the hardware being there. If you're buying these cards hoping to do ray tracing on the cheap, save your money for a next-generation card from Nvidia or perhaps AMD, because these aren't it. The problem with Michael's benchmarks is that they don't say much about how playable the games really are; like most automated benchmarks, they're just bare numbers without a quality-of-experience assessment.

                    Ars also points out several problems with the UX of the Windows drivers (for those like me who game in both Windows and Linux); some are surprisingly annoying oversights that shouldn't be happening. Granted, that's all on the Windows side, so like Michael said, if you don't mind spending $350 to play with first-generation hardware and all the 'adventures' that brings with it, either in Windows or Linux, go for it. But if you want to just get on with it without hassles, like most of us, wait for AMD (upcoming releases) or Nvidia (to come down in price). In short, save your money and go with a proven solution.

                    Someone earlier asked how long it takes to get decent drivers. In Intel's case it's been decades and their GPU drivers are still crap. Maybe they should hire some actual gamers into management positions in the GPU department to tell them what customers do and don't want - these cards are marketed to gamers and media content creators, after all, yet Intel still can't come up with an acceptable experience for the very users it's so proud to embrace. Just marketing drivel.

                    Edit to add: I don't remember if Michael mentioned this, but do NOT buy these cards unless you have a 10th-gen Intel CPU or Ryzen 3000-series or newer. They depend on Resizable BAR support in these recent platforms to achieve the performance they do have; without it they're practically unusable!
                    Last edited by stormcrow; 05 October 2022, 07:28 PM.

                    Comment


                    • #50
                      What I really want one for is the AV1 hardware encoder, because, IIRC, the hardware encoder was stripped from the RX 6500 (an example FFmpeg invocation is sketched after this comment).

                      Comment
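
                      A sketch of handing an AV1 encode to the Arc's hardware through FFmpeg's VAAPI path. The render node, the file names, and the availability of the av1_vaapi encoder are assumptions; check `ffmpeg -encoders | grep av1` to see what your build actually offers (av1_qsv is the other common Intel route).

                      Code:
                      # Sketch: hardware AV1 encode on an Arc card via FFmpeg's VAAPI path.
                      # Assumes an FFmpeg build with the av1_vaapi encoder and a driver stack
                      # exposing AV1 encode; file names and render node are placeholders.
                      import subprocess

                      cmd = [
                          "ffmpeg",
                          "-vaapi_device", "/dev/dri/renderD128",  # render node; may differ per system
                          "-i", "input.mkv",                       # placeholder input file
                          "-vf", "format=nv12,hwupload",           # upload frames to the GPU
                          "-c:v", "av1_vaapi",
                          "output_av1.mkv",                        # placeholder output file
                      ]
                      subprocess.run(cmd, check=True)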
