Intel Announces Arc Pro A-Series Professional GPUs


  • #21
    Originally posted by Eirikr1848 View Post
    Huh, so much for the rumors of Intel cancelling the Arc line!
    There were no such rumors. What's being canceled is production by some AIB partners, which to my understanding is not a rumor.



    • #22
      Originally posted by erniv2 View Post
      So if it is DX12.2 compatible, why not with DX9 or DX11?
      The currently available DX12 drivers are considered to achieve reasonable (if not as good as some hoped) results with the available GPUs, while the DX9/11 paths are considered to perform very poorly. That is (apparently) why the benchmark results that have been posted are all over the place: if a game uses DX12 or Vulkan the results look much, much better than if it uses DX9/11. The claim by some is that Intel did not fully appreciate the limitations of their existing DX9/11 drivers, and will now have to work to improve them, and that takes time (a lot of time).



      • #23
        Originally posted by Setif View Post
        Peak Performance (Single Precision):
        Intel Arc Pro A40 GPU: 3.50 TFLOPs
        Intel Arc Pro A50 GPU: 4.80 TFLOPs
        Intel Arc Pro A30M GPU (Mobile): 3.50 TFLOPs
        No word about double precision performance; it seems not to be supported!
        There are "definite flops"®©™ in the Intel GPU.



        • #24
          Originally posted by piotrj3 View Post
          75W dual slot card? Why dual slot for that power?
          Probably "Intel watts", without boost clocks taken into account.

          Those cards might be useful if you don't need fluid 3D graphics and the price is really low.



          • #25
            Originally posted by CommunityMember View Post

            The currently available DX12 drivers are considered to achieve reasonable (if not as good as some hoped) results with the available GPUs, while the DX9/11 paths are considered to perform very poorly. That is (apparently) why the benchmark results that have been posted are all over the place: if a game uses DX12 or Vulkan the results look much, much better than if it uses DX9/11. The claim by some is that Intel did not fully appreciate the limitations of their existing DX9/11 drivers, and will now have to work to improve them, and that takes time (a lot of time).
            Like I said, it's a software issue.

            And since this is a Linux-based forum, we all start our Proton-based Steam clients and run the old games on DXVK. Oh, praise Intel for investing time into optimising Vulkan performance.
            Sorry, I just couldn't resist.
            Last edited by erniv2; 08 August 2022, 03:22 PM.
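
            For anyone who actually wants to try that, a minimal sketch of what it looks like in practice, assuming a game with a Proton compatibility tool selected in Steam; DXVK_HUD and PROTON_LOG are standard DXVK/Proton environment variables, and %command% is how Steam launch options pass through the original game command:

            # Steam launch options for an old DX9/DX11 title running through Proton.
            # The HUD overlay confirms the D3D calls really go through DXVK's Vulkan path:
            DXVK_HUD=devinfo,fps %command%

            # Alternatively, write a Proton log (steam-<appid>.log in $HOME) and grep it for dxvk:
            PROTON_LOG=1 %command%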



            • #26
              Low GPU memory if it's intended for Blender usage; 6 GB is filled fast when rendering.



              • #27
                Originally posted by erniv2 View Post
                Like I said, it's a software issue.
                Not so sure; at least their low frame-rate dips seem to be hardware related and only get a little better with rBAR (maybe not enough memory bandwidth, or worse compression than the others?).

                I also believe they have built the hardware with modern APIs in mind and it might not be that well suited for older APIs, but pre-12 DirectX is really complex and it might just be a driver issue (look how long AMD has struggled to compete against Nvidia on old APIs).
                If I were Intel I would have used one of those million X-to-Vulkan translation layers and just implemented a Vulkan API in hardware.



                • #28
                  Originally posted by Anux View Post
                  Not so sure; at least their low frame-rate dips seem to be hardware related and only get a little better with rBAR (maybe not enough memory bandwidth, or worse compression than the others?).

                  I also believe they have built the hardware with modern APIs in mind and it might not be that well suited for older APIs, but pre-12 DirectX is really complex and it might just be a driver issue (look how long AMD has struggled to compete against Nvidia on old APIs).
                  If I were Intel I would have used one of those million X-to-Vulkan translation layers and just implemented a Vulkan API in hardware.


                  Maybe Intel is secretly working to get Proton ported to Windows… run all those old DX9–11 games on Vulkan.

                  That being said: Linux performance should be just fine – right? The biggest issue is Windows driver perf?

                  Originally posted by CommunityMember View Post

                  The currently available DX12 drivers are considered to achieve reasonable (if not as good as some hoped) results with the available GPUs, while the DX9/11 paths are considered to perform very poorly. That is (apparently) why the benchmark results that have been posted are all over the place: if a game uses DX12 or Vulkan the results look much, much better than if it uses DX9/11. The claim by some is that Intel did not fully appreciate the limitations of their existing DX9/11 drivers, and will now have to work to improve them, and that takes time (a lot of time).


                  I mean, they have been making integrated GPU drivers for so long that I wonder if they never fully appreciated the software limitations of their drivers. Kind of curious now, on the Windows side, whether any DX9-11 performance improvements end up applying retroactively to their older products!

                  Originally posted by schmidtbag View Post
                  There were no such rumors. What's being canceled is production by some AIB partners, which to my understanding is not a rumor.

                  Perhaps on this side, no. But on Reddit? Based on Igor’s Lab, Videocardz, and Anandtech articles over the last few weeks, plus Intel earnings reports, the rumor was that Alchemist was facing imminent cancellation.
                  It was a pleasant surprise this morning to see this article published and I am looking forward to the reviews.
                  Originally posted by erniv2 View Post

                  No, I'm thinking of something more complicated again: dGPU to dGPU, do the video processing on the Arc card and output from an RTX 30 card, so the power-hungry card sits idle and the baby card does the work. It would be the second card in the crippled full-length slot that only gets 4 lanes.


                  In that case, libopenshot will still let you choose a GPU. As for choosing a default GPU for VA-API?

                  Once you get your card you can try out vainfo to make sure the DRI_PRIME card number is correct, then maybe put it in your environment variables (though someone more knowledgeable than me can chime in if that is not best practice):

                  DRI_PRIME=1 LIBVA_DRIVER_NAME=i915 vainfo

                  I was a Cyrix + Matrox boy who grew into an Apple + nVidia teen then into an AMD man… so I’m not really “in the know” when it comes to Intel stuff, but this *seems* at least like it should work with vainfo to make sure you are “offloading” to the correct card.
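
                  A slightly more explicit version of that check, as a sketch: it assumes the Arc card shows up as a second render node (renderD129 below is a guess, check your own /dev/dri), that the intel-media-driver (iHD) is the VA-API driver Alchemist wants, and that your vainfo build accepts --display/--device for picking the node:

                  # List the DRM render nodes; on a two-GPU system the Arc card is typically the second one.
                  ls -l /dev/dri/

                  # Query VA-API profiles on a specific node (adjust renderD129 to whatever your system shows).
                  LIBVA_DRIVER_NAME=iHD vainfo --display drm --device /dev/dri/renderD129

                  # If the expected decode/encode profiles show up, export the variable for your session.
                  export LIBVA_DRIVER_NAME=iHD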




                  • #29
                    Originally posted by schmidtbag View Post
                    There were no such rumors. What's being canceled is production by some AIB partners, which to my understanding is not a rumor.
                    There are rumors that Intel is considering dropping its consumer cards and sticking with only integrated graphics plus compute/AI cards for the professional market.

                    However, it's important to note that the rumor is only that Intel is investigating what it wants to do going forward. No decisions have been made yet. Also, it would certainly not affect Arc at all, as those chips are all basically already done and sitting in storage. It would potentially affect Battlemage, or maybe not even until their 3rd-gen Celestial cards.



                    • #30
                      Originally posted by Eirikr1848 View Post
                      That being said: Linux performance should be just fine – right? The biggest issue is Windows driver perf?
                      I haven't seen any Linux benchmarks; Phoronix would probably be the first place to look. But even the Windows benchmarks are from cards that were snatched out of Asia, and Michael probably won't get any card from Intel in the near future.

                      It would be really sad if Intel stopped their efforts. Yes, the first iteration is probably shitty, but give them two generations and they should close the gap and fix their problems. Competition is dearly needed. Those old games will run fine once the graphics cards are more powerful: most current engines already support Vulkan and/or DX12, the old games don't need today's high-end cards, and bad drivers can be mitigated with more raw power.

