
Intel Announces Arc Pro A-Series Professional GPUs


  • #41
    Originally posted by Setif View Post
    No word about Double Precision performance, seems not supported!
    Double precision is mandatory for OpenGL 4.x conformance. Gaming-oriented GPUs typically implement it at anywhere from 1/8th to 1/32nd of fp32 performance.

    Up to and including Gen9, I think Intel GPUs implemented fp64 at half of fp32 throughput. That made some of their Iris models, which had 48 or even 72 EUs, rather interesting for certain compute workloads!
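    To put those ratios in perspective, here is a minimal Python sketch of how the fp64 figure falls out of the fp32 number; the 2.0 TFLOPS starting point is just an illustrative placeholder, not a measured Arc Pro spec.

```python
# Rough fp64 throughput estimates derived from an fp32 figure.
# The fp32 number is an illustrative placeholder, not a real spec.
fp32_tflops = 2.0

ratios = {
    "1:2  (older Intel iGPUs)": 1 / 2,
    "1:8  (gaming GPU, generous)": 1 / 8,
    "1:32 (gaming GPU, typical)": 1 / 32,
}

for label, ratio in ratios.items():
    print(f"{label}: ~{fp32_tflops * ratio:.3f} fp64 TFLOPS")
```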



    • #42
      Originally posted by RedEyed View Post
      Specs are not impressive at all
      Cards in this performance/power range never are. AMD and Nvidia also sell low-powered cards. You just don't tend to hear as much about them.

      Look at some of these: https://www.nvidia.com/en-us/design-...ktop-graphics/
      • T400 uses 30 W and provides only 1.1 fp32 TFLOPS
      • T600 uses 40 W and provides only 1.7 fp32 TFLOPS
      • T1000 uses 50 W and provides only 2.5 fp32 TFLOPS

      AMD does the same: https://www.amd.com/en/graphics/workstations
      • Radeon Pro W6400 uses 50 W and provides 3.5 fp32 TFLOPS
      • Radeon Pro W6600 uses up to 130 W and provides 10.4 fp32 TFLOPS
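      Putting those quoted figures side by side as perf-per-watt makes the point clearer; here is a quick Python sketch using only the numbers listed above:

```python
# fp32 TFLOPS per watt for the boards quoted above
# (board power in W, throughput in fp32 TFLOPS).
cards = {
    "NVIDIA T400": (30, 1.1),
    "NVIDIA T600": (40, 1.7),
    "NVIDIA T1000": (50, 2.5),
    "AMD Radeon Pro W6400": (50, 3.5),
    "AMD Radeon Pro W6600": (130, 10.4),
}

for name, (watts, tflops) in cards.items():
    print(f"{name}: {tflops / watts:.3f} TFLOPS/W")
```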



      • #43
        Originally posted by erniv2 View Post
        If you scroll down you can see those fat DX12 Ultimate, OpenCL, and Vulkan logos, so if it's DX12.2 compatible, why not DX9 or DX11?
        My understanding is that DX12 is nearly as different from DX11 as Vulkan is from OpenGL. So, DX12 support doesn't automatically confer good support for DX11 or earlier.

        Originally posted by CommunityMember View Post
        the DX9/11 paths are considered to perform very poorly. That is (apparently) why the benchmark results that have been posted are all over the place: if the game uses DX12 or Vulkan, the results look much better than if it uses DX9/11. The claim by some is that Intel did not fully appreciate the limitations of their existing drivers for DX9/11, and will now have to work to improve them, and that takes time (a lot of time).
        From what I've heard, a lot of work goes into tuning drivers to work well with specific games.

        In simple cases, GPU vendors will profile certain games (usually AAA titles) running on their GPUs and tune their drivers to optimize the bottlenecks they hit.

        In some of the more extreme cases, this amounts to GPU vendors detecting when a given game is running, and actually replacing some of its shader code with hand-optimized versions developed by the GPU vendor.
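        As a purely illustrative sketch (nothing here reflects how any real driver is written, and every name below is made up), the "detect the game, swap in a tuned shader" idea boils down to something like this:

```python
# Toy illustration of per-application driver profiles: match the running
# executable's name against a table of hand-tuned replacements.
# All names and entries here are hypothetical.
GAME_PROFILES = {
    "some_aaa_title.exe": {
        "shader_replacements": {"water_reflections": "water_reflections_tuned"},
        "prefer_async_compute": False,
    },
}

def select_profile(executable_name: str) -> dict:
    """Return the per-game profile if the driver recognizes the application."""
    return GAME_PROFILES.get(executable_name.lower(), {})

# Usage: conceptually, the driver would do this once when the process starts.
profile = select_profile("Some_AAA_Title.exe")
print(profile.get("shader_replacements", {}))
```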
        Last edited by coder; 09 August 2022, 03:09 AM.



        • #44
          Originally posted by coder View Post
          Depends on whether it has an auxiliary power connector. PCIe is limited to just 75 W per slot. So, if it gets all of its power from the motherboard, then you know it's truly just a 75 W card.
          Apart from my comment not being meant seriously, I already had that in mind. At least their rendered GPU images don't show one, but I couldn't find real photos from the right angle.



          • #45
            I wonder when Intel will start shipping GPUs instead of talking about and announcing their GPUs.



            • #46
              Originally posted by tunnelblick View Post
              I wonder when Intel will start shipping GPUs instead of talking about and announcing their GPUs.
              Same day as Rust developers actually develop something or Wayland reaches an actual market share.
              Last edited by kpedersen; 09 August 2022, 05:56 AM.



              • #47
                Originally posted by kpedersen View Post

                Same day as Rust developers actually develop something or Wayland reaches an actual market share.
                Someone's salty



                • #48
                  Originally posted by erniv2 View Post

                  Intel® Arc™ Pro A-Series Graphics for Workstations

                  If you scroll down you can see those fat DX12 Ultimate, OpenCL, and Vulkan logos, so if it's DX12.2 compatible, why not DX9 or DX11?
                  Because the APIs up to DX11 / OpenGL are terrible APIs, especially to implement drivers for. Although deceptively simple, they create a lot of "magic" behavior that has to be implemented in the driver. On top of this, different games abuse the APIs in different ways, which essentially means that if you want very good performance across the board you need to apply per-game optimizations (this is what Nvidia/AMD have been doing for decades, and it is what Nvidia's Game Ready drivers are about). Obviously, since Intel is just starting out, it hasn't had the time to do all of that work.

                  On the other hand, DX12/Vulkan are much lower-level, more principled, better-specified APIs, which are not only much easier to develop drivers for but also let game engines squeeze much more performance out of the GPU. This is why Arc's DX12/Vulkan performance is comparatively so much better. It's definitely possible for Intel to retroactively optimize DX11/OpenGL-era games to run much better on current GPUs, but that will take a long time.
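                  To illustrate that contrast in spirit only (toy Python, not any real graphics API), an implicit-style API leaves the driver to track hidden state and infer synchronization on every draw, while an explicit-style API has the application record commands and declare its own dependencies:

```python
# Toy contrast between an "implicit" GL/DX11-style model and an
# "explicit" Vulkan/DX12-style model. All classes and names are invented.

class ImplicitDriver:
    """The driver owns hidden global state and must infer hazards itself."""
    def __init__(self):
        self.bound_texture = None

    def bind_texture(self, tex):
        self.bound_texture = tex  # hidden state the app never spells out again

    def draw(self, mesh):
        # Per draw call the driver validates state, infers synchronization,
        # and may even recompile shaders behind the application's back.
        print(f"draw {mesh} with {self.bound_texture} (driver infers sync)")


class ExplicitCommandBuffer:
    """The application records commands and states its own dependencies."""
    def __init__(self):
        self.commands = []

    def record_draw(self, mesh, texture, wait_for=None):
        # Resources and synchronization are spelled out up front,
        # so the driver can translate commands almost directly.
        self.commands.append((mesh, texture, wait_for))

    def submit(self):
        for mesh, texture, dep in self.commands:
            print(f"draw {mesh} with {texture} (app declared dep: {dep})")


legacy = ImplicitDriver()
legacy.bind_texture("brick")
legacy.draw("wall")

modern = ExplicitCommandBuffer()
modern.record_draw("wall", "brick", wait_for="texture_upload")
modern.submit()
```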
                  Last edited by mdedetrich; 09 August 2022, 05:33 PM.



                  • #49
                    Originally posted by CommunityMember View Post
                    The currently available DX12 drivers are considered to achieve reasonable (if not as good as some hoped) results with the available GPUs, while the DX9/11 paths are considered to perform very poorly. That is (apparently) why the benchmark results that have been posted are all over the place: if the game uses DX12 or Vulkan, the results look much better than if it uses DX9/11. The claim by some is that Intel did not fully appreciate the limitations of their existing drivers for DX9/11, and will now have to work to improve them, and that takes time (a lot of time).
                    I wonder how much of that has to do with the latency and management differences between system RAM and video RAM. I imagine a lot of the tricks and strategies used for integrated graphics don't transfer well.

                    Originally posted by Quackdoc View Post
                    I wonder if these will support SR-IOV
                    Good question. It could be a nice feature for some, especially at this price.
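                    For anyone who wants to check once the cards are in hand, Linux exposes SR-IOV capability through sysfs; here is a minimal sketch (the PCI address is just a placeholder):

```python
# Check whether a PCI GPU advertises SR-IOV by reading sysfs on Linux.
# Replace the address with the card's actual domain:bus:device.function
# (e.g. as shown by lspci -D).
from pathlib import Path

pci_addr = "0000:03:00.0"  # placeholder address
vf_file = Path(f"/sys/bus/pci/devices/{pci_addr}/sriov_totalvfs")

if vf_file.exists():
    print(f"SR-IOV capable: up to {vf_file.read_text().strip()} virtual functions")
else:
    print("No SR-IOV capability exposed for this device")
```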



                    • #50
                      Originally posted by mlau View Post

                      Someone's salty
                      The saltiest troll, this side of the pond!
