Windows 11 vs. Linux Benchmarks For Intel Arc B-Series "Battlemage" Show Strengths & Weaknesses

  • the-burrito-triangle
    Phoronix Member
    • Jul 2024
    • 80

    #31
    I feel like a broken record at this point, but these benchmarks show exactly what I see when profiling my own Intel GPUs on Linux: OpenGL greatly outperforms Vulkan (I also have multiple cases where this is true for AMD's RDNA 3). The Mesa Vulkan driver used for Intel GPUs is disappointing, while the OpenGL driver is in surprisingly good shape (so good, in fact, that it outperforms Intel's Win11 driver). I keep reading about people saying Vulkan is "superior" and "performs better", but the benchmarks speak for themselves.

    Vulkan draws more power and produces lower FPS. This is a sad reality for every benchmark/game/wrapper combo I've thrown at an Intel Xe GPU. How the hell is OpenGL outperforming it? Decades of driver optimizations and vendor extensions? Or did Intel half-ass the hardware and only support the bare minimum spec to claim "compliance"? I honestly question whether the Mesa Vulkan driver is still learning to walk, or whether Intel's HW is the real problem...
    Last edited by the-burrito-triangle; 23 December 2024, 09:17 PM.


    • ElectricPrism
      Senior Member
      • Apr 2013
      • 1271

      #32
      If you value your time and play any games, AMD on Linux is just better.

      Notice current Linux Gamer polling:

      54.13% AMD
      42.32% Nvidia
      3.50% Intel

      https://gamingonlinux.com/users/stat...#GPUVendor-top
      After decades of observing Nvidia on the Internet, including the lawsuits over fraud and false advertising, the under-the-table deals with benchmark software to stack the deck, and the plain stupid pricing, it's clear that nvidia-bots and people with a financial interest have been spreading pro-Nvidia, anti-Intel, anti-AMD shill propaganda on the net for quite some time.

      The first-hand reality is different, at least on Linux, and depends on what your computer needs to do.

      I am excited and pleasantly surprised about Intel taking a bite out of Nvidia & AMD and providing some really great open source drivers.

      Novida is always Susvida.


      • Quackdoc
        Senior Member
        • Oct 2020
        • 5072

        #33
        Originally posted by ElectricPrism View Post
        If you value your time and play any games, AMD on Linux is just better.

        Notice current Linux Gamer polling:



        After decades of observing Nvidia on the Internet, including the lawsuits over fraud and false advertising, the under-the-table deals with benchmark software to stack the deck, and the plain stupid pricing, it's clear that nvidia-bots and people with a financial interest have been spreading pro-Nvidia, anti-Intel, anti-AMD shill propaganda on the net for quite some time.

        The first-hand reality is different, at least on Linux, and depends on what your computer needs to do.

        I am excited and pleasantly surprised about Intel taking a bite out of Nvidia & AMD and providing some really great open source drivers.

        Novida is always Susvida.
        After AMD thought it was a good idea to sell EOL products as brand new, they have been dead to me. This is by far the scummiest shit that has happened in *recent* history. This alone, even ignoring all of the other garbage AMD has been doing, sent them to the bottom of the scum totem pole.


        • piotrj3
          Senior Member
          • Mar 2019
          • 840

          #34
          Originally posted by sophisticles View Post
          By now it's clear that the Arc family of cards is great for certain specific use cases.

          If you want gaming, buy NVIDIA.

          If you want compute, buy NVIDIA.

          If you are working with video, buy NVIDIA.

          If you are on a tight budget, want respectable gaming, compute and video editing capabilities, buy Intel.

          If you are an even tighter budget and are content with decent gaming, compute and video editing capabilities, buy an AMD APU.

          And if you are nuts, buy an AMD video card.

          It's that simple.
          Intel APUs are actually better again. The 245K's iGPU is better than AMD's current desktop APU options. In laptops it is an open question, but we will see.


          • pong
            Senior Member
            • Oct 2022
            • 316

            #35
            Strengths:
            Windows: firmware flash/update capability, plus a GUI utility and an SDK/API (IGCL) to control display settings, GPU fans / clocks / power / LEDs, and to monitor the GPU.

            Weaknesses:
            Linux: all of the above are missing (except reading power use, a temperature, and maybe fan speed?)
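            To be fair, the monitoring that does work on Linux goes through the kernel's generic hwmon sysfs nodes. A minimal C++17 sketch; node names like temp1_input (and the i915/xe power/energy files) are driver-dependent assumptions, not a stable API:

            Code:
            #include <filesystem>
            #include <fstream>
            #include <iostream>
            #include <string>

            // Walk every hwmon device and print its name plus the first
            // temperature sensor, if one exists. GPU fan speed, where
            // exposed, would appear as fan1_input in the same directory.
            int main() {
                namespace fs = std::filesystem;
                for (const auto& dir : fs::directory_iterator("/sys/class/hwmon")) {
                    std::string name;
                    std::ifstream name_file(dir.path() / "name");
                    std::getline(name_file, name);

                    long millidegrees = 0;
                    std::ifstream temp_file(dir.path() / "temp1_input");
                    if (temp_file >> millidegrees)
                        std::cout << name << ": " << millidegrees / 1000.0 << " C\n";
                    else
                        std::cout << name << ": no temp1_input\n";
                }
                return 0;
            }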

            The compute / IPEX stack doesn't allow allocating and using a single RAM block / array larger than 4 GB on 8/12/16 GB GPUs.
            See intel/compute-runtime#627 ("Intel compute runtime doesn't allow allocating a buffer bigger than 4 GB"), e.g. when you allocate an array bigger than 4 GB in intel-extension-for-pytorch on an A7...
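            For anyone who wants to see that ceiling on their own card, it surfaces as CL_DEVICE_MAX_MEM_ALLOC_SIZE. A minimal sketch using only the standard OpenCL C API (it naively takes the first platform/GPU it finds; pick explicitly on multi-GPU systems):

            Code:
            // Build with: g++ max_alloc.cpp -lOpenCL
            #define CL_TARGET_OPENCL_VERSION 300
            #include <CL/cl.h>
            #include <cstdio>

            int main() {
                cl_platform_id platform;
                cl_device_id device;
                if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS ||
                    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr) != CL_SUCCESS) {
                    std::fprintf(stderr, "no OpenCL GPU found\n");
                    return 1;
                }

                cl_ulong max_alloc = 0, total = 0;
                clGetDeviceInfo(device, CL_DEVICE_MAX_MEM_ALLOC_SIZE, sizeof max_alloc, &max_alloc, nullptr);
                clGetDeviceInfo(device, CL_DEVICE_GLOBAL_MEM_SIZE, sizeof total, &total, nullptr);

                // On stacks hit by the issue above this reports roughly
                // 4096 MiB even when the card carries 16 GB of VRAM.
                std::printf("max single allocation: %llu MiB of %llu MiB total\n",
                            (unsigned long long)(max_alloc >> 20),
                            (unsigned long long)(total >> 20));
                return 0;
            }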



            • Daktyl198
              Senior Member
              • Jul 2013
              • 1582

              #36
              Originally posted by avis View Post

              In terms of performance per dollar, yeah, AMD is a tad better but only in raster:

              The next-generation Intel Arc graphics cards are here! The B580, powered by the Battlemage architecture, is priced at a highly competitive $250. Testing in our review confirms that Intel's new card outperforms both NVIDIA's GeForce RTX 4060 and AMD's RX 7600, and it now supports frame generation as well.


              In games with real RT, AMD is nowhere close to NVIDIA. Intel's Battlemage is.

              And then NVIDIA has superior DLSS on its side which AMD only promises for the upcoming RDNA 4.0.
              Raster is still what 99% of gamers use. Most don't use ray tracing unless it's turned on by default in a game, which it often isn't. The only game that requires it is the new Indiana Jones game. As for NVidia and Intel, only their latest GPU generations have anything resembling decent real-time ray tracing performance. On prior generations it was more of a toy, which is why much of NVidia's new software that uses those special cores doesn't run on anything older than 30-series cards. AMD is supposedly upping their ray tracing performance next gen, but we'll see.

              As for DLSS/FSR/XeSS, they all suck. I've used DLSS and FSR and both noticeably degraded image quality. They were invented mainly to make GPUs look more powerful in benchmarks than they really are. It's fake performance, and it's the biggest fucking con in the tech world in a long time. I hate that they've been so normalized despite destroying image quality, and now input latency as well.


              • Daktyl198
                Senior Member
                • Jul 2013
                • 1582

                #37
                Originally posted by the-burrito-triangle View Post
                I feel like a broken record at this point, but these benchmarks show exactly what I see when profiling my own Intel GPUs on Linux: OpenGL greatly outperforms Vulkan (I also have multiple cases where this is true for AMD's RDNA 3). The Mesa Vulkan driver used for Intel GPUs is disappointing, while the OpenGL driver is in surprisingly good shape (so good, in fact, that it outperforms Intel's Win11 driver). I keep reading about people saying Vulkan is "superior" and "performs better", but the benchmarks speak for themselves.

                Vulkan draws more power and produces lower FPS. This is a sad reality for every benchmark/game/wrapper combo I've thrown at an Intel Xe GPU. How the hell is OpenGL outperforming it? Decades of driver optimizations and vendor extensions? Or did Intel half-ass the hardware and only support the bare minimum spec to claim "compliance"? I honestly question whether the Mesa Vulkan driver is still learning to walk, or whether Intel's HW is the real problem...
                The Intel Xe driver utilizes Gallium for OpenGL support, which benefits from a decade of shared work and examples in AMD's drivers and Intel's prior GPU drivers that also use Gallium. Intel's Vulkan driver, by contrast, is basically brand new and vendor-specific.

                As for Vulkan being "superior" and "performing better": yes and no. Vulkan allows far more throughput when it comes to draw calls sent to the GPU, as well as far more control for the engine developer over exactly what is being done on the GPU versus OpenGL, so they can cut out unnecessary steps. But if a developer/engine doesn't take advantage of those features, there will be no performance uplift even if the drivers are great. id Software's Doom is a great example of utilizing Vulkan to its fullest potential. Most other AAA game engines just retrofit their existing rendering model onto Vulkan, and so it performs the same as, if not worse than, their DirectX/OpenGL renderers.
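                To make the draw-call point concrete, here is a minimal, hypothetical sketch of the Vulkan recording model: state is bound once, each draw is a cheap command append, and the whole buffer can be replayed every frame instead of re-issuing validated driver calls as in legacy OpenGL. All handles are assumed to come from the engine's setup code, and the render-pass begin/end that draws must sit inside is omitted:

                Code:
                #include <vulkan/vulkan.h>

                // Record one explicit draw per mesh into a reusable command buffer.
                void record_draws(VkCommandBuffer cmd, VkPipeline pipeline,
                                  VkBuffer vertex_buffer, uint32_t mesh_count,
                                  uint32_t verts_per_mesh)
                {
                    VkCommandBufferBeginInfo begin{};
                    begin.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO;
                    vkBeginCommandBuffer(cmd, &begin);

                    vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);
                    VkDeviceSize offset = 0;
                    vkCmdBindVertexBuffers(cmd, 0, 1, &vertex_buffer, &offset);

                    // No per-call driver validation here; these are thin appends.
                    for (uint32_t i = 0; i < mesh_count; ++i)
                        vkCmdDraw(cmd, verts_per_mesh, 1, i * verts_per_mesh, 0);

                    vkEndCommandBuffer(cmd);  // replay with vkQueueSubmit each frame
                }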

                One way for Vulkan to reliably improve performance would be for a game engine to take advantage of the multi-GPU support Vulkan provides. It lets you partition work as you see fit among multiple GPUs in the system (e.g. integrated and dedicated, or two dedicated cards) to get real speedups without the drawbacks of SLI/CrossFire, which tried to shoehorn multi-GPU in outside of a render cycle designed for a single GPU. But no game engine utilizes this, as it's basically double the work for the developer: you have to write code paths for one GPU, two GPUs, and so on.
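                For the curious, the starting point is simply that Vulkan exposes each GPU as its own VkPhysicalDevice; an engine doing explicit multi-GPU would then create a separate VkDevice and queues per entry and split the frame between them. A minimal enumeration sketch (assumes the Vulkan headers and loader are installed; the actual work-splitting, which is the hard part, is omitted):

                Code:
                #include <vulkan/vulkan.h>
                #include <cstdio>
                #include <vector>

                int main() {
                    VkApplicationInfo app{};
                    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
                    app.apiVersion = VK_API_VERSION_1_0;

                    VkInstanceCreateInfo info{};
                    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
                    info.pApplicationInfo = &app;

                    VkInstance instance;
                    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS)
                        return 1;

                    uint32_t count = 0;
                    vkEnumeratePhysicalDevices(instance, &count, nullptr);
                    std::vector<VkPhysicalDevice> gpus(count);
                    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

                    // Each entry (e.g. an iGPU and a dGPU) can get its own VkDevice.
                    for (VkPhysicalDevice gpu : gpus) {
                        VkPhysicalDeviceProperties props;
                        vkGetPhysicalDeviceProperties(gpu, &props);
                        std::printf("%s\n", props.deviceName);
                    }

                    vkDestroyInstance(instance, nullptr);
                    return 0;
                }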


                • Anderse4
                  Junior Member
                  • Feb 2024
                  • 1

                  #38
                  Originally posted by pong View Post
                  Strengths:
                  Windows: firmware flash/update capability, plus a GUI utility and an SDK/API (IGCL) to control display settings, GPU fans / clocks / power / LEDs, and to monitor the GPU.

                  Weaknesses:
                  Linux: all of the above are missing (except reading power use, a temperature, and maybe fan speed?)

                  The compute / IPEX stack doesn't allow allocating and using a single RAM block / array larger than 4 GB on 8/12/16 GB GPUs.
                  See intel/compute-runtime#627 ("Intel compute runtime doesn't allow allocating a buffer bigger than 4 GB"), e.g. when you allocate an array bigger than 4 GB in intel-extension-for-pytorch on an A7...

                  This has not been true for a long time. Also, IPEX is no longer required for PyTorch. I'm training a model on my A770 right now with a 14 GB allocation measured in xpu-smi. The deep learning stack is much better than it was a year ago.


                  • avis
                    Senior Member
                    • Dec 2022
                    • 2260

                    #39
                    Originally posted by Daktyl198 View Post

                    Raster is still what 99% of gamers use. Most don't use ray tracing unless it's turned on by default in a game, which it often isn't. The only game that requires it is the new Indiana Jones game. As for NVidia and Intel, only their latest GPU generations have anything resembling decent real-time ray tracing performance. On prior generations it was more of a toy, which is why much of NVidia's new software that uses those special cores doesn't run on anything older than 30-series cards. AMD is supposedly upping their ray tracing performance next gen, but we'll see.

                    People do not normally buy new GPUs to play old games.

                    And Indiana Jones and the Great Circle perfectly demonstrates that your "99% of gamers use" mantra is unfortunately on its last legs.

                    Wait, it's not just one game, here's a full list:
                    • Avatar: Frontiers of Pandora
                    • Indiana Jones and the Great Circle
                    • Metro Exodus Enhanced Edition
                    • Star Wars Outlaws
                    And now that the Sony PlayStation 5 Pro is out, we'll have more and more games that require RT support. But you may continue to buy subpar, outdated products for your friends stuck in the previous century.

                    Originally posted by Daktyl198 View Post
                    As for DLSS/FSR/XeSS, they literally all suck. I've used DLSS and FSR and both noticeably degraded image quality. They were only invented to pretend like GPUs are more powerful than they are for benchmarks. It's fake performance, and it's the biggest fucking con in the tech world in a long time. I hate that they've been so normalized despite destroying image quality and now input latency as well.
                    I'm 99.9% sure you've never played a single game with DLSS. I have. Lots of them. It works near perfectly in most of them. Free performance and no image quality loss.

                    Come to r/AMD and r/NVIDIA and check what people who use them actually have to say. r/AMD predominantly favours DLSS over FSR and says it's the biggest reason they chose NVIDIA.

                    Lastly, laymen may of course be wrong. Come watch what https://www.youtube.com/user/DigitalFoundry has to say about DLSS, the gold standard in image upscaling.

                    Your theoretical speculations are worthless, childish and fanboyish: "I remember many years ago people criticised DLSS 1.0 for its blurriness, yeah, I guess it's still blurry but I've not tried it personally. I will continue however to claim it's just bad." Try that with your mom, not me.


                    • pieman
                      Phoronix Member
                      • Mar 2020
                      • 111

                      #40
                      Originally posted by ElectricPrism View Post
                      If you value your time and play any games, AMD on Linux is just better.

                      Notice current Linux Gamer polling:



                      After decades of observing Nvidia on the Internet, including the lawsuits over fraud and false advertising, the under-the-table deals with benchmark software to stack the deck, and the plain stupid pricing, it's clear that nvidia-bots and people with a financial interest have been spreading pro-Nvidia, anti-Intel, anti-AMD shill propaganda on the net for quite some time.

                      The first-hand reality is different, at least on Linux, and depends on what your computer needs to do.

                      I am excited and pleasantly surprised about Intel taking a bite out of Nvidia & AMD and providing some really great open source drivers.

                      Novida is always Susvida.
                      If you use Linux, Nvidia shouldn't even be in your lexicon. And don't bother me with your edge-case "MUUUHHH CUUUUDDDDDAAA" nonsense. This is why I'm glad Nvidia keeps increasing prices. Nvidia users deserve to be milked to death for their low-VRAM cards.

