Intel Lands "Round Robin Strict" Driver Optimization For Helping Battlemage/Xe2

  • phoronix
    Administrator
    • Jan 2007
    • 67050

    Intel Lands "Round Robin Strict" Driver Optimization For Helping Battlemage/Xe2

    Phoronix: Intel Lands "Round Robin Strict" Driver Optimization For Helping Battlemage/Xe2

    Intel's open-source Linux graphics driver engineers are busy working to further refine the Xe2 graphics performance for Lunar Lake integrated graphics and the newly-launched Battlemage discrete graphics. Landing in Mesa 25.0-devel this Friday afternoon is a new "Round Robin Strict" optimization to benefit both their OpenGL and Vulkan drivers on Linux with Xe2 hardware...

  • QwertyChouskie
    Senior Member
    • Nov 2017
    • 635

    #2
    4% is actually pretty huge; hopefully this and other optimizations will help this hardware hit full performance on Linux. Currently the Linux performance seems to be quite a bit behind the Windows performance, but I expect that to change over the coming months.
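    The article doesn't spell out what "Round Robin Strict" actually changes, but in compiler back ends a round-robin allocation policy usually means cycling through the register pool instead of immediately reusing the register that was just freed, which avoids false write-after-write dependencies that stall wide GPUs. Here is a toy Python sketch of that general idea; the names are made up and this is not Mesa's actual allocator, just the concept:

    Code:
    # Toy sketch: "reuse the last freed register" vs. strict round-robin
    # allocation. Purely conceptual; not Mesa's Intel back end.

    def allocate_reuse_last(num_regs, values):
        """Always grab the most recently freed register, so short-lived
        temporaries keep piling onto the same physical register."""
        free = list(range(num_regs))
        assignment = {}
        for v in values:
            reg = free.pop()        # LIFO: take the last freed slot
            assignment[v] = reg
            free.append(reg)        # value is short-lived, frees right away
        return assignment

    def allocate_round_robin_strict(num_regs, values):
        """Always advance to the next register in the pool, even when an
        earlier one is already free, so consecutive writes never collide."""
        assignment = {}
        cursor = 0
        for v in values:
            assignment[v] = cursor % num_regs
            cursor += 1             # strict: never step back to reuse
        return assignment

    values = [f"tmp{i}" for i in range(6)]
    print(allocate_reuse_last(4, values))          # everything lands on one register
    print(allocate_round_robin_strict(4, values))  # values spread across the pool

    In the first case every temporary writes to the same register, which the hardware has to serialize; spreading the writes round-robin removes those artificial dependencies. Whether that is exactly what the Mesa change does for Xe2, the commit itself is the authoritative source.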


    • MillionToOne
      Phoronix Member
      • Aug 2024
      • 108

      #3
      How about help with Alchemist? Still runs like shit after 2 years under Linux.


      • Kjell
        Senior Member
        • Apr 2019
        • 604

        #4
        AMDGPU recently switched to FIFO scheduling, which sadly contributes to microstuttering.

        I've reported the bug here: https://bugzilla.kernel.org/show_bug.cgi?id=217158 and was advised to post it here too, so here it is.
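
        For anyone curious what the policy difference actually is: as I understand it, the DRM GPU scheduler can pick the next job either in first-in-first-out order across everything pending or round-robin over the submitting contexts, and AMDGPU moved to the FIFO default. A toy Python sketch of the two policies (made-up data structures, not the actual drm_sched code):

        Code:
        # Toy model of two "pick the next job" policies, similar in spirit to
        # the DRM GPU scheduler's round-robin vs. FIFO modes. Not kernel code.
        from collections import deque

        class Entity:
            """One submission queue, e.g. one process/context."""
            def __init__(self, name):
                self.name = name
                self.jobs = deque()     # entries are (submit_timestamp, job_name)

        def pick_fifo(entities):
            """Pick the globally oldest pending job, whichever entity owns it."""
            pending = [e for e in entities if e.jobs]
            if not pending:
                return None
            entity = min(pending, key=lambda e: e.jobs[0][0])
            return (entity.name,) + entity.jobs.popleft()

        def pick_round_robin(entities, state):
            """Visit entities in a fixed circular order, one job per turn."""
            for _ in range(len(entities)):
                entity = entities[state[0] % len(entities)]
                state[0] += 1
                if entity.jobs:
                    return (entity.name,) + entity.jobs.popleft()
            return None

        def drain(entities, pick, *args):
            """Run one policy until every queue is empty; return the order."""
            order = []
            while (choice := pick(entities, *args)) is not None:
                order.append(choice)
            return order

        def make_queues():
            game, background = Entity("game"), Entity("background")
            for t in range(3):
                game.jobs.append((t, f"frame-{t}"))
            background.jobs.append((0.5, "compute-burst"))
            return [game, background]

        print("FIFO :", drain(make_queues(), pick_fifo))
        print("RR   :", drain(make_queues(), pick_round_robin, [0]))

        The sketch only shows where the decision is made; whether FIFO or round-robin gives smoother frame pacing depends on what else is submitting work, which is presumably what the bug report is about.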


        • fafreeman
          Phoronix Member
          • Feb 2021
          • 109

          #5
          Originally posted by MillionToOne View Post
          How about help with Alchemist? Still runs like shit after 2 years under Linux.
          After watching this video from Gamers Nexus, it doesn't seem like there is much more they can do for Alchemist. Alchemist emulates a lot of things because it doesn't have native, direct hardware support for certain features, like "Compute dispatch" and "Draw", as covered in the video.
          Last edited by fafreeman; 13 December 2024, 08:28 PM.


          • the-burrito-triangle
            Phoronix Member
            • Jul 2024
            • 73

            #6
            This is more interesting for Lunar Lake than it is for Battlemage. Battlemage has been mostly a flop so far: little improvement for gaming over the previous generation and similarly high idle power draw. (The OpenCL performance looks great though, so not everything sucks.) My AMD RX 7600 draws between 2 and 5 watts at idle... In the same 36 watts that Battlemage burns at idle, the RX 7600 can play an older game at 4K (most of that power goes to the Vulkan wrapper; using OpenGL drops it to 16 watts)! The RX 7600 might not be fast, but it has pretty good performance per watt.

            I was hoping that Intel could make a card worth buying to replace my AMD card, but sadly I've decided to wait for UDNA to come out. No sense in buying RDNA 3.5 or 4.0 if AMD is going to drop them like turds in the toilet once UDNA arrives. I foresee a Vega-like situation for RDNA cards in the near future.

            All that said, I'd still be willing to buy an LNL laptop if I can get a good deal. But maybe waiting for Panther Lake and Xe3 is the better way to go. Xe2 just seems lackluster. Xe iGPUs were actually pretty amazing when released with Tiger Lake / Rocket Lake (the Xe dGPUs were, and remain, disappointing though).

            Sadly, Intel's Vulkan drivers still suck. Their OpenGL driver performance is literally 2x that of the Vulkan drivers. I have yet to see a situation in which Vulkan _actually_ works better (less GPU load / power draw) than OpenGL... (My delusions of grandeur WRT Vulkan have been shattered, repeatedly and without mercy, by reality.)


            • lyamc
              Senior Member
              • Jun 2020
              • 517

              #7
              Originally posted by the-burrito-triangle View Post
              Battlemage has been mostly a flop so far: little improvement for gaming over the previous generation and similar high idle power draw.
              The Linux performance is disappointing. In Windows this $250 card is performing at $400 price levels.


              • creative
                Senior Member
                • Mar 2017
                • 868

                #8
                Let's hope Intel keeps working hard and continuing its GPU trajectory. I feel pretty battered after paying what I have for an RTX 4070 Ti Super, just to get an OK NVIDIA GPU with 16GB of VRAM. Sure, I could have gone with the 7900XT, but as far as upscalers go, DLSS seems to be noticeably better. I'd rather not use upscalers, but that is where games have gone in terms of generating high-FPS experiences.


                • the-burrito-triangle
                  Phoronix Member
                  • Jul 2024
                  • 73

                  #9
                  Originally posted by lyamc View Post

                  The Linux performance is disappointing. In Windows this $250 card is performing at $400 price levels.
                  Fair enough. That comment was written before I read the Gamers Nexus review of the B580. There are still some weird performance issues on Windows, and it doesn't always match the perf of a $400 card, but it is a good option for the price. And it's even more impressive for non-gaming tasks. Mostly, my disappointment is with it on Linux, as I've abandoned Windows. For those who can deal with MS's nonsense it's a good card with working drivers out of the gate (unlike the previous generation's bumpy start).

                  And I agree with creative: I _want_ Intel to succeed in the GPU market. I'd like to have a third choice to help keep prices sane and an alternative Linux-native option in case AMD makes a flop of UDNA. Intel failing does no one any favors. And I am willing to actually buy a GPU from them; I just wish they could get their idle power draw figured out. How can they make great mobile iGPUs and then pop out these 40W idling turds? Does none of that iGPU power-efficiency R&D translate over to their dGPUs? It just seems strange to me.


                  • AdrianBc
                    Senior Member
                    • Nov 2015
                    • 292

                    #10

                    Originally posted by the-burrito-triangle View Post
                    This is more interesting for Lunar Lake than it is for Battlemage. Battlemage has been mostly a flop so far: little improvement for gaming over the previous generation and similar high idle power draw. (The OpenCL performance looks great though, so not everything sucks.) My AMD RX 7600 draws between 2 to 5 watts at idle... Using the same 36 watts as Battlemage's idle power draw, the RX 7600 can play an older game at 4k resolution

                    The high idle power consumption with default settings appears to be caused by the B580 keeping the PCIe interface running with all lanes active at maximum speed all the time, unlike the NVIDIA and AMD GPUs, which, whenever possible, turn off some of the PCIe lanes and reduce the clock frequency of the others.

                    The published reviews have shown that enabling PCIe Active State Power Management (ASPM), both in the BIOS settings and in the operating system, reduces the idle power consumption of the B580 from 36 W to 7 W, which is very close to the AMD and NVIDIA GPUs. Most good desktop monitors consume around 50 W or even more, so when the display is active a difference of a couple of watts in idle power consumption becomes completely irrelevant.
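
                    If you want to verify that ASPM is actually in effect before trusting the power numbers, the kernel exposes both the global policy and per-device link states in sysfs. A small Python sketch; the address 0000:03:00.0 is just a placeholder for wherever the B580 sits, and the per-device "link" attributes need a reasonably recent kernel:

                    Code:
                    # Quick ASPM check via sysfs. Adjust the PCI address to your card.
                    from pathlib import Path

                    GPU = Path("/sys/bus/pci/devices/0000:03:00.0")

                    # Global ASPM policy; the active choice is shown in [brackets].
                    policy = Path("/sys/module/pcie_aspm/parameters/policy")
                    print("policy:", policy.read_text().strip())

                    # Per-device ASPM states (1 = enabled), if the kernel exposes them.
                    link_dir = GPU / "link"
                    if link_dir.is_dir():
                        for attr in sorted(link_dir.iterdir()):
                            print(attr.name, "=", attr.read_text().strip())
                    else:
                        print("no per-device ASPM attributes on this kernel")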

                    Unfortunately, this does not completely solve the problem: with 2 or more monitors connected, the idle power consumption goes back to 36 W, because in this configuration the Intel driver apparently forces the PCIe interface to maximum speed.

                    This looks like a firmware/software problem, not a hardware problem. Unlike the NVIDIA and AMD drivers or firmware, the Intel driver or firmware is apparently not clever enough to determine when the speed of the PCIe interface can be reduced without affecting performance. My NVIDIA GPU (on Linux), unlike the B580, keeps the PCIe interface at v1.0 speed most of the time, raising it to v2.0, v3.0 or v4.0 only when necessary.
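
                    The negotiated link speed is also visible in sysfs, so it is easy to watch whether a card really drops to a lower PCIe generation at idle. Another small sketch with the same placeholder device address:

                    Code:
                    # Poll the GPU's negotiated PCIe link speed/width; a card that
                    # downclocks its link at idle should report e.g. "2.5 GT/s PCIe".
                    import time
                    from pathlib import Path

                    GPU = Path("/sys/bus/pci/devices/0000:03:00.0")   # placeholder

                    def link_state():
                        speed = (GPU / "current_link_speed").read_text().strip()
                        width = (GPU / "current_link_width").read_text().strip()
                        return f"{speed} x{width}"

                    for _ in range(5):
                        print(link_state())
                        time.sleep(2)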

                    Except for the high idle power consumption at default settings, the energy efficiency of the B580 appears to be good and competitive with AMD. Under load, the AMD GPUs lose the advantage they have at idle, because when there is work to do they must also raise the PCIe speed and the GPU and memory clock frequencies to their maximum values, at which point their power consumption advantage vanishes.
                    Last edited by AdrianBc; 14 December 2024, 09:10 AM.

