Intel Lands "Round Robin Strict" Driver Optimization For Helping Battlemage/Xe2

  • pong
    replied
    Originally posted by AdrianBc View Post

    The published reviews have shown that enabling the PCIe Active State Power Management (ASPM), both in the BIOS settings and in the operating system, reduces the idle power consumption for B580 from 36 W to 7 W, which is very close to the AMD and NVIDIA GPUs.
    The "solution" isn't complete though until everyone can use it.
    There are MANY motherboards / "BIOS" versions where one is given ABSOLUTELY NO CONTROL over ASPM-related settings for the dGPU (or anything else), and the "mandatory defaults" apparently just keep everything running at high power all the time.

    This is Intel; they design / make a large fraction of the CPUs, motherboard chipsets, and system peripherals, and thus have a lot of control, directly or indirectly, over BIOS- and driver-level support for PCs in general.

    ARC DG2 GPUs came out in 2022 with approximately these same problems, and IIRC it was around the start of 2024 before they started paying any attention to high idle power use for ARC. The workaround advice wrt ASPM settings was unusable for a lot of people who had no such options, or who could not use them without causing crashes / instability because something else did not work well with ASPM modes enabled.

    So given Intel's vast competence in PC architecture / drivers / BIOS / OSs, we'd kind of expect them to "get the job done" and pervasively change whatever is needed in the BIOS / OS / hardware design / firmware / drivers et al. to make ASPM "just work" for modern PCs, so it can be used effectively / reliably for their products and all other peripherals.

    Launching generation #2 of dGPUs in 2024 with the same old problems unsolved since 2022 (multi-monitor power use, higher-refresh-rate power use, and "needs ASPM manually set in the BIOS even though that is not so ubiquitously possible") is disappointing, especially when AMD and NV have better power management in practice.

  • MillionToOne
    replied
    Originally posted by fafreeman View Post
    After watching this video from Gamers Nexus:

    It doesn't seem like there is much more they can do for Alchemist. Alchemist emulates a lot of stuff because it doesn't have native, direct hardware for certain features, like "Compute dispatch" and "Draw", if you watch the video.
    And I'm not talking about Windows.

  • chuckula
    replied
    I just picked mine up at Microcenter today!

  • AndyChow
    replied
    Test the power usage. Round robin costs more electricity, IME.

  • creative
    replied
    cutterjohn Well I hope they keep it up, both Intel and AMD. Releasing 8GB 60-series cards and 12GB 70-series cards is undeniable hubris on the part of NVIDIA. Don't get me wrong, they make good GPUs, but price gouging the customer and saying f-you with lower memory on card tiers that will clearly need the extra VRAM is very anti-consumer; I think they do it because they know they can. Huang has forgotten the gamers that helped make his business; it's sort of like when Metallica told their fans "We don't need you." I'm honestly kind of pissed about all this, because I've obviously been on the receiving end.

    NVIDIA has really kind of become a bully to gamers.
    Last edited by creative; 14 December 2024, 09:09 PM.

  • cutterjohn
    replied
    Originally posted by creative View Post
    Let's hope Intel keeps working hard and continuing its GPU trajectory. I feel pretty battered after paying what I did for an RTX 4070 Ti Super, just to get an OK NVIDIA GPU with 16GB of VRAM. Sure, I could have gone with the 7900 XT, but as far as upscalers go DLSS seems to be noticeably better. I'd rather not use upscalers, but that is where games have gone in terms of generating high-fps experiences.
    XeSS on Intel is pretty good IME...

    40W?! That's a massive regression from my A770, which idles around 8W on average, and I've seen it go as low as 4W IIRC... still not as good as NVIDIA or AMD, but livable...

    I'm OK w/ what will likely be the B7XX gaming perf; I just hope that compute has improved more. But the B5XX I have no interest in...

    [EDIT]

    Ah, now that you mention ASPM, I BELIEVE that Intel DID tell people to enable it for the Alchemist series!

    [/EDIT]

    [EDIT2]

    my 8W avg IS with 2 monitors attached! 1x1440p (landscape) and 1x1080p (portrait)...

    [/EDIT2]
    Last edited by cutterjohn; 14 December 2024, 09:30 AM.

  • AdrianBc
    replied

    Originally posted by the-burrito-triangle View Post
    This is more interesting for Lunar Lake than it is for Battlemage. Battlemage has been mostly a flop so far: little improvement for gaming over the previous generation and similar high idle power draw. (The OpenCL performance looks great though, so not everything sucks.) My AMD RX 7600 draws between 2 and 5 watts at idle... Using the same 36 watts as Battlemage's idle power draw, the RX 7600 can play an older game at 4k resolution

    The high idle power consumption with default settings appears to be caused by the fact that the B580 keeps the PCIe interface running with all lanes active at maximum speed all the time, unlike the NVIDIA and AMD GPUs, which, whenever possible, turn off some of the PCIe lanes and reduce the clock frequency of the others.

    The published reviews have shown that enabling PCIe Active State Power Management (ASPM), both in the BIOS settings and in the operating system, reduces the idle power consumption of the B580 from 36 W to 7 W, which is very close to the AMD and NVIDIA GPUs. Most good desktop monitors consume around 50 W or even more, so when the display is active a difference of a couple of watts in idle power consumption becomes completely irrelevant.
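
    For the operating-system side of that on Linux, the kernel exposes the global ASPM policy through sysfs. Here is a minimal sketch of reading and switching it, assuming a kernel built with PCIe ASPM support; the sysfs path and the usual policy names are standard, but whether a more aggressive policy is actually honoured depends on the BIOS:

        from pathlib import Path

        # Global ASPM policy file; present when the kernel has PCIe ASPM support.
        POLICY = Path("/sys/module/pcie_aspm/parameters/policy")

        def current_aspm_policy() -> str:
            # The file lists all policies with the active one in brackets,
            # e.g. "default performance [powersave] powersupersave".
            text = POLICY.read_text().strip()
            return text.split("[", 1)[1].split("]", 1)[0] if "[" in text else text

        def set_aspm_policy(policy: str = "powersave") -> None:
            # Requires root; typical names are default, performance, powersave
            # and powersupersave (availability varies by kernel).
            POLICY.write_text(policy)

        if __name__ == "__main__":
            print("active ASPM policy:", current_aspm_policy())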

    Unfortunately, this does not completely solve the problem: with 2 or more monitors connected, the idle power consumption goes back to 36 W, apparently because in this configuration the Intel driver forces the PCIe interface to maximum speed.

    This looks like a firmware/software problem, not a hardware problem. Apparently the Intel driver or firmware is not clever enough to determine when the speed of the PCIe interface can be reduced without affecting performance, unlike the NVIDIA and AMD drivers or firmware. Unlike the B580, my NVIDIA GPU (in Linux) keeps the PCIe interface at v1.0 speed most of the time, raising it to v2.0, v3.0 or v4.0 speeds only when necessary.
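
    That link-speed behaviour can be watched directly through sysfs on Linux. A small sketch, where 0000:01:00.0 is only a placeholder address for whichever PCIe slot the GPU actually sits in (check lspci for the real one):

        from pathlib import Path

        # Placeholder BDF for the GPU; substitute the address reported by lspci.
        GPU = Path("/sys/bus/pci/devices/0000:01:00.0")

        def link_state(dev: Path) -> dict[str, str]:
            # current_link_speed drops (e.g. to "2.5 GT/s PCIe") when the driver
            # lets the link idle, and climbs back toward max_link_speed under load.
            def read(name: str) -> str:
                return (dev / name).read_text().strip()
            return {
                "current_speed": read("current_link_speed"),
                "max_speed": read("max_link_speed"),
                "current_width": read("current_link_width"),
                "max_width": read("max_link_width"),
            }

        if __name__ == "__main__":
            print(link_state(GPU))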

    Except for the high idle power consumption at default settings, the energy efficiency of the B580 appears to be good, competitive with AMD. At high load, the AMD GPUs lose the advantage they have at low load, because when there is work to do they must also raise the PCIe speed and the clock frequencies of the GPU and of its memory interface to their maximum values, so their power-consumption advantage vanishes.
    Last edited by AdrianBc; 14 December 2024, 09:10 AM.

  • the-burrito-triangle
    replied
    Originally posted by lyamc View Post

    The Linux performance is disappointing. In Windows this $250 card is performing at $400 price levels.
    Fair enough. That comment was written before I read the Gamers Nexus review of the B580. There are still some weird performance issues on Windows, and it doesn't always match the perf of a $400 card, but it is a good option for the price. And it's even more impressive for non-gaming tasks. Mostly, my disappointment is with it on Linux, as I've abandoned Windows. For those who can deal with MS's nonsense, it's a good card with working drivers out of the gate (unlike the previous generation's bumpy start).

    And I agree with creative: I _want_ Intel to succeed in the GPU market. I'd like to have a third choice to help keep prices sane, and an alternative Linux-native option in case AMD makes a flop of UDNA. Intel failing does no one any favors. And I am willing to actually buy a GPU from them; I just wish they could get their idle power draw figured out. How can they make great mobile iGPUs and then pop out these 40W-idling turds? Does none of that iGPU power-efficiency R&D translate over to their dGPUs? Just seems strange to me.

  • creative
    replied
    Let's hope Intel keeps working hard and continuing its GPU trajectory. I feel pretty battered after paying what I did for an RTX 4070 Ti Super, just to get an OK NVIDIA GPU with 16GB of VRAM. Sure, I could have gone with the 7900 XT, but as far as upscalers go DLSS seems to be noticeably better. I'd rather not use upscalers, but that is where games have gone in terms of generating high-fps experiences.

  • lyamc
    replied
    Originally posted by the-burrito-triangle View Post
    Battlemage has been mostly a flop so far: little improvement for gaming over the previous generation and similar high idle power draw.
    The Linux performance is disappointing. In Windows this $250 card is performing at $400 price levels.
