Intel's Mitigation For CVE-2019-14615 Graphics Vulnerability Obliterates Gen7 iGPU Performance


  • Ladis
    replied
    Originally posted by caligula View Post

    They're doing something wrong. Seriously, AVX-512 machines can do 16 x 32-bit ALU operations per cycle. The machines run at 4-5 GHz. That's like 64-80 billion 32-bit operations per second per core. E.g. the HD 4000 has 128 ALUs that run at around 1.1 GHz max. Modern memory bandwidth is also much higher, and CPUs have better branch prediction and caches.
    GPUs have some often-used operations hardwired, e.g. the ROP (Raster Output Processor - taking the float values of the color channels, clamping them to 8-bit integers and packing them into one 32-bit integer - an RGBA Uint32 pixel). That only increases the lag before you see the pixel (a longer pipeline); it doesn't affect the throughput (framerate). A CPU has to do such things via a series of instructions.
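    To make that concrete, here's a minimal sketch (my own hypothetical code, not anything from an actual driver) of the per-pixel work a CPU has to spell out where the GPU's ROP does it in fixed-function hardware:

    #include <stdint.h>

    /* Clamp a float color channel to [0, 1] and convert it to 8 bits. */
    static uint8_t channel_to_u8(float c)
    {
        if (c < 0.0f) c = 0.0f;
        if (c > 1.0f) c = 1.0f;
        return (uint8_t)(c * 255.0f + 0.5f);
    }

    /* Pack float RGBA into one 32-bit pixel (the channel order here is
     * arbitrary; the real order depends on the framebuffer format). */
    static uint32_t pack_rgba(float r, float g, float b, float a)
    {
        return (uint32_t)channel_to_u8(r)
             | (uint32_t)channel_to_u8(g) << 8
             | (uint32_t)channel_to_u8(b) << 16
             | (uint32_t)channel_to_u8(a) << 24;
    }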

    Also, don't forget that you can't take the performance of CPU SIMD instructions as a simple multiple of what a scalar instruction does. For GPU workloads (usually a pixel shader), SIMD (Single Instruction Multiple Data) often means you don't use many of the "data slots": e.g. you run an instruction over 4 color channels - RGBA - and the rest of the 16 lanes of an AVX-512 register go unused. It also takes additional instructions to rearrange data between SIMD operations (reshaping the output of the previous instruction(s) into the input layout for the next ones) - GPUs have hardwired "register swizzling" for that. GPUs have many small independent pipelines (or groups of pipelines), each with a small number of ALUs, so there's no wasted capacity like in CPU SIMD instructions working over a large set of lanes you may be unable to fill (e.g. when the next instruction depends on the output of the previous one - common in pixel shaders - so you can't group them into one SIMD instruction).
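    A rough illustration of both points (again my own sketch, using 128-bit SSE intrinsics for brevity) - the mostly idle lanes and the extra shuffle a CPU needs where a GPU swizzles in hardware:

    #include <immintrin.h>

    /* One RGBA pixel fills only 4 float lanes; a 512-bit AVX-512 register
     * has 16 lanes, so 12 of them sit idle unless four pixels can be
     * batched together. */
    static __m128 swizzle_rgba_to_bgra(__m128 rgba)
    {
        /* Reordering channels costs an explicit shuffle instruction on the
         * CPU; a GPU's hardwired register swizzling does this for free. */
        return _mm_shuffle_ps(rgba, rgba, _MM_SHUFFLE(3, 0, 1, 2));
    }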

    PS: My brother has just dodged a bullet - he was thinking about buying a MacBook Pro with Intel Iris Pro 5200 (Haswell, now with performance ~40% down). (In my country, MacBooks are expensive compared to salaries, so it's common to buy an older model used.)



  • caligula
    replied
    Originally posted by HEX0 View Post
    And software rendering, even with 80 cores, is worse than the Ivy Bridge GPU
    https://www.phoronix.com/scan.php?pa...swr-xeon&num=2
    They're doing something wrong. Seriously, AVX-512 machines can do 16 x 32-bit ALU operations per cycle. The machines run at 4-5 GHz. That's like 64-80 billion 32-bit operations per second per core. E.g. the HD 4000 has 128 ALUs that run at around 1.1 GHz max. Modern memory bandwidth is also much higher, and CPUs have better branch prediction and caches.
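    For reference, the back-of-the-envelope math and a minimal sketch (my own, hypothetical) of the kind of instruction it assumes:

    #include <immintrin.h>

    /* One AVX-512 instruction processes 16 packed 32-bit lanes at once, so
     * 16 lanes x 4-5 GHz is roughly 64-80 billion 32-bit operations per
     * second per core per execution port (ignoring the downclocking that
     * sustained AVX-512 use typically triggers). */
    static __m512i add_16x32(__m512i a, __m512i b)
    {
        return _mm512_add_epi32(a, b);
    }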



  • polarathene
    replied
    Originally posted by Michael View Post

    I ran some tests on Whiskeylake yesterday and found no change in power draw on battery.
    Oh, that's good to know. I guess the increase that user experienced was caused by something else, unless other Gen9 models could somehow be affected differently.

    Just to confirm, you did test the power draw at idle, right? Not just at load via PTS?

    Originally posted by boxie View Post

    is it possible to get some laptop benchmarks including power usage before/after mitigation too?
    See above; according to Michael there's no difference?
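    For what it's worth, a minimal sketch of checking the idle figure (assuming a laptop whose battery exposes its discharge rate as power_now in microwatts under sysfs - some models only expose current_now/voltage_now, so the path and units are an assumption here):

    #include <stdio.h>

    int main(void)
    {
        /* BAT0 is a guess; adjust to match the machine. */
        FILE *f = fopen("/sys/class/power_supply/BAT0/power_now", "r");
        long microwatts;

        if (!f || fscanf(f, "%ld", &microwatts) != 1) {
            perror("power_now");
            return 1;
        }
        fclose(f);

        printf("battery draw: %.1f W\n", microwatts / 1e6);
        return 0;
    }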
    Last edited by polarathene; 01-17-2020, 11:11 AM.



  • DanL
    replied
    Originally posted by Hibbelharry View Post
    If we're loosing more than 100 percent of performance... If we're loosing half the performance...
    If you're loosing performance, you should tighten it.



  • misGnomer
    replied
    Phoronix got quoted: https://hexus.net/tech/news/graphics...x-performance/

    Being fast to report Linux news seems to have led to the impression that MS-Windows isn't affected. I wonder how many people will be in for a surprise.

    Looking at the sheer number of Intel CPUs with Gen7 graphics, and guessing how many systems were shipped, are still in use, and possibly aren't even being updated any longer, this could mean trouble.

    https://en.wikipedia.org/wiki/List_o...ocessing_units



  • HEX0
    replied
    Originally posted by birdie View Post

    Ages-old games which can barely be called "3D"? Yeah, right, exactly what I said earlier - these are the kind of games that can be run entirely on the CPU.
    Regardless, any more performance degradation to already weak Gen7 Intel iGPUs would make certain barely playable games completely unplayable.
    And software rendering, even with 80 cores, is worse than the Ivy Bridge GPU
    https://www.phoronix.com/scan.php?pa...swr-xeon&num=2



  • slacka
    replied
    Originally posted by birdie View Post
    Ages-old games which can barely be called "3D"?
    Sims 4 targeted the Xbox One / PS4 console generation. I've played it. Sure, it's cartoonish, but it's genuinely high-quality 3D. There's no way this could run on a CPU at acceptable fps. It's a testament to Wine and the Intel Linux drivers that it runs so well for him.



  • Michael
    replied
    Originally posted by polarathene View Post

    Your article points out that Gen9+ doesn't see much regression, but in the earlier article raising awareness, comments from users pointed out an uptick in wattage, which is a pretty big deal for battery life on laptops.

    Any chance of publishing an article that investigates that?
    I ran some tests on Whiskeylake yesterday and found no change in power draw on battery.



  • wosgien
    replied
    Originally posted by caligula View Post
    What the heck, there are gamers who run Steam on machines with Ivy Bridge / Haswell iGPUs?
    Yes indeed.
    Many indie games work fine (even great) on the HD 4600 (The Witness, for example).

    Even commercial games (Team Fortress 2, Dragonball Budokai Fighters, Rocksmith, Dirt 3, Batman games) work perfectly on the HD 4600 (not in Full HD, but HD is okay).

    Moreover, the HD 4600 lets you keep a sufficient framerate while preserving some battery life (I can play Dragon Ball for more than 2 hours, and my battery is 5 years old). I can even enable VSync since my laptop allows a 40 Hz refresh rate.

    You need dual-channel memory and possibly dual-rank (dual-rank is definitely an improvement on the HD 620, but I can't tell for the HD 4600 since I don't have single-rank memory).



  • birdie
    replied
    Originally posted by HEX0 View Post

    Wow, such arrogance
    I can even play Sims 4 on Ivy Bridge GPU
    Ages-old games which can barely be called "3D"? Yeah, right, exactly what I said earlier - these are the kind of games that can be run entirely on the CPU.

