
Intel's Mitigation For CVE-2019-14615 Graphics Vulnerability Obliterates Gen7 iGPU Performance


  • Paradigm Shifter
    replied
    I've just updated to a freshly pushed kernel (presumably implementing the mitigations for this vulnerability) and am now seeing constant GUI soft-locks on an i7-8559U (Iris Plus 655, IIRC), with corresponding complaints in dmesg.

    Hopefully another tweak will solve this (I can live with some slowdown; I can't live with regular soft-locks), but so far I'm far from happy.


  • edenist
    replied
    Originally posted by HEX0
    I really hope that whatever makes it into the mainline kernel has a kernel parameter to disable the mitigation.
    Since my desktop died I've been using a laptop with an i5-3317U, and it can play classic games and even some light modern games.

    I can still run The Sims 4 (via Wine) on medium/high, no AA, at around 30 FPS, as well as Mount & Blade: Warband.
    As well as lots of classic games (Diablo, Blade of Darkness, Spore, Mafia, Operation Flashpoint, etc.).

    This patch in its current form would render my laptop useless, and I can't upgrade for a few more years.
    I'd also like to throw my hat into the ring of Intel-graphics laptop gamers. I've got an i5-3337U, and it runs many games to my satisfaction. I regularly play Neverwinter Nights EE without issue at native resolution (1600x900), as well as many older games under Wine such as Nexus: The Jupiter Incident. Is it clear whether this affects only OpenGL performance? What about things that use VA-API, such as video decode?

    Regardless, I'm still incredibly happy with my laptop from 2013, and I'm going to be incredibly frustrated if this forces me to fork out money for a new one through no fault of my own. How the hell can we hold Intel liable for the continuing cascade of stuff-ups that are costing many people $$$?


  • polarathene
    replied
    Originally posted by wizard69
    I just see the trend toward difficult-to-replace batteries as a huge inconvenience for the consumer that just drives bad behavior.
    That's becoming less of an issue these days with power banks and USB-C-powered devices, albeit you'd then be lugging one around, reducing the convenience of a laptop. The same can be done for laptop models powered by DC barrel (coaxial) plugs, but it's a bit more of a hassle (if going through USB-C/USB-PD to the barrel plug the laptop needs, that is).

    Originally posted by birdie
    God, so much drama/fuss about nothing. Broadwell/Skylake U(HD) iGPUs are barely affected.
    Not entirely true: even where performance appears barely regressed, the power draw is reported to be much higher, which means a notable loss of battery life for laptop users.

    Originally posted by birdie
    If you're into light gaming you must have at the very least a Ryzen 3000/4000-based laptop (the 4000 series is yet to be released) or something running a GeForce MX250 as a bare minimum.
    The Ryzen 3200U iGPU is only about 28% faster than the Intel 8145U's UHD 620 iGPU, IIRC. That's better, but not significantly so. The NVIDIA dGPU isn't great at the moment from what users report (it might be better with the 5.6/5.7 kernels, where reaching a proper power-saving state on Linux becomes achievable). The Ryzen 4000 laptop series looks quite nice; they seem to have made significant improvements in the one area where they performed poorly, battery life. In addition, the 5.5 (or was it 5.6?) kernel finally brings PSR support to the AMD GPU driver on Linux, something Intel has been benefiting from for many years now.


  • polarathene
    replied
    Originally posted by Michael
    FYI, will also be testing on some other Ivy Bridge / Haswell systems tomorrow, gotta see what I still have in the racks (or what I feel like assembling otherwise when digging through the CPU vault).
    Your article points out that Gen9+ doesn't show much regression, but in the earlier article raising awareness, user comments pointed out an uptick in wattage, which is a pretty big deal for battery life on laptops.

    Any chance of publishing an article that investigates that?


  • HEX0
    replied
    I really hope that whatever makes it into the mainline kernel has a kernel parameter to disable the mitigation.
    Since my desktop died I've been using a laptop with an i5-3317U, and it can play classic games and even some light modern games.

    I can still run The Sims 4 (via Wine) on medium/high, no AA, at around 30 FPS, as well as Mount & Blade: Warband.
    As well as lots of classic games (Diablo, Blade of Darkness, Spore, Mafia, Operation Flashpoint, etc.).

    This patch in its current form would render my laptop useless, and I can't upgrade for a few more years.
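For what it's worth, a switch along these lines did eventually land upstream: the i915 driver gained a `mitigations=` module parameter covering this Gen7 clear-residuals mitigation. The parameter name here is taken from later upstream kernels, so treat it as an assumption and check `modinfo i915` on your own kernel first. A minimal sketch of disabling it at boot via GRUB:

```shell
# /etc/default/grub -- hypothetical example: pass i915.mitigations=off
# on the kernel command line to turn the i915 mitigation off.
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash i915.mitigations=off"
```

After editing, regenerate the GRUB config (`update-grub` on Debian/Ubuntu) and reboot; on kernels that have the parameter, the active value can be read back from /sys/module/i915/parameters/mitigations.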


  • milkylainen
    replied
    It's a disease, I tell you. Now the Intel infection has spread to other hardware units.
    Besides impaling your Intel hardware and burning it on an open fire, I know of no cure.


  • HEX0
    replied
    Originally posted by birdie
    God, so much drama/fuss about nothing. Anything before Broadwell with Intel Iris is unsuitable even for light gaming (unless you're playing Adobe Flash-based games, in which case a GPU is not even needed), and Broadwell/Skylake U(HD) iGPUs are barely affected.

    Once I tried playing an Unreal Engine 3-based game on an Intel HD 520 (Skylake): I got 12 fps at 800x600 resolution and minimum graphics settings with AA turned off. Whoever says they are gaming on Intel HD is either lying or not really gaming; they run casual games with absolutely basic graphics. Intel HD graphics before Ice Lake cannot run Crysis at any resolution or settings with playable frame rates, and it's a game from 2007!

    If you're into light gaming you must have at the very least a Ryzen 3000/4000-based laptop (the 4000 series is yet to be released) or something running a GeForce MX250 as a bare minimum.
    Wow, such arrogance.
    I can even play The Sims 4 on an Ivy Bridge iGPU.


  • JMB9
    replied
    Originally posted by birdie
    It's interesting how AMD fanboyism has suddenly become a trend lately ...
    Maybe I read my words differently than others do.
    First of all, I am not referring to the iGPU mitigations alone; maybe that is the problem.
    If anything, I had been an Intel fanboy (many will say this in less than a second, I am sure), due to their superior technology right from the start with the 8086.
    Intel currently seems to have no solution at all, and even Skylake has problems similar to Haswell's, which ruins several games (yes, games are easier to use as tracers; and if I were referring to TeX, this would be the wrong forum, wouldn't it).
    10% would not matter on the desktop; what I see is crazy, no benchmark necessary.
    And of the mitigations, not the iGPU ones alone but also Spectre & Meltdown & friends, of course, I cannot say which is responsible, only that whenever a slowdown of 20%+ happened, there was always new firmware. I am aware of single mitigations with more than 10% performance loss. Maybe you were asleep when one mitigation after another came in and slowed the system down, and firmware makes sure you cannot test the decrease in performance ... a nice game where people are URGED to buy new HW.
    That was the game of the SW industry: programs so bad that you felt obliged to buy the next version, which "must be better" ...
    Ice Lake is not performant, and not even available at 35 W. It is laughable. Why did Intel use its first 8K-capable graphics when it doesn't even reach a 35 W TDP and cannot even handle Full HD as Haswell graphics could?
    I mean, before one mitigation after another comes for CPU & iGPU ... maybe NOW the current CPU & iGPU solution doesn't look that ugly in comparison with the systems slowed down by Linux mitigations and FW. Hooray!
    I am waiting for the first 1 kW CPU at 14+++++++++ nm by Intel ... that is the way it goes right now, isn't it.
    And I really hate to use a dGPU drawing up to 130 W when a system with a 35 W to 65 W TDP would do the job for me, but I am tired of waiting for AMD to release a Navi APU (which may not even be capable of 8K).
    Maybe if you had read other comments of mine you would know that I am not happy with AMD either, as I have been looking for a new PC for more than three years now!
    So no, I am not an AMD fanboy, but maybe that will change. If their systems keep their performance AND stay safe for at least six years, I am sure to buy AMD again, and THEN you can call me a fanboy if you want. NOT NOW.
    But what Intel does ... the owner of a system seems to have no say. One may think of Sony, where installing an upgrade loses you core functionality. And that is exactly the point on which RMS is right in every aspect.
    I even said that maybe I will end up using IBM POWER ... but I am currently a little hooked on x86. A little ...
    So I am against the tactics Intel is currently employing. I expected a total redesign in which no mitigations are necessary ... and it does not seem that they are even trying.
    And THAT IS EXACTLY what Linus Torvalds said. But NOW, who would really dare to bet that Intel will deliver? THEY HAVEN'T EVEN ANNOUNCED ANYTHING, not even a 10 nm desktop CPU.
    So no, I am no longer anyone's fanboy; I just look for good technology, and I am not pleased.
    But maybe I am the only one, and all other users think of a 10% performance reduction in total, which would not be visible at all: waiting 11 s instead of 10 s.
    Such ugly and insecure technology should not be sold for so many years.
    No more mitigations, no more spying technology, no more blobs: that would be the direction.
    I would even be content with a statement like "we will end all mitigations with new silicon four years after the issue's occurrence".
    And that seems not possible right now.
    Or "we will end up using MINIX inside CS/CSME with ...", but even this seems unwanted.
    Or something like "we will release the source code of the BLOBs of current HW five years later ..." might also qualify as creating faith in the technology and would be really welcome from an open-HW perspective.
    From my point of view I have been deprived, as a factor of ten has been lost: I am waiting on the cursor on a Haswell system that used to be fluid and fast even at 4K (even for easy games) ... that's gone. And a similar effect on Skylake, with at least a 30% loss in special situations.
    So even on Skylake the problem from Nov. to Dec. 2019 was easy to spot: 15%. Laughable; I would not even frown at that ... currently I AM SCREAMING !!!
    I would work on my unmitigated and not-firmware-poisoned Sandy Bridge system if it were capable of 4K. And no, this is NOT a joke !!! I wish it were ...


  • FPScholten
    replied
    Well, I can certainly notice the performance difference on my laptop with a Haswell i7-4700MQ processor when running with or without mitigations in place. The drop is very noticeable: for example, re-encoding a small HD (720p) video from my camera to a more compact size takes about 20% longer with full mitigations on.
    Starting applications takes a bit more time, and even booting takes a few seconds longer.
    However, normal everyday tasks like browsing, writing, or reading mail are hardly affected, but those tasks are generally limited not by the hardware but by the speed of the user...


  • birdie
    replied
    Originally posted by JMB9
    It is interesting that Intel sees no problem in taking away 90% of performance while still pretending to care about security.
    It's interesting how AMD fanboyism has suddenly become a trend lately, further exacerbated by an abundance of crazy false claims like this "take away 90% of performance" one. In the worst-case scenario the mitigations slow down certain tasks by no more than 40%. 90% would mean they become ten effing times slower. Unless you're running very narrow, specific workflows, all these mitigations slow your system down by less than 15%, which is nothing for most users.
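As a sanity check on that arithmetic (a throwaway sketch, not from the thread): losing N% of performance multiplies runtime by 1/(1 - N/100), so a 90% loss really would mean 10x runtimes, while a 40% loss is only about 1.7x:

```python
def runtime_multiplier(perf_loss_pct: float) -> float:
    """Factor by which runtime grows when performance drops by perf_loss_pct percent."""
    return 1.0 / (1.0 - perf_loss_pct / 100.0)

for loss in (15, 40, 90):
    print(f"{loss}% performance loss -> {runtime_multiplier(loss):.2f}x runtime")
# 15% -> 1.18x, 40% -> 1.67x, 90% -> 10.00x
```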

    Also, Ice/Whisky/Comet Lake CPUs are barely affected and they've been sold for over a year now.

    Originally posted by JMB9
    My Haswell with Intel i7-4770T is no longer usable since Nov./Dec. 2019
    Another portion of lies without any proof. A friend of mine has a laptop with a much weaker CPU, a Core i5-4250U, and he says he hasn't noticed any performance degradation in the last 24 months. His battery is dying, but other than that everything runs as it did when he purchased it.
    Last edited by birdie; 01-16-2020, 03:50 PM.
