Intel's Mitigation For CVE-2019-14615 Graphics Vulnerability Obliterates Gen7 iGPU Performance

  • #41
Is there a way to sidestep this vulnerability by ensuring only one OpenGL workload is running on the GPU, minimizing context switches? If so, running a single game with no compositor could yield massive performance gains over running both.
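On that note, here is a minimal sketch for checking how many processes currently hold a DRM device node open (an assumption of mine for approximating "one GPU workload" - it only counts /dev/dri users and says nothing about how often the kernel actually switches contexts, or whether the mitigation could then be skipped):

Code:
#!/usr/bin/env python3
# Count processes that currently hold a /dev/dri/* device node open.
# Run as root to see all processes; /proc/<pid>/fd is otherwise restricted.
import os
from pathlib import Path

clients = set()
for fd_dir in Path("/proc").glob("[0-9]*/fd"):
    try:
        for fd in fd_dir.iterdir():
            if os.readlink(fd).startswith("/dev/dri/"):
                clients.add(fd_dir.parent.name)  # PID as string
    except OSError:
        continue  # process exited, or not ours to inspect

print(f"{len(clients)} GPU client(s): {sorted(clients, key=int)}")

With a compositor running you would typically see at least two clients (compositor plus game), which is exactly the cross-context situation the mitigation exists for.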



    • #42
      Originally posted by uxmkt View Post
      Sure. Just because one is a "gamer" does not mean one is always seeking to run titles requiring AAA+ hardware. Some gamers are quite happy with playing (3D) classics or 2D games like platformers and puzzles.
Um, there's a "tiny" performance gap between an Ivy Bridge iGPU and the requirements of modern "AAA+" games. Even a simple $100 discrete GPU (heck, even the $150 AMD Ryzen 5 2400G, which includes the CPU) is far superior - probably ten times faster than those crappy old iGPUs - and still way slower than $500 cards. Playing some 2D platformers every now and then does not make you a gamer.



      • #43
God, so much drama and fuss over nothing. Anything before Broadwell with Intel Iris is unsuitable even for light gaming (unless you're playing Adobe Flash based games, in which case a GPU isn't even needed), and Broadwell/Skylake U(HD) iGPUs are barely affected.

Once I tried playing an Unreal Engine 3 based game on an Intel HD 520 (Skylake) - I got 12 fps at 800x600 with minimum graphics settings and AA turned off. Whoever says they are gaming on Intel HD is either lying or not really gaming - they run casual games with absolutely basic graphics. Intel HD graphics before Ice Lake cannot run Crysis at any resolution or settings with playable frame rates, and it's a game from 2007!

If you're into light gaming, the bare minimum is a Ryzen 3000/4000 based laptop (the 4000 series is yet to be released) or something with a GeForce MX250.
        Last edited by birdie; 16 January 2020, 03:40 PM.



        • #44
It is interesting that Intel sees no problem in taking away 90% of performance while still pretending to care about security.
From my point of view, Intel management has lost the plot. First of all, Intel has NO desktop CPU that does not require mitigations ... after all these years.
Second, they do not even offer their customers substantial rebates on new systems after having caused these problems through gross negligence.
My Haswell system with an Intel i7-4770T has been effectively unusable since Nov./Dec. 2019 - and it is still my main system for lack of a real replacement - and hey, I am really glad not to have bought a newer Intel system: I skipped Skylake due to Linux problems and came close to buying Coffee Lake Refresh, which still has essentially Haswell-era graphics (DP 1.2, 4K maximum resolution; mitigations still necessary) - and the same will hold for Comet Lake.
I am now set to buy a Ryzen 5 3600 (Zen 2) and a Radeon RX 5500 XT (RDNA, Navi 14) to have a performant system ready for 8K (waiting for Kubuntu/Lubuntu 20.04 LTS to get kernel 5.5 and Mesa 20.0 - or a 19.3, which seems less likely ...).
Since 1987 I have never bought CPUs other than Intel's (really a lot of systems) - and given their current behaviour I may finally end up buying IBM POWER with AMD graphics someday if things stay this way. That was my first professional work in IT - so it would be back to the roots.
We still have the CSME (and on AMD's side the PSP) as a security nightmare and a symbol of extreme madness.
Just think of an unpatched MINIX system running with more privileges than the Linux kernel. That's the security Intel is so proud of - ouch!
Words can hardly describe what I want to express.
Just to get a little work done I have switched off the mitigations altogether - but one should also be able to revoke the firmware disablements of the last years.
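(For anyone wanting to check what is active before flipping that switch: current kernels report per-vulnerability mitigation status under sysfs. A minimal sketch, assuming the standard Linux sysfs layout:

Code:
#!/usr/bin/env python3
# Print the kernel's own report of each CPU vulnerability and its mitigation.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    # Each file holds one line, e.g. "Mitigation: PTI" or "Not affected".
    print(f"{entry.name}: {entry.read_text().strip()}")

Booting with mitigations=off on the kernel command line disables the CPU-side mitigations globally; as far as I know, the i915 Gen7 fix discussed in this article is a separate GPU-side change and is not covered by that switch.)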
I worked as a Senior Systems Engineer and Unix consultant for more than ten years - I hardened servers for banks and provided security audits.
Intel does not even try to admit its faults or help its users - and like Apple and Microsoft, they make more money with worse products.
And Google talks about a `Moonshot' for some of that patching - which is useless as far as hardening is concerned; it only makes the many open doors harder to see.
If anyone argued that current x86 server technology puts a high barrier in front of anyone trying to take control of it, I would just smile and walk away.
We will all have to make decisions ... and this game is no longer funny.
I would not trust any corporation - and I lost my trust in the FSF after they pushed out RMS - so what is next?
This world has gone absolutely crazy ... and maybe we must build up a community with more influence than the biggest IT companies to resolve this.
With people on the board whom one could put faith in ... and those are extremely rare at the moment.
Or we admit that we have all lost and just give up ...
But we have never been further from control of our hardware and software than we are today - that may be good for secret services (being the cause of terrorist actions all over the planet) and other criminal organizations, but it is a killer of democracy - and no one can neglect it any more!



          • #45
            Originally posted by JMB9 View Post
It is interesting that Intel sees no problem in taking away 90% of performance while still pretending to care about security.
It's interesting how AMD fanboyism has suddenly become a trend lately, further exacerbated by an abundance of crazy false claims like this "take away 90% of performance" one. In the worst-case scenario the mitigations slow down certain tasks by no more than 40%. 90% would mean they become ten effing times slower. Unless you're running very specific, narrow workloads, all these mitigations slow down your system by less than 15%, which is nothing for most users.
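(To spell out the arithmetic: losing a fraction p of performance leaves throughput at 1 - p of the original, so the slowdown factor is 1/(1 - p). For p = 0.9 that is 1/0.1 = 10x slower; for p = 0.4 it is 1/0.6 ≈ 1.7x.)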

Also, Ice/Whiskey/Comet Lake CPUs are barely affected, and they've been on sale for over a year now.

            Originally posted by JMB9 View Post
My Haswell system with an Intel i7-4770T has been effectively unusable since Nov./Dec. 2019
Another helping of lies without any proof. A friend of mine has a laptop with a much weaker CPU, a Core i5-4250U, and he says he hasn't noticed any performance degradation in the last 24 months. His battery is dying, but other than that everything runs just as it did when he bought it.
            Last edited by birdie; 16 January 2020, 03:50 PM.



            • #46
Well, I can certainly notice the performance difference on my laptop with a Haswell i7-4700MQ when running with or without the mitigations in place. The drop is very noticeable: re-encoding a small HD (720p) video from my camera to a more compact size, for example, takes about 20% longer with full mitigations on.
Starting applications takes a bit more time, and even booting takes a few seconds longer.
Normal everyday tasks like browsing, writing or reading mail are hardly affected, though - but those tasks are generally limited not by the hardware but by the speed of the user ...
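If you want to quantify a difference like that yourself, here is a minimal sketch that times one fixed transcode job (the ffmpeg command line and file names are placeholders of mine, not from the post - substitute your own workload):

Code:
#!/usr/bin/env python3
# Time one fixed transcode so runs with/without mitigations can be compared.
import subprocess
import time

# Hypothetical command and file names - substitute your own workload.
cmd = ["ffmpeg", "-y", "-i", "input_720p.mp4",
       "-c:v", "libx264", "-crf", "28", "output.mp4"]

start = time.perf_counter()
subprocess.run(cmd, check=True, capture_output=True)
print(f"encode took {time.perf_counter() - start:.1f} s")

Run it once on the current kernel, reboot with mitigations=off, run it again; the ratio of the two times is the overhead for that particular workload.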



              • #47
                Originally posted by birdie View Post
It's interesting how AMD fanboyism has suddenly become a trend lately ...
Maybe I read my words differently than others do.
First of all, I am not referring to the iGPU mitigations alone - maybe that is the misunderstanding.
If anything, I used to be an Intel fanboy (many here will say so in less than a second, I am sure) - but that was due to their superior technology, right from the start with the 8086.
Intel currently seems to have no solution at all - and even Skylake has problems similar to Haswell's, which ruins several games (yes, games make for easy tracers - and if I referred to TeX instead, this would be the wrong forum, wouldn't it).
10% would not matter on the desktop - what I see is crazy - no benchmark necessary.
And as for mitigations - not the iGPU one alone, but also Spectre, Meltdown and friends, of course - I cannot say which one is responsible, only that whenever a slowdown of 20%+ appeared, there had always been a new firmware. I am aware of single mitigations costing more than 10% of performance. Maybe you were asleep while one mitigation after another came in and slowed the system down - and firmware updates make sure you cannot measure the decrease ... a nice game in which people are urged to buy new hardware.
That used to be the software industry's game - programs so bad that you felt obliged to buy the next version, since it "must be better" ...
Ice Lake is not performant - it is not even available at 35 W. It is laughable. Why did Intel use its first 8K-capable graphics when it cannot even reach a 35 W TDP and cannot even handle FullHD the way Haswell graphics could?
I mean, before one mitigation after the other arrives for CPU & iGPU ... maybe RIGHT NOW the current CPU & iGPU lineup doesn't look that ugly compared with the older systems slowed down by Linux mitigations and firmware. Hooray!
I am waiting for the first 1 kW CPU on 14+++++++++ nm from Intel ... that is the way things are going right now, isn't it.
And I really hate to use a dGPU drawing up to 130 W when a system with a 35-65 W TDP would do the job for me - but I am tired of waiting for AMD to release a Navi APU (which may not even be capable of 8K).
Maybe if you had read my other comments you would know that I am not happy with AMD either, as I have been looking for a new PC for more than three years now!
So no, I am not an AMD fanboy - but maybe that will change. If that system keeps its performance AND stays safe for at least six years, I will certainly buy AMD again, and THEN you can call me a fanboy if you want. NOT NOW.
But what Intel is doing ... the owner of a system seems to have no say at all. One may think of Sony, where installing an update cost you core functionality. And that is exactly the point on which RMS was right in every respect.
I even said that I may end up using IBM POWER ... but I am currently a little hooked on x86. A little ...
So yes, I am against the tactics Intel is currently employing. I expected a full redesign in which no mitigations are necessary ... and it does not seem they are even trying.
And THAT IS EXACTLY what Linus Torvalds said. But NOW - who would dare bet that Intel will deliver? THEY DO NOT EVEN ANNOUNCE ANYTHING - not even a 10 nm desktop CPU.
So no, I am no longer anyone's fanboy - I just look for good technology - and I am not pleased.
But maybe I am the only one, and all other users see a 10% performance reduction in total - which would not be visible at all: waiting 11 s instead of 10 s.
Such ugly and insecure technology should not have been sold for so many years.
No more mitigations, no more spying technology, no more blobs - that would be the right direction.
I would even be content with a statement like "we will end all mitigations with new silicon four years after their discovery".
And even that seems impossible right now.
Or "we will stop using MINIX inside the CSME" ... but even that seems unwanted.
Or something like "we will release the source code of the blobs of current hardware five years later ..." would also qualify as building faith in the technology, and would be very welcome from an open-hardware perspective.
From my point of view I have been deprived - it is a factor of ten lost. I am waiting on the cursor on a Haswell system that used to be fluid and fast even at 4K (even for light games) ... that is gone. And a similar effect on Skylake, with at least a 30% loss in particular situations.
So even on Skylake the problem from Nov. to Dec. 2019 was easy to spot. 15% would be laughable - I would not even frown at that ... right now I AM SCREAMING !!!
I would work on my unmitigated, not-firmware-poisoned Sandy Bridge system if it were capable of 4K. And no, that is NOT a joke !!! I wish it were ...



                • #48
                  Originally posted by birdie View Post
God, so much drama and fuss over nothing. Anything before Broadwell with Intel Iris is unsuitable even for light gaming ...
Wow, such arrogance.
I can even play The Sims 4 on an Ivy Bridge iGPU.


                  • #49
It's a disease, I tell you. Now the Intel infection has spread to other hardware units.
Besides impaling your Intel hardware and burning it on an open fire, I know of no cure.



                    • #50
I really hope whatever makes it into the mainline kernel has a kernel parameter to disable the mitigation.
Since my desktop died I've been using a laptop with an i5-3317U, and it can play classic games and even some light modern games.

I can still run The Sims 4 (via Wine) on medium/high, no AA, at around ~30 FPS, as well as Mount & Blade: Warband.
Plus lots of classic games (Diablo, Blade of Darkness, Spore, Mafia, Operation Flashpoint, etc. ...).

This patch in its current form would render my laptop useless, and I can't upgrade for a few more years.
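Whether the final patch ships with such a switch is up to the i915 maintainers. In the meantime, here is a minimal sketch (assuming the standard sysfs layout) for listing whatever runtime parameters your running i915 module actually exposes, so you can see what is tunable once the mitigation lands:

Code:
#!/usr/bin/env python3
# List the runtime parameters the loaded i915 module exposes via sysfs.
from pathlib import Path

param_dir = Path("/sys/module/i915/parameters")
if not param_dir.is_dir():
    raise SystemExit("i915 not loaded (or it exposes no parameters)")

for entry in sorted(param_dir.iterdir()):
    try:
        value = entry.read_text().strip()
    except PermissionError:
        value = "<readable by root only>"
    print(f"{entry.name} = {value}")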

