Intel's Mitigation For CVE-2019-14615 Graphics Vulnerability Obliterates Gen7 iGPU Performance


  • #51
    Originally posted by Michael View Post
    FYI, will also be testing on some other Ivy Bridge / Haswell systems tomorrow; gotta see what I still have in the racks (or what I feel like assembling otherwise when digging through the CPU vault).
    Your article points out that Gen9+ doesn't have much regression, but in the earlier article raising awareness, comments from users pointed out an uptick in wattage, which is a pretty big deal for battery life on laptops.

    Any chance of publishing an article that investigates that?

    Comment


    • #52
      Originally posted by wizard69 View Post
      I just see the trend toward difficult-to-replace batteries as a huge inconvenience for the consumer that just drives bad behavior.
      That's becoming less of an issue these days with power banks and USB-C-powered devices, albeit you'd then be lugging one around, which reduces the convenience of a laptop. The same can be done for laptop models powered by DC barrel (coaxial) plugs, but it's a bit more of a hassle (if going through USB-C/USB-PD to the DC barrel plug the laptop needs, that is).

      Originally posted by birdie View Post
      God, so much drama/fuss about nothing. Broadwell/SkyLake U(HD) iGPUs are barely affected.
      Not entirely true; even if performance appears barely regressed, the power draw is said to be much higher, which means a notable loss of battery life for laptop users.

      Originally posted by birdie View Post
      If you're into light gaming you must have at the very least a Ryzen 3000/4000 based laptop (the 4000 series is yet to be released) or something running a GeForce MX250 as a bare minimum.
      The Ryzen 3200U iGPU is only about 28% faster than the Intel 8145U's UHD 620 iGPU, IIRC. That's better, but not significantly so. The NVIDIA dGPU isn't really great at the moment from what users report (it might be better with the 5.6/5.7 kernel, where getting a proper power-saving state on Linux becomes achievable). The Ryzen 4000 series for laptops seems quite nice; it looks like they made significant improvements in the one area where they performed poorly: battery life. In addition, the 5.5 (or was it 5.6?) kernel finally brings PSR support to the AMD GPU driver on Linux, something Intel has been benefiting from for many years now.

      Comment


      • #53
        Originally posted by HEX0 View Post
        I really hope whatever makes it into the mainline kernel has a kernel parameter to disable the mitigation.
        Since my desktop died I've been using a laptop with an i5-3317U, and it can play classic games or even some light modern games.

        I can still run The Sims 4 (via Wine) on medium/high with no AA at around 30 FPS, as well as Mount & Blade: Warband,
        along with lots of classic games (Diablo, Blade of Darkness, Spore, Mafia, Operation Flashpoint, etc.).

        This patch in its current form would render my laptop useless, and I can't upgrade for a few more years.
        I'd also like to throw my hat into the ring of Intel graphics laptop gamers. I've got an i5-3337U, and it runs many games to my satisfaction. I'm regularly playing Neverwinter Nights EE without issue at native resolution (1600x900), as well as many older games on Wine, such as Nexus: The Jupiter Incident. Is it also clear whether this just affects OpenGL performance? What about things using VA-API, such as video decode?

        Regardless, I'm still incredibly happy with my laptop from 2013 and am going to be incredibly frustrated if this forces me to fork out money for a new one through no fault of my own. How the hell can we hold Intel liable for the continuing cascade of stuff-ups which are costing many people $$$?
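
        On the kernel-parameter question: a minimal sketch, assuming the toggle eventually lands as an i915 module parameter (the name "mitigations" below is purely an assumption for illustration), of how one could check from userspace whether such a switch is present and what it is set to:

            #!/usr/bin/env python3
            # Sketch: look for a hypothetical i915 mitigation toggle from userspace.
            # The parameter name "mitigations" is an assumption; whatever lands in
            # mainline may be spelled differently or may not exist at all.
            from pathlib import Path

            PARAM = "mitigations"  # assumed i915 module parameter name

            # 1. Was anything passed on the kernel command line?
            cmdline = Path("/proc/cmdline").read_text().strip()
            print("kernel cmdline:", cmdline)
            print(f"i915.{PARAM} on cmdline:", f"i915.{PARAM}" in cmdline)

            # 2. Does the loaded i915 module expose it via sysfs?
            params_dir = Path("/sys/module/i915/parameters")
            if params_dir.is_dir():
                for p in sorted(params_dir.iterdir()):
                    if PARAM in p.name:
                        try:
                            value = p.read_text().strip()
                        except PermissionError:
                            value = "<not readable without root>"
                        print(f"{p.name} = {value}")
            else:
                print("i915 module parameters not found (driver not loaded?)")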

        Comment


        • #54
          I've just updated to a freshly pushed kernel (presumably one implementing the mitigations for this vulnerability) and am now seeing constant GUI soft-locks on an i7-8559U (Iris Plus 655, IIRC), with corresponding complaints in dmesg.

          Hopefully this will be solved with another tweak (I can live with some slowdown; I can't live with regular soft-locks), but so far I'm far from happy.
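
          If anyone wants to compare notes, here's a small sketch for pulling the i915-related lines out of the kernel log (it just runs dmesg via subprocess, which may need root or kernel.dmesg_restrict=0; the hang/reset keywords are common i915 failure phrases, not an official list):

              #!/usr/bin/env python3
              # Sketch: surface i915 lines from the kernel ring buffer, hang/reset lines first.
              import subprocess

              out = subprocess.run(["dmesg"], capture_output=True, text=True, check=True)
              i915_lines = [l for l in out.stdout.splitlines() if "i915" in l]

              # Likely-relevant lines (hangs/resets) first, then everything else.
              keywords = ("HANG", "hang", "Reset", "reset")
              suspicious = [l for l in i915_lines if any(k in l for k in keywords)]
              for line in suspicious + [l for l in i915_lines if l not in suspicious]:
                  print(line)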

          Comment


          • #55
            Originally posted by HEX0 View Post

            Wow, such arrogance
            I can even play Sims 4 on Ivy Bridge GPU
            Ages-old games which can barely be called "3D"? Yeah, right, exactly what I said earlier: these are the kind of games which can be run on the CPU entirely.

            Comment


            • #56
              Originally posted by caligula View Post
              What the heck, there are gamers who run Steam on machines with Ivy Bridge / Haswell iGPUs?
              Yes indeed.
              Many indie games work OK (even great) on the HD 4600 (The Witness, for example).

              Even commercial games (Team Fortress 2, Dragonball Budokai Fighters, Rocksmith, Dirt 3, the Batman games) work perfectly on the HD 4600 (not in full HD, but HD is okay).

              Moreover, the HD 4600 lets you have a sufficient framerate while maintaining some battery life (I can play Dragon Ball for more than 2 hours, and my battery is 5 years old). I can even enable VSync since my laptop allows a 40 Hz refresh rate.

              You need to have dual-channel and possibly dual-rank memory (dual rank is definitely an improvement on the HD 620, but I can't tell for the HD 4600 since I don't have single-rank memory).

              Comment


              • #57
                Originally posted by polarathene View Post

                Your article points out that Gen9+ doesn't have much regression, but in the earlier article raising awareness, comments from users pointed out an uptick in wattage, which is a pretty big deal for battery life on laptops.

                Any chance of publishing an article that investigates that?
                I ran some tests on Whiskey Lake yesterday and found no change in power draw on battery.
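
                For anyone wanting to repeat that kind of check on their own laptop across kernels, a rough sketch (the battery name BAT0 and the exact sysfs fields vary between machines; some batteries expose power_now directly, others only current_now/voltage_now):

                    #!/usr/bin/env python3
                    # Sketch: sample battery power draw from sysfs while on battery.
                    # BAT0 and the available fields differ between laptops.
                    import time
                    from pathlib import Path

                    BAT = Path("/sys/class/power_supply/BAT0")  # adjust for your machine

                    def read_watts() -> float:
                        power_now = BAT / "power_now"  # microwatts, if the battery exposes it
                        if power_now.exists():
                            return int(power_now.read_text()) / 1_000_000
                        # Fall back to microamps * microvolts
                        ua = int((BAT / "current_now").read_text())
                        uv = int((BAT / "voltage_now").read_text())
                        return ua * uv / 1e12

                    samples = []
                    for _ in range(30):  # roughly one minute at 2-second intervals
                        samples.append(read_watts())
                        time.sleep(2)

                    avg = sum(samples) / len(samples)
                    print(f"average draw: {avg:.2f} W (min {min(samples):.2f}, max {max(samples):.2f})")
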
                Michael Larabel
                https://www.michaellarabel.com/

                Comment


                • #58
                  Originally posted by birdie View Post
                  Ages-old games which can barely be called "3D"?
                  The Sims 4 targeted the Xbox One / PS4 console generation. I've played it. Sure, it's cartoonish, but it's true high-quality 3D. There's no way this could run on a CPU at acceptable FPS. It's a testament to Wine and the Intel Linux drivers that it runs so well for him.

                  Comment


                  • #59
                    Originally posted by birdie View Post

                    Ages-old games which can barely be called "3D"? Yeah, right, exactly what I said earlier: these are the kind of games which can be run on the CPU entirely.
                    Regardless, any more performance degradation to the already weak Gen7 Intel iGPUs would make certain barely playable games completely unplayable.
                    And software rendering, even with 80 cores, is worse than an Ivy Bridge GPU.

                    Comment


                    • #60
                      Phoronix got quoted: https://hexus.net/tech/news/graphics...x-performance/

                      Being fast to report Linux news seems to have led to the impression that MS-Windows isn't affected. I wonder how many people will be in for a surprise.

                      Looking at the sheer number of Intel CPUs with Gen7 graphics, and guessing how many systems were shipped, are still in use, and possibly aren't even receiving updates any longer, this could mean trouble.


                      Comment
