The Performance Impact Of Spectre Mitigation On POWER9
Originally posted by Maddo: Not having a license doesn't mean you can't emulate a different architecture.
Originally posted by Weasel: That's what happens when you listen to clueless people on this forum instead of those who actually know what they're talking about.
In that article there's a quote from one of the developers on the penalties of full protection:

Tim Chen summed it up in the patches: "leaving STIBP on all the time is expensive for certain applications that have frequent indirect branches. One such application is perlbench in the SpecInt Rate 2006 test suite, which shows a 21% reduction in throughput. Other applications, like bzip2 in the same test suite with minimal indirect branches, have only a 0.7% reduction in throughput. IBPB will also impose overhead during context switches... Application-to-application exploit is in general difficult due to address space layout randomization in applications and the need to know an application's address space layout ahead of time. Users may not wish to incur performance overhead from IBPB and STIBP for general non-security-sensitive processes and use these mitigations only for security-sensitive processes. This patchset provides a process-property-based lite protection mode that applies IBPB and STIBP mitigation only to security-sensitive non-dumpable processes."
Originally posted by schmidtbag: Meaning what? Benchmarks showed that Intel had some pretty hefty performance hits from all of the mitigations. It's not a matter of opinion.
Originally posted by Weasel: I mean people who think that all the speculative execution exploits are Intel-only and keep on babbling about it, just like the Nvidia haters.