Is there any performance cost to using this?
The Linux State Of AMD's Zen x86 Memory Encryption
-
Originally posted by darkbasic: Tired of this shit, why can't someone produce a fully open source CPU?
-
An obvious use case for this would be a VM (say, hosting a website) where the guest cannot trust the host. It would permit a website hosted on a commercial server to be inaccessible to the server's corporate owner or those snooping on them, if it works at all given the "possession = root" situation.
Let's admit it: no commercial web hosting or server space company can be trusted against warrants, national security letters, or just plain snooping for extra monetization the way Verizon does on their mobile Internet service. Thus the question becomes one of whether back doors can be kept out, and that is where the importance of auditable open source code comes in. As for DRM, that is simply a rather odd use of encryption in which an untrusted user must be given the keys anyway, which is why it always fails. The same might be true of encrypting RAM against your server's owner, but also like DRM it might slow down an attack for a week or two, possibly long enough for, say, a protest to take place and not be pre-empted.
-
Originally posted by schmidtbag: First of all, if you want a fully open source CPU, to my knowledge, that would be SPARC.
Second, until you yourself take advantage of opened code, or have solid proof that a feature like this is a hindrance to Linux, quit your whining.
But really, tell us, what will the community ever accomplish having this hardware-specific feature be released as open?
-
User key + chip key + a few rounds of hashing -> the key to actually access the memory. They deliberately left out the instruction to dump the key.
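The real SEV derivation happens inside the firmware and isn't published, but the "user key + chip key + a few rounds of hashing" idea can be sketched purely illustratively (the function name, round count, and SHA-256 choice are all assumptions for illustration, not AMD's actual scheme):

```python
import hashlib

def derive_memory_key(user_key: bytes, chip_key: bytes, rounds: int = 4) -> bytes:
    """Illustrative sketch only: mix a user-supplied key with a
    hardware-fused chip key through a few hash rounds. AMD's actual
    SEV key derivation is firmware-internal and unspecified."""
    state = user_key + chip_key
    for _ in range(rounds):
        state = hashlib.sha256(state).digest()
    return state

# Because the chip key never leaves the hardware, knowing the user key
# alone is not enough to recompute the memory encryption key.
key = derive_memory_key(b"user-secret", b"fused-chip-secret")
```

The point of leaving out any instruction to dump the key is that the derived value above only ever exists inside the memory controller, never in software-readable state.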
I can see this being useful against memory scrapers on the application level. malloc()ing until your [insert program with sensitive data in memory here] gets copied into swap space doesn't do an attacker any good if there's hardware-level encryption of that application's memory space. Or if there's another Heartbleed-type bug.
If this CPU fails and you've used this feature to encrypt your hard drive, you're up the creek without a paddle.
Oh, and GPUs aren't just for pushing pixels any more. They're powerful general-purpose computing devices in their own right. They're easily flexible enough to write malicious code for.
-
Originally posted by starshipeleven: A security system based on closed-source code isn't trustworthy. Really, this isn't a GPU where at most you render garbage on screen. This is security.
The question is whether this feature can actually be trusted to be useful for something other than DRM.
-
Originally posted by schmidtbag: A closed-source security system is much less likely to be hacked by intentionally malicious people,
Particularly when you've got 128 bytes involved on a piece of hardware that isn't easily accessible to the end user.
On the other hand, a closed security system increases the chance of something like an NSA backdoor.
If you're running an honest business and the NSA happens to hack in, then what? Nearly 99.9% of your client base will continue on with their day as though nothing happened. The ones who did get affected will probably have no way of knowing the NSA got their data from you.
Really, most hacking isn't destructive but only steals your customer's data in a completely deniable way, so why the fuck care?
Wrong reasoning: most businesses don't care about security, they care about plausible deniability.
A sticker saying Secure is enough for them, even if the implementation is a piece of shit.
Case in point: WPS (the wireless auto-bonding and auto-encrypting mode for wifi networks) has been broken for something like 4 years and lets anyone get in after a few days of sniffing and running algorithms, yet it's still there in all new products.
Don't forget - this is a very low-level form of security. If your security is so crappy that someone managed to reach this level, you deserve all the flak that comes your way.
Something else like what?
DRM is just a minor subset of security and one that does not really need to be secure at all (as in 99% of the cases the protocol and programs using this will be full of holes).
The only thing open-source drivers will accomplish is saving some time figuring out how to access the core in the first place. But considering people like the nouveau devs can reverse-engineer Nvidia's hardware, I'd imagine this would be significantly easier.
-
Originally posted by starshipeleven: Bullshit. A closed system is much less likely to get hacked by script kiddies, probably, but 99% of the hacking happens on closed-source systems.
It all comes down to incentive. Hacking at this scale is a business. If no one is using this feature then no one will hack it.
Which is totally irrelevant as a properly configured external firewall blocks any shit traffic if you really want to lock that down.
Wrong reasoning: most businesses don't care about security, they care about plausible deniability.
A sticker saying Secure is enough for them, even if the implementation is a piece of shit.
In another perspective, open source tells people, including hackers, exactly how something works. This allows determined hackers to quickly figure out security flaws, faster than people can patch against them. It wouldn't surprise me if things like SQL injection were discovered this way.
Like fucking security and sandboxing, its actual intended purpose.
DRM is just a minor subset of security and one that does not really need to be secure at all (as in 99% of the cases the protocol and programs using this will be full of holes).
He was talking about FIRMWARE, not drivers. This hardware runs a closed FIRMWARE, just like the NVIDIA GPUs that can't do shit unless NVIDIA releases the firmware blobs for them.
And you also just helped prove my point - Nvidia's closed firmware has "protected" them pretty well so far from the nouveau devs.
For the record, I'm not saying I favor this. I almost never prefer anything to be closed-source. All I'm saying is this isn't anywhere near as bad as people like you make it out to be.
-
How many keys can be stored in the ARM firmware?
Confidentiality of the guest is accomplished by encrypting memory with a memory encryption key that only the SEV firmware knows. The SEV management interface does not allow the memory encryption key—or any other secret SEV state—to be exported outside of the firmware without properly authenticating the recipient.
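On the Linux side, whether the host kernel's kvm_amd module has SEV enabled can be checked from sysfs. The sketch below assumes the standard kvm_amd `sev` module-parameter path; on machines without AMD SEV (or without kvm_amd loaded) the file simply doesn't exist, and the check conservatively reports False:

```python
from pathlib import Path

def sev_enabled(param: str = "/sys/module/kvm_amd/parameters/sev") -> bool:
    """Return True if the kvm_amd module reports SEV as enabled.

    The parameter file holds "1"/"Y" when SEV is on; it is absent
    entirely on hosts without AMD SEV support, which we treat as False."""
    try:
        return Path(param).read_text().strip() in ("1", "Y", "y")
    except OSError:
        return False

print(sev_enabled())
```

This only tells you the host advertises SEV; the per-guest memory encryption keys themselves stay inside the SEV firmware, as the quoted text says.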