The Linux State Of AMD's Zen x86 Memory Encryption


  • #11
    Is there any performance cost to using this?

    Comment


    • #12
      Originally posted by darkbasic View Post
      Tired of this shit, why can't someone produce a fully open source cpu?
      And I'm tired of hearing people like you moan about this subject every week. First of all, if you want a fully open source CPU, to my knowledge, that would be SPARC. Second, until you yourself take advantage of opened code, or have solid proof that a feature like this is a hindrance to Linux, quit your whining. You have yet to give a solid reason for wanting something to be open source. But really, tell us: what will the community ever accomplish by having this hardware-specific feature released as open source? This is a platform-specific, CPU-bound feature. Unless it ends up causing kernel issues (like fglrx did), modifying the code or recompiling it for other platforms will accomplish nothing.

      Comment


      • #13
        An obvious use case for this would be a VM (say, one hosting a website) where the guest cannot trust the host. It would permit a website hosted on a commercial server to be inaccessible to the server's corporate owner or to those snooping on them, if it works at all given the "possession = root" situation.

        Let's admit it: no commercial web hosting or server space company can be trusted against warrants, national security letters, or just plain snooping for extra monetization, the way Verizon does on their mobile Internet service. Thus the question becomes whether back doors can be kept out, and that is where the importance of auditable open-source code comes in. As for DRM, that is simply a rather odd use of encryption in which an untrusted user must be given the keys anyway, which is why it always fails. The same might be true of encrypting RAM against your server's owner, but, also like DRM, it might slow down an attack for a week or two, possibly long enough for, say, a protest to take place without being pre-empted.

        Comment


        • #14
          Originally posted by schmidtbag View Post
          First of all, if you want a fully open source CPU, to my knowledge, that would be SPARC.
          More like RISC-V. And before you complain, modern SPARC isn't terribly easy to get either.

          Second, until you yourself take advantage of opened code, or have solid proof that a feature like this is a hindrance to Linux, quit your whining.
          A security system based on closed-source code isn't trustworthy. Really, this isn't a GPU, where at worst you render garbage on screen. This is security.

          But really, tell us: what will the community ever accomplish by having this hardware-specific feature released as open source?
          That this feature can actually be trusted to be useful for something other than DRM.

          Comment


          • #15
            User key + chip key + a few rounds of hashing -> the key to actually access the memory. They deliberately left out the instruction to dump the key.
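
            A minimal sketch of that style of key derivation, purely illustrative: AMD has not published the actual algorithm, and the round count, use of SHA-256, and every name below are assumptions.

            ```c
            /* Illustrative only: derive a memory key from a user key plus a
             * per-chip key with a few rounds of hashing. NOT AMD's published
             * algorithm. Build: cc kdf.c -lcrypto */
            #include <openssl/sha.h>
            #include <stdio.h>
            #include <string.h>

            #define ROUNDS 1000  /* arbitrary; a real design would use a vetted KDF */

            static void derive_memory_key(const unsigned char user_key[32],
                                          const unsigned char chip_key[32],
                                          unsigned char out[SHA256_DIGEST_LENGTH])
            {
                unsigned char buf[64];
                memcpy(buf, user_key, 32);        /* concatenate the user-supplied key */
                memcpy(buf + 32, chip_key, 32);   /* with the fused, never-exported chip key */
                SHA256(buf, sizeof(buf), out);    /* round 1 over the concatenation */
                for (int i = 1; i < ROUNDS; i++)  /* then keep re-hashing the digest */
                    SHA256(out, SHA256_DIGEST_LENGTH, out);
            }

            int main(void)
            {
                unsigned char user[32] = "example user key";
                unsigned char chip[32] = "example per-chip key";
                unsigned char key[SHA256_DIGEST_LENGTH];

                derive_memory_key(user, chip, key);
                for (size_t i = 0; i < sizeof key; i++)
                    printf("%02x", key[i]);
                putchar('\n');
                return 0;
            }
            ```

            The point is the last step that isn't there: no instruction exists to read the derived key (or the chip key) back out.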
            I can see this being useful against memory scrapers at the application level. malloc()ing until your [insert program with sensitive data in memory here] gets copied into swap space doesn't do an attacker any good if there's hardware-level encryption of that application's memory space. Or if there's another Heartbleed-type bug.
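
            For contrast, the purely software-side defense against that swap trick looks something like the sketch below (buffer size and the zeroing barrier are illustrative):

            ```c
            /* What an application has to do today to keep a secret out of swap:
             * pin the pages and scrub them before freeing. Hardware memory
             * encryption would cover the cases this can't. Build: cc pin.c */
            #include <sys/mman.h>
            #include <stdio.h>
            #include <stdlib.h>
            #include <string.h>

            int main(void)
            {
                size_t len = 4096;                  /* illustrative buffer size */
                unsigned char *secret = malloc(len);
                if (!secret)
                    return 1;

                if (mlock(secret, len) != 0) {      /* pin: pages can't be swapped out */
                    perror("mlock");                /* fails if RLIMIT_MEMLOCK is too low */
                    free(secret);
                    return 1;
                }

                /* ... fill `secret` and use it ... */

                memset(secret, 0, len);             /* scrub before release */
                __asm__ volatile("" ::: "memory");  /* keep the memset from being elided */
                munlock(secret, len);
                free(secret);
                return 0;
            }
            ```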
            If this CPU fails and you've used this feature to encrypt your hard drive, you're up the creek without a paddle.

            Oh, and GPUs aren't just for pushing pixels any more. They're powerful general-purpose computing devices in their own right. They're easily flexible enough to write malicious code for.

            Comment


            • #16
              Originally posted by starshipeleven View Post
              A security system based on closed-source code isn't trustworthy. Really, this isn't a GPU, where at worst you render garbage on screen. This is security.
              Depends on how you look at it. A closed-source security system is much less likely to be hacked by intentionally malicious people, particularly when a 128-bit key is involved on a piece of hardware that isn't easily accessible to the end user. On the other hand, a closed security system increases the chance of something like an NSA backdoor. If you're running an honest business and the NSA happens to hack in, then what? In most cases, 99.9% of your client base will continue on with their day as though nothing happened. Of the ones who were affected, they will probably have no way to know the NSA got their data from you. Don't forget - this is a very low-level form of security. If your security is so crappy that someone managed to reach this level, you deserve all the flak that comes your way.

              That this feature can actually be trusted to be useful for something other than DRM.
              Something else like what? If the hardware is tweaked in a way that it can only fulfill its intended purpose, there's not a whole lot else you could make it do, kind of like those Bitcoin ASICs. If AMD didn't screw with it too much, then it's just a simple ARM processor; it wouldn't surprise me if you could just take the big.LITTLE drivers, make some tweaks, and then use that core for whatever you want. The only thing open-source drivers will accomplish is saving some time figuring out how to access the core in the first place. But considering people like the nouveau devs can reverse-engineer Nvidia's hardware, I'd imagine this would be significantly easier.

              Comment


              • #17
                Cool stuff from AMD

                Comment


                • #18
                  Originally posted by schmidtbag View Post
                  A closed-source security system is much less likely to be hacked by intentionally malicious people,
                  Bullshit. A closed system is much less likely to get hacked by script kiddies, perhaps, but 99% of hacking happens on closed-source systems.
                  Particularly when a 128-bit key is involved on a piece of hardware that isn't easily accessible to the end user.
                  It all comes down to incentive. Hacking at this scale is a business. If no one is using this feature, then no one will hack it.

                  On the other hand, a closed security system increases the chance of something like an NSA backdoor.
                  Which is totally irrelevant, as a properly configured external firewall blocks any shit traffic if you really want to lock that down.

                  If you're running an honest business and the NSA happens to hack in, then what? In most cases, 99.9% of your client base will continue on with their day as though nothing happened. Of the ones who were affected, they will probably have no way to know the NSA got their data from you.
                  That's the same bullshit reasoning that explains why most places still run XP or its server version.
                  Really, most hacking isn't destructive; it just steals your customers' data in a completely deniable way, so why the fuck care?
                  Wrong reasoning: most businesses don't care about security, they care about plausible deniability.
                  A sticker saying "Secure" is enough for them, even if the implementation is a piece of shit.
                  Case in point: WPS (Wi-Fi Protected Setup, the auto-pairing and auto-encrypting mode for wifi networks) has been broken for something like four years now and lets anyone get in after a few days of sniffing and running algorithms, yet it's still there in all new products.

                  Don't forget - this is a very low-level form of security. If your security is so crappy that someone managed to reach this level, you deserve all the flak that comes your way.
                  I don't give a shit; something that calls itself a "security feature" should be worthy of trust, not be just another piece of powered silicon wasting energy in the SoC.

                  Something else like what?
                  Like fucking security and sandboxing, its actual intended purpose.
                  DRM is just a minor subset of security, and one that does not really need to be secure at all (as in 99% of cases the protocols and programs using this will be full of holes).

                  The only thing open-source drivers will accomplish is saving some time figuring out how to access the core in the first place. But considering people like the nouveau devs can reverse-engineer Nvidia's hardware, I'd imagine this would be significantly easier.
                  He was talking about FIRMWARE, not drivers. This hardware runs a closed FIRMWARE, just like the NVIDIA GPUs that can't do shit unless NVIDIA releases the firmware blobs for them.

                  Comment


                  • #19
                    Originally posted by starshipeleven View Post
                    Bullshit. A closed system is much less likely to get hacked by script kiddies, perhaps, but 99% of hacking happens on closed-source systems.
                    It all comes down to incentive. Hacking at this scale is a business. If no one is using this feature, then no one will hack it.
                    You seem to be forgetting this involves a hardware layer. It's not hard to hack a serial key for Windows, for example. It is hard to hack 128-bit encryption on what may be a unique piece of embedded hardware. I'm not saying it's impossible, but it took years for the nouveau people to even get reclocking done properly, and it's not like Nvidia was trying very hard to prevent it.
                    Which is totally irrelevant, as a properly configured external firewall blocks any shit traffic if you really want to lock that down.
                    Exactly my point... If your firewall is good enough, you shouldn't have to worry about people who can also manage to hack through your memory encryption in a timely manner.

                    Wrong reasoning: most businesses don't care about security, they care about plausible deniability.
                    A sticker saying "Secure" is enough for them, even if the implementation is a piece of shit.
                    Do you not see the hypocrisy in your comments? You say a sticker reading "Secure" is enough for businesses, and yet you (who, I assume, do not run a business or an IT department) are here whining that a security feature targeted at enterprises is theoretically less secure simply because it isn't open source.
                    From another perspective, open source tells people, including hackers, exactly how something works. That lets determined hackers figure out security flaws quickly, faster than people can patch against them. It wouldn't surprise me if things like SQL injection were discovered this way.
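
                    (Aside, since the term came up: SQL injection just means user input spliced into a query as text instead of being bound as a parameter. A minimal sketch using SQLite's C API, purely illustrative; table and values are invented:)

                    ```c
                    /* Illustrative SQL injection: string-spliced query vs. bound
                     * parameter. Build: cc sqli.c -lsqlite3 */
                    #include <sqlite3.h>
                    #include <stdio.h>

                    static void lookup(sqlite3 *db, const char *user_input)
                    {
                        char query[256];

                        /* VULNERABLE: input like  x' OR '1'='1  rewrites the WHERE clause */
                        snprintf(query, sizeof(query),
                                 "SELECT secret FROM users WHERE name = '%s';", user_input);
                        sqlite3_exec(db, query, NULL, NULL, NULL);

                        /* SAFE: bound input stays data, never becomes SQL syntax */
                        sqlite3_stmt *stmt;
                        if (sqlite3_prepare_v2(db, "SELECT secret FROM users WHERE name = ?;",
                                               -1, &stmt, NULL) == SQLITE_OK) {
                            sqlite3_bind_text(stmt, 1, user_input, -1, SQLITE_STATIC);
                            while (sqlite3_step(stmt) == SQLITE_ROW)
                                printf("%s\n", (const char *)sqlite3_column_text(stmt, 0));
                            sqlite3_finalize(stmt);
                        }
                    }

                    int main(void)
                    {
                        sqlite3 *db;
                        sqlite3_open(":memory:", &db);
                        sqlite3_exec(db, "CREATE TABLE users(name TEXT, secret TEXT);"
                                         "INSERT INTO users VALUES('alice','s3cret');",
                                     NULL, NULL, NULL);
                        lookup(db, "alice");          /* well-behaved input */
                        lookup(db, "x' OR '1'='1");   /* injection attempt */
                        sqlite3_close(db);
                        return 0;
                    }
                    ```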

                    Like fucking security and sandboxing, its actual intended purpose.
                    DRM is just a minor subset of security, and one that does not really need to be secure at all (as in 99% of cases the protocols and programs using this will be full of holes).
                    But if this ends up being hackable, what's stopping people from just going their own route and improving upon it in their own way?

                    He was talking about FIRMWARE, not drivers. This hardware runs a closed FIRMWARE, just like the NVIDIA GPUs that can't do shit unless NVIDIA releases the firmware blobs for them.
                    What difference does it make whether it's firmware or drivers? If the firmware is permanent, it doesn't matter whether it's open source, because you can't change it anyway. If it can be overwritten and someone manages to figure out how to hack it, then again, what's stopping people from creating their own improvements?
                    And you also just helped prove my point - Nvidia's closed firmware has "protected" them pretty well so far from the nouveau devs.

                    For the record, I'm not saying I favor this. I almost never prefer anything to be closed source. All I'm saying is that this isn't anywhere near as bad as people like you make it out to be.

                    Comment


                    • #20
                      How many keys can be stored in the ARM firmware?

                      Confidentiality of the guest is accomplished by encrypting memory with a memory encryption key that only the SEV firmware knows. The SEV management interface does not allow the memory encryption key—or any other secret SEV state—to be exported outside of the firmware without properly authenticating the recipient.
                      Which means there is a hard limit on the number of VMs that can be assigned. Also, VMs have to be hard-locked to a CPU, right? It seems you can migrate, at least with PDH_CERT_EXPORT, but how this is done in practice between two servers over a network is unclear; maybe I'm imagining complexity.
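
                      For what it's worth, later kernels expose the SEV firmware through /dev/sev, and exporting the PDH certificate (the piece a migration peer authenticates) looks roughly like the sketch below, assuming the SEV_ISSUE_CMD ioctl interface from <linux/psp-sev.h>; error handling is trimmed:

                      ```c
                      /* Sketch: ask the SEV firmware for its Platform Diffie-Hellman (PDH)
                       * certificate, which a migration destination uses to authenticate
                       * the source platform. Assumes the /dev/sev ioctl interface from
                       * <linux/psp-sev.h>; error handling trimmed. Build: cc pdh.c */
                      #include <linux/psp-sev.h>
                      #include <sys/ioctl.h>
                      #include <fcntl.h>
                      #include <stdint.h>
                      #include <stdio.h>
                      #include <stdlib.h>
                      #include <unistd.h>

                      int main(void)
                      {
                          int fd = open("/dev/sev", O_RDWR);
                          if (fd < 0) { perror("open /dev/sev"); return 1; }

                          /* First call with empty buffers: the firmware rejects it but
                           * reports the buffer sizes it needs. */
                          struct sev_user_data_pdh_cert_export req = {0};
                          struct sev_issue_cmd cmd = {
                              .cmd  = SEV_PDH_CERT_EXPORT,
                              .data = (uint64_t)(uintptr_t)&req,
                          };
                          ioctl(fd, SEV_ISSUE_CMD, &cmd);

                          req.pdh_cert_address   = (uint64_t)(uintptr_t)malloc(req.pdh_cert_len);
                          req.cert_chain_address = (uint64_t)(uintptr_t)malloc(req.cert_chain_len);

                          /* Second call copies out the PDH certificate and its chain; these
                           * are what the source host would hand to the destination host
                           * before migrating a guest. */
                          if (ioctl(fd, SEV_ISSUE_CMD, &cmd) == 0)
                              printf("PDH cert: %u bytes, chain: %u bytes\n",
                                     req.pdh_cert_len, req.cert_chain_len);

                          close(fd);
                          return 0;
                      }
                      ```

                      Note that only certificates ever cross the wire; per the quoted documentation, the memory encryption key itself never leaves the firmware.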

                      Comment
