Linux's New "randomize_kstack_offset" Security Feature Having Minimal Performance Impact


  • Linux's New "randomize_kstack_offset" Security Feature Having Minimal Performance Impact

    Phoronix: Linux's New "randomize_kstack_offset" Security Feature Having Minimal Performance Impact

    Of the many new features in Linux 5.13 one of the prominent security features is the ability to randomize the kernel stack offset at each system call. With Linux 5.13 stable imminent, here are some performance benchmarks of the impact from enabling this security feature...


  • #2
    Remember, 40-odd years ago, when you learned assembly language and discovered the beauty of the Stack and the Heap? Would you have imagined that at some point randomizing the stack offset would become a necessary counter-measure, due to the total absence of built-in security in best-of-the-best languages such as C?

    I also remember the C developers telling me C was so much more powerful and that I was dumb to use Pascal, such a dinosaur language. The Borland Pascal implementation provided everything you needed to implement stuff as sophisticated as C, and just as insecure if that's what you wanted.

    All of that because C was invented in the US while Pascal came from Europe. Oh my.
    Last edited by domih; 27 June 2021, 02:26 PM.



    • #3
      Originally posted by domih View Post
      Remember, 40-odd years ago, when you learned assembly language and discovered the beauty of the Stack and the Heap? Would you have imagined that at some point randomizing the stack offset would become a necessary counter-measure, due to the total absence of built-in security in best-of-the-best languages such as C?

      I also remember the C developers telling me C was so much more powerful and that I was dumb to use Pascal, such a dinosaur language. The Borland Pascal implementation provided everything you needed to implement stuff as sophisticated as C, and just as insecure if that's what you wanted.

      All of that because C was invented in the US while Pascal came from Europe. Oh my.
      What's wrong with stuff from the US? I totally love using base-3 and base-12 mathematics when doing distance calculations, base 10 for counting, and switching over to base 16 when I get to the kitchen.



      • #4
        Originally posted by skeevy420 View Post

        What's wrong with stuff from the US? I totally love using base-3 and base-12 mathematics when doing distance calculations, base 10 for counting, and switching over to base 16 when I get to the kitchen.
        Then you'll really love whining about the Brits.



        • #5
          Originally posted by domih View Post
          Remember, 40-odd years ago, when you learned assembly language and discovered the beauty of the Stack and the Heap? Would you have imagined that at some point randomizing the stack offset would become a necessary counter-measure, due to the total absence of built-in security in best-of-the-best languages such as C?

          I also remember the C developers telling me C was so much more powerful and that I was dumb to use Pascal, such a dinosaur language. The Borland Pascal implementation provided everything you needed to implement stuff as sophisticated as C, and just as insecure if that's what you wanted.

          All of that because C was invented in the US while Pascal came from Europe. Oh my.

          One can read https://www.lysator.liu.se/c/bwk-on-pascal.html by Kernighan to see where it all started. C will give you all the power you want! Yet nowhere was there a warning that "with great power comes great responsibility". At the time, who cared about string, array, structure and pointer checking? It must be said that networking and remote execution were still in their infancy, and hackers throwing giant bowling balls into porcelain-and-glass-castle software were quite rare.

          Yep a real lack of vision, but then again C will give you all the power you want! I guess Niklaus Wirth had a little bit more vision.

          Last but not least, no warnings along the lines of "we give you a language where you are God and you can shoot yourself in the foot as many times as you want".

          It's funny to read it today: first, most of the arguments were BS, and yep, 90+% of local or remote hacks come from buffer overflows, which were of no concern in C.

          Reality quickly forced C providers to add an endless series of ad-hoc libraries to enforce string, array, structure and pointer checking. ROTFL.

          It's a battle from the past, but that's when it was decided that the Internet and remote execution would not be safe.

          Historians will look at it shaking their heads.

          The funny thing is that we have learned nothing since then: there is no language where security is really built-in. Yes, from time to time we see things such as Promise in JavaScript addressing the issue of inversion of control. But beyond buffer-overflow mitigations, I'm still looking for a language that would tell me "this piece of data in variable <name> came from the outside, treat it as the plague; the filter function has not been implemented yet or is not on this execution path; add it to be able to build a distrib version", or "the <name> class unit test has been implemented but fails to test 23 execution paths; unit tests should cover all execution paths before you can build a distrib version". Yes, you've got coverage, BUT that is not the same as being told while you are writing or modifying the code. The latter forces developers to keep security in mind and gives them no way to avoid it.

          Security in AI and IoT? Woof! With AI, the processed end result is basically dictated by the data, so how do we protect the data, even against simple bit flips? IoT is mostly done by hardware manufacturers who have no clue about software and security. Given our record with the Internet over the last 25 years, be hopeful: we will screw it up even better with AI and IoT :-)
          Last edited by domih; 27 June 2021, 03:07 PM.



          • #6
            Wow... ~15 years after Solaris.



            • #7
              Originally posted by kloczek View Post
              Wow... ~15 years after Solaris.
              Where did you read that? For example, Linux got ASLR in 2004, while Solaris only got it in 2012...
              Last edited by Volta; 27 June 2021, 04:44 PM.



              • #8
                Originally posted by domih View Post


                One can read https://www.lysator.liu.se/c/bwk-on-pascal.html by Kernighan to see where it all started. C will give you all the power you want! Yet nowhere was there a warning that "with great power comes great responsibility". At the time, who cared about string, array, structure and pointer checking? It must be said that networking and remote execution were still in their infancy, and hackers throwing giant bowling balls into porcelain-and-glass-castle software were quite rare.

                Yep a real lack of vision, but then again C will give you all the power you want! I guess Niklaus Wirth had a little bit more vision.

                Last but not least, no warnings along the lines of "we give you a language where you are God and you can shoot yourself in the foot as many times as you want".

                It's funny to read it today: first, most of the arguments were BS, and yep, 90+% of local or remote hacks come from buffer overflows, which were of no concern in C.

                Reality quickly forced C providers to add an endless series of ad-hoc libraries to enforce string, array, structure and pointer checking. ROTFL.

                It's a battle from the past, but that's when it was decided that the Internet and remote execution would not be safe.

                Historians will look at it shaking their heads.
                This might be true from today's perspective, but you cannot really blame them. That article is from 1981. At that time they had already realized that software complexity was always going to be a primary problem in computer systems. While software back then was a lot simpler than it is today, this was offset by the hardships of software development compared to today: no online forums to share best practices on, no internet or StackOverflow, no widespread version control (and nothing capable), no fancy IDEs, no documentation systems, (almost) no static analyzers inside or outside compilers, and in general very limited global know-how. Not to mention resource (RAM, CPU) constraints that would have made many of today's software security measures technically unrealistic.

                So of course they took the computer language that allowed them to solve their problems in the easiest ways possible. They couldn't possibly foresee what would become of the internet. (EDIT: At that time, there wasn't even an "internet". TCP/IP was only standardized later in the eighties.) If anyone had told their colleagues that they should choose a language with better memory-safety guarantees because in 20 years even 6-year-olds would be using computers and a global computer network would connect even highly secure company-internal servers to script kiddies in Russia, they would have been laughed at. It wasn't a lack of vision; it was simply inconceivable and unrealistic back then and there.
                Last edited by ultimA; 27 June 2021, 05:02 PM.



                • #9
                  Originally posted by domih View Post
                  The funny thing is that we have learned nothing since then: there is no language where security is really built-in. Yes, from time to time we see things such as Promise in JavaScript addressing the issue of inversion of control. But beyond buffer-overflow mitigations, I'm still looking for a language that would tell me "this piece of data in variable <name> came from the outside, treat it as the plague; the filter function has not been implemented yet or is not on this execution path; add it to be able to build a distrib version", or "the <name> class unit test has been implemented but fails to test 23 execution paths; unit tests should cover all execution paths before you can build a distrib version". Yes, you've got coverage, BUT that is not the same as being told while you are writing or modifying the code. The latter forces developers to keep security in mind and gives them no way to avoid it. Security in AI and IoT? Woof! With AI, the processed end result is basically dictated by the data, so how do we protect the data, even against simple bit flips? IoT is mostly done by hardware manufacturers who have no clue about software and security. Given our record with the Internet over the last 25 years, be hopeful: we will screw it up even better with AI and IoT :-)
                  There are always trade-offs. I think, at this point in time, Rust is the most secure thing the market will bear.

                  Using the newtype pattern (which the standard library itself uses heavily), you can get taint tagging for data. But given the goal of being a viable replacement for C and C++, including in embedded applications where you may need to craft a pointer to a memory-mapped I/O device from an integer constant, I don't think it's feasible to force more security at the language level.
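                  A minimal sketch of what that taint tagging looks like (the `Tainted`/`Clean` names and the validation rule are made up for illustration, not a standard-library API):

```rust
// Hypothetical sketch of taint tagging via the newtype pattern.
// Raw external input is wrapped in `Tainted`; only `validate` can
// produce a `Clean` value.
struct Tainted(String);
struct Clean(String);

impl Tainted {
    fn from_outside(raw: &str) -> Self {
        Tainted(raw.to_string())
    }

    // The only way to obtain a `Clean` value is to pass the filter,
    // so forgetting to validate becomes a type error.
    fn validate(self) -> Result<Clean, String> {
        if !self.0.is_empty() && self.0.chars().all(|c| c.is_ascii_alphanumeric()) {
            Ok(Clean(self.0))
        } else {
            Err(format!("rejected input: {:?}", self.0))
        }
    }
}

// Security-sensitive APIs accept only `Clean`, never `Tainted`.
fn run_query(name: &Clean) -> String {
    format!("SELECT * FROM users WHERE name = '{}'", name.0)
}

fn main() {
    let ok = Tainted::from_outside("alice").validate().unwrap();
    println!("{}", run_query(&ok));

    // Hostile input never reaches `run_query`: it fails validation.
    let bad = Tainted::from_outside("x'; DROP TABLE users; --").validate();
    println!("validation rejected: {}", bad.is_err());
}
```

                  Passing a `Tainted` (or a bare `String`) to `run_query` simply does not compile, which is the whole point: the "did you filter this?" question is answered by the type checker rather than by code review.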

                  Rust has #![forbid(unsafe_code)], which lets you confine use of unsafe to carefully prescribed locations where you can limit who is allowed to contribute and audit heavily. (That attribute irreversibly turns any use of unsafe into a hard compile-time error within the current crate if you put it in the source file that defines the crate root.) But we already see tension between people who pitch a shit-fit over others "stigmatizing" use of unsafe, and people who are overconfident in their ability to use it and wind up getting CVEs filed against their code.
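                  For the curious, the attribute in use looks like this (a toy crate, not anyone's real code):

```rust
// Placing this attribute at the crate root makes any use of `unsafe`
// anywhere in this crate a hard compile-time error, and unlike
// `#![deny(unsafe_code)]` it cannot be relaxed back with `#[allow]`.
#![forbid(unsafe_code)]

fn main() {
    // Ordinary safe code is unaffected.
    let v = vec![1, 2, 3];
    println!("{}", v.iter().sum::<i32>());

    // Uncommenting the lines below would fail to build, which is the
    // point: reviewers know `unsafe` cannot sneak into this crate.
    // let p = &v[0] as *const i32;
    // unsafe { println!("{}", *p); }
}
```

                  The usual pattern is to forbid unsafe in most crates of a workspace and isolate the unavoidable unsafe code in one small, heavily audited crate.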

                  As for the rest, like forced test coverage, that's basically something you have to construct yourself on your CI server. (And, arguably, that's the right thing, because, when you've got a strong static type system, blindly requiring tests without being able to prove that they actually test the invariants the type system can't prove is of questionable value.)



                  • #10
                    Originally posted by domih View Post
                    ... I'm still looking for a language that would tell me "this piece of data in variable <name> came from the outside, treat it as the plague" ...
                    I think you must have missed Perl, which has the "taint" option. With "taint mode" enabled, Perl marks any values that come from outside the program as tainted. They must be passed through a regex before they can be safely used. These regex checks are supposed to validate that it is a proper value.

