
Torvalds Voices Thoughts On Linux Mitigating Unexpected Arithmetic Overflows/Underflows


  • #81
    Originally posted by F.Ultra View Post
    Actually there is, doing a proper check for if a signed integer would overflow in C is very complicated to write safely. There would be much benefits to having some form of compiler or libc helper here. Most if not all security bugs from handling input or media is due to improper length handling (which includes overflow).

    edit: in fact so much benefits that apparantly GCC already have a whole range of them: https://gcc.gnu.org/onlinedocs/gcc/I...-Builtins.html
    And this is precisely what you have to do if the language standard is lacking: built-in functions and/or compiler directives. Standards maintainers might even look at what fix-ups compiler writers are using and incorporate them in later editions of standards. The trouble is, the built-ins or directives are less portable than the language itself, because you are generally reliant on a particular compiler implementations - which might be different, or not supported across compilers.
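    A minimal sketch of how those GCC/Clang builtins are used (the wrapper name checked_add is mine, purely for illustration):

    ```c
    #include <limits.h>
    #include <stdbool.h>

    /* __builtin_add_overflow is a GCC/Clang extension: it performs the
     * addition, stores the (possibly wrapped) result in *sum, and
     * returns true if the mathematical result did not fit in the
     * destination type -- all without invoking undefined behavior. */
    bool checked_add(int a, int b, int *sum) {
        return __builtin_add_overflow(a, b, sum);
    }
    ```

    GCC provides __builtin_sub_overflow and __builtin_mul_overflow with the same shape, which is why the post above calls them "a whole range".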

    Comment


    • #82

      Originally posted by ssokolow View Post
      That's an illusion.
      1. The Linux kernel isn't written in ANSI C and never has been. It's written in GNU C and half the effort to compile Linux using LLVM Clang was extending LLVM Clang to implement the relevant portions of GNU C.
      Not really. What I am saying is that almost all platforms (including embedded stuff such as handheld PoS terminals) have some flavor of the gcc compiler and hence can support building the Linux kernel without additional dependencies (only the vendor's SDK and runtime libs are needed).

      Originally posted by Jakobson View Post
      Okay... How should undefined behaviors be handled then?
      Umm... just avoid writing code which exhibits UB? It's undefined behavior for a reason.

      Originally posted by Jakobson View Post
      They are certainly not defined in the C standard.
      Take as an example typical "use after free".

      If you access the pointer variable p after calling free(p), different CPU architectures will behave differently -- on x86 you will be able to dereference p and read the memory it points to, unless it was paged out, in which case you will get an access violation. On some other CPU architectures, even accessing the p variable itself, without dereferencing it to reach the memory it points to, can trigger a trap.

      The C standard says that after a call to free(p) the value of p itself is indeterminate, exactly because it can't guarantee that p hasn't been changed into a magic value that will cause a trap when you try to read it.
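      A tiny sketch of the defensive idiom that sidesteps this particular UB: overwrite the pointer with NULL right after freeing it, so any later read of p sees a well-defined value instead of an indeterminate one.

      ```c
      #include <stdlib.h>

      int main(void) {
          int *p = malloc(sizeof *p);
          if (p == NULL)
              return 1;
          *p = 42;
          free(p);
          /* The value of p is now indeterminate; even reading it (not
           * just dereferencing it) is undefined per the standard.
           * Assigning NULL makes any later read well-defined. */
          p = NULL;
          return (p == NULL) ? 0 : 1;
      }
      ```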

      Originally posted by Jakobson View Post
      How can they be detected and prevented from being written into the OS?
      Get familiar with the C standard and avoid writing code that relies on any form of UB specified there. It's really that simple.

      Originally posted by darkonix View Post
      The problem is that programmers don't need to pass an exam on the standard they use to be programmers.
      That problem won't be solved by adding more guardrails and training wheels -- it can only be solved if software engineering starts being treated like any other form of civil engineering with licenses and criminal penalties if you cause damage through negligence.

      Originally posted by darkonix View Post
      My 12 years old nephew learned C on his own to program an arduino. I don't expect him to be accepted as a kernel programmer anytime soon but I don't expect him to know these rules.
      Your nephew can also learn to play, say, electric bass as a beginner and play some simple bass lines. But if he wants to play with a professional band he also has to learn all the scales, progressions, grooves, and techniques such as fingering, slapping, bending, etc. My point is -- you can dabble in C just fine without ever getting to kernel-level code. The problem here is that some people think they can learn it quickly and, worse yet, believe they are entitled to try working on it.

      Originally posted by darkonix View Post
      Having rules may be good, but not enough. Isn't it better to automate repetitive, error prone work like ensuring that those rules are followed correctly?
      MISRA C rules have been supported by most linters and static analyzers since the 90s. The tools are there for those who think they need them. You can't burden everyone because you feel insecure.

      Comment


      • #83
        Originally posted by F.Ultra View Post

        Actually there is, doing a proper check for if a signed integer would overflow in C is very complicated to write safely.
        Yeah, you cannot check for overflow after the fact, since it is undefined behavior and compilers are free to optimize based on the assumption that signed overflow never happens (and are thus also allowed to remove any checks you wrote for it, because they can "never" happen). So you have to detect it _before_ doing the operation that could overflow, without overflowing... Not very reasonable, so use unsigned integers, compiler extensions, or write it in assembler.
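        For what it's worth, the before-the-fact check can be written in portable, standard C, if a bit awkwardly (the helper name is illustrative):

        ```c
        #include <limits.h>
        #include <stdbool.h>

        /* Returns true if a + b would overflow int, checked entirely
         * before the addition, so no signed overflow (UB) ever occurs. */
        bool add_would_overflow(int a, int b) {
            if (b > 0 && a > INT_MAX - b) return true;  /* would exceed INT_MAX */
            if (b < 0 && a < INT_MIN - b) return true;  /* would go below INT_MIN */
            return false;
        }
        ```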

        Comment


        • #84
          Originally posted by Muddy View Post
          As long as you are approaching this from a "this puts the onus on others", *YOU* are the problem.
          Be the solution, not the problem.​​
          That's just good advice for any job, very much so for us in IT/Technology.
          Yeah - if you want to be replaced in 1-2 years after creating "perfect automation for production".
          Only people who create barely working programs full of empty-placeholder "promises" and lots and lots of annoying bugs - only those people keep their jobs.

          Comment


          • #85
            Linus sure got the issue in his crosshairs.
            Almost seems as if Linus is saying legacy code works like this so we need to keep it that way.
            Programming languages should adopt the best design, not do something a lot of legacy code has been written in.
            Look at C's and C++'s adoption of operator precedence.

            I'd prefer saturation arithmetic by default, plus some extra operators and math functions with wraparound built in, rather than relying primarily on separate helper functions or function overloading.
            Some stuff that's compact and easy to mentally work with (low cognitive load).
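            For illustration, saturating-add semantics sketched in plain C (not any existing API -- just what "clamp instead of wrap" would mean):

            ```c
            #include <limits.h>

            /* Saturating add: instead of wrapping or trapping, results
             * clamp to INT_MAX on overflow and INT_MIN on underflow.
             * Both checks run before the addition, so no UB occurs. */
            int sat_add(int a, int b) {
                if (b > 0 && a > INT_MAX - b) return INT_MAX;  /* clamp high */
                if (b < 0 && a < INT_MIN - b) return INT_MIN;  /* clamp low  */
                return a + b;
            }
            ```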

            Comment


            • #86
              Originally posted by CmdrShepard View Post
              C already has plenty of rules on hardening -- MISRA C being the most prominent. Following those is good enough (and has been good enough for the medical, automotive, and aerospace industries where lives are at stake) -- the trouble is that young people don't want to read either the standard or the guidelines but instead want to change the standard to suit their narrative.
              Truly, I say to you that all MISRA achieves is to highlight how problematic C is.

              "All C programs contain undefined behaviour

              Ex: 2-line program in Appendix I of MISRA C:2012"

              Last edited by kevlar700; 14 May 2024, 04:40 PM.

              Comment


              • #87
                Originally posted by F.Ultra View Post

                Had Linux been written in Ada then Linux would have died on 25 August 1991. Half thinking that you are making some form of joke here, considering that an unhandled exception due to a signed integer overflow was the reason for the Ariane 5 crash, resulting in a loss of more than $370M ($736M in today's worth).
                So the overflow caused an exception, and SPARK could have found it at compile time. Ariane 5 happened because code copied from the previous rocket overrode the protections.

                Ada is a better language for the Linux kernel than C or Rust will likely ever be. You can make up whatever irrelevant arguments you like.

                Comment


                • #88
                  Originally posted by carewolf View Post

                  Yeah, you cannot check for overflow after the fact, since it is undefined behavior and compilers are free to optimize based on the assumption that signed overflow never happens (and are thus also allowed to remove any checks you wrote for it, because they can "never" happen). So you have to detect it _before_ doing the operation that could overflow, without overflowing... Not very reasonable, so use unsigned integers, compiler extensions, or write it in assembler.
                  Or use a data type twice the size of the operands, and you'll be sure they won't overflow in any case.
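                  A sketch of that widening approach for 32-bit operands (the helper name is mine): the intermediate sum is computed in int64_t, which cannot overflow for any pair of int32_t inputs, and the range check happens safely afterwards.

                  ```c
                  #include <stdint.h>

                  /* Add two int32_t values using a 64-bit intermediate.
                   * Sets *overflowed and returns 0 if the true sum does
                   * not fit in int32_t; otherwise returns the sum. */
                  int32_t checked_add32(int32_t a, int32_t b, int *overflowed) {
                      int64_t wide = (int64_t)a + (int64_t)b;  /* cannot overflow int64_t */
                      if (wide > INT32_MAX || wide < INT32_MIN) {
                          *overflowed = 1;
                          return 0;
                      }
                      *overflowed = 0;
                      return (int32_t)wide;
                  }
                  ```

                  This trick obviously stops at the widest type available, which is why it doesn't help for 64-bit operands.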

                  Comment


                  • #89
                    Originally posted by kevlar700 View Post
                    Ada is a better language for the Linux kernel than C or Rust will likely ever be. You can make up whatever irrelevant arguments you like.
                    I'd say people should read this comment and make up their own minds.

                    Having extensively used both Rust and Ada, I find it difficult to see them as overlapping or fighting for the same niche. The two languages/communities put focus on very different things:

                    [...]

                    Comment


                    • #90
                      Originally posted by ssokolow View Post

                      I'd say people should read this comment and make up their own minds.


                      I hope they do by trying Ada.

                      An interactive learning platform to teach the Ada and SPARK programming languages.


                      Honestly, he does not seem to know Ada very well at all. It is quite a terrible and misleading post. First, Ada is not targeted at any niche at all but was specified by the D.O.D. to replace all of the over 450 languages they had in use. Ada was also specified and designed for embedded and hardware use, and so is far better at that, as well as at network protocols, than C or Rust. "Drowning in type information noise" is simply nonsense. Ada has been demonstrated a number of times to meet its other primary design goals of readability and maintainability, with reduced project-lifetime costs compared to C and Java.



                      He doesn't even seem to understand that one option is the Ada light runtime, which runs on basically any chip without effort and still provides most of Ada's features.
                      Last edited by kevlar700; 14 May 2024, 08:19 PM.

                      Comment
