University Banned From Contributing To Linux Kernel For Intentionally Inserting Bugs


  • #41
    Originally posted by chuckula View Post
    Looks like this university should stick to parsing technical words in source code that they don't understand to find "offensive" terms and scream about them for SJWs because I'm sure the internals of the Linux kernel are the only source of repression in inner city Minneapolis.

    If universities still cared about actual ethics beyond empty virtue signalling, then these "researchers" should be up on academic charges with penalties up to and including expulsion from the school. But ethics don't seem to be very important anymore.
    Can you imagine the shitstorm heading GKH's way if some of the banned researchers are in some sort of minority...


    • #42
      Originally posted by User42 View Post
So you mean the point of the kernel is being exploitable and exploited? They published a first paper about how they could get "malicious" code in there. Again, for the 4th time, I really think they screwed up, as there are other ways of handling things nicely. But if anything, they've shown the kernel has little to no resistance to "malicious" code, which anybody who is not bragging about it (i.e. publishing about it) could easily insert without being caught.
...what? How the hell does your train of thought lead you to think that's what I meant? Just because you shouldn't be screwing with something mission-critical in production, that doesn't mean it should be exploitable and exploited. That's like saying "if you don't like chocolate, you must like vanilla". Again: this paper could have been written on a different project that isn't so important. Try Haiku or Minix for example, where you're not threatening anyone's business, compromising integrity, or wasting the time of people who don't have much of it.
Half the reason this commit even went through is the sheer volume of commits. This is a monolithic kernel - it ain't simple stuff. It isn't reasonably possible to thoroughly check every single thing submitted for safety and validity.

      A much better experiment would be to propose an actual solution to preventing malicious code from being submitted, and then actually test if that system works while informing people like Greg about what's going to happen.
      It's precisely because people depend on it that it is important to review the review process, study it and improve it. If backdoors are revealed a couple times in the kernel, those people depending on it won't for long. It's just like saying "let's not try to break AES because people depend on it".
      The gist of the article was open-source projects in general, not specifically Linux. If the concept of this paper is so ubiquitous (which granted, it probably is) then no, you don't actually need to screw with something people depend on. There are plenty of large open-source projects you can meddle with and not cause a major freakout.
      I'm not even that against the concept of the paper. The problem is what they decided to target.
      As for your AES example, if the goal is to break it as it is, that's fine, because that's how you prove it needs updates. That's not an apples to apples comparison though, because if it were then the AES source code would have to be meddled with.


      • #43
        Originally posted by Setif View Post
Good research paper; it exposes a lot of issues in large open-source projects.
Instead of focusing on their deeds, they should focus on resolving those issues by introducing strict new policies for contributing code to a very sensitive project like the Linux kernel.
        I agree. And new tools to avoid such malicious contributions.

Everyone will win, except Minnesota.

        I'm sorry to say, but...

        Who trusts Minnesota?


        • #44
          Originally posted by User42 View Post
          It's just like saying "let's not try to break AES because people depend on it".
          There is this thing called responsible disclosure. There are guidelines to follow if you find an exploitable problem and want to take part in correcting it. These guys didn't do that and for that they deserve adequate punishment. FTR, they didn't get banned for their research paper. They got banned because they continued to submit potentially harmful code even *after* they published their paper.


          • #45
            Originally posted by schmidtbag View Post
            Again: this paper could have been written on a different project that isn't so important.
I understand what you mean (and I'm sorry for misunderstanding what you were saying). Though, if they chose Minix or any other project, I would have bet you $100 somebody would have said something along the lines of the following within the first page of comments: "those numbers are really high though, we are lucky the Linux kernel gets much much more visibility so the numbers there must be much lower... we are talking about Minix after all" (and it would have been repeated by dozens of people afterwards). The point was precisely to show what it is like in such a wide project with so much scrutiny. Again (6th time, not saying that for you specifically), I'm not implying this has been handled in the best way possible (actually far from it). My point is that studying the kernel itself, if done well, is still going to "waste" a bit of people's time on the spot but hopefully will give very important insight to make the process more resilient. The paper I (and another guy) cited a bit earlier is definitely going that way.

            Originally posted by schmidtbag View Post
            A much better experiment would be to propose an actual solution to preventing malicious code from being submitted, and then actually test if that system works while informing people like Greg about what's going to happen.
Well, there are some necessary measurements that must be done. Going in completely blind and suggesting things is nice, but the shorter route definitely is to probe the state of kernel review directly. Though I 100% agree that contacting Greg, not just to notify him but to define exactly what was going to be done and to ensure those bugs never hit production, was the absolute minimum they should have done.

            There is no freakout if the "contract" with the few people in the know (from the kernel side) is well defined with actual deadlines for removal.


            • #46
In the name of research, a person has introduced, or frankly verified, that malicious commits can indeed be introduced without much hassle into open-source projects (I mean projects where the source code is available to view without any restrictions and issues are resolved with dedication). I wonder how the open-source community will react at the 42nd IEEE Symposium on Security and Privacy (Oakland '21), a virtual conference in May 2021.


              • #47
                Originally posted by MadCatX View Post
                There is this thing called responsible disclosure.
How is that related to the subject? Responsible disclosure is when you find a bug. The point of that study (7th time: an ethically questionable one, as conducted) was to see how easy it is to introduce bugs into open-source projects. With the kernel receiving so much scrutiny, it was probably considered "fair" and "representative" of what a project with high review standards looks like (just an assumption on my part).

                Originally posted by MadCatX View Post
                potentially harmful
No, that's not really why things happened, but anyway...


                • #48
Did anybody even bother to read a little deeper, or is everyone just jumping on the 'bad bad bad' hype?

                  A number of reviewers are recommending leaving some of the commits untouched because they introduced no issues, or otherwise ended up fixing issues for real.

If a bunch of 'malicious' commits made by a professor and a bunch of undergrad students fix problems in the kernel, it's blatant proof of how fragile and messed up the kernel has become.


                  • #49
                    Originally posted by om26er View Post
                    Now imagine if the same contributions were made by a Chinese university...
You would not find a Chinese university doing this kind of research unless they were nuts. You would not just have an annoyed Greg there. Intentionally adding flaws that could become backdoors in software that Chinese government departments depend on would, in China, mean prosecution for treason. So in the Chinese version of this case, instead of Greg getting annoyed enough to ban a university, someone would be going to jail, possibly facing execution, on a treason charge.

This is important to be aware of with international students studying abroad: at times they will take the chance to do something their home countries' universities would never allow, because it is illegal in their home countries.

The big question is whether this research was NSA-backed; if not, it is arguably treason as well. The suggestion that Greg or the kernel community should merely bring it up with a board of ethics is wrong. Intentional sabotage is illegal even in the USA.
Sabotage is the act of hampering, deliberately subverting, or hurting the efforts of another. It is most often an issue in the context of military law, when a person attempts to thwart a war effort.


Basically, if the university's board of ethics signed off on this research, they need to be prosecuted. For research on fault detection by the open-source community to be legal, you would need the cooperation of the maintainer of that section of the Linux kernel, to make sure your test commits never made it to mainline and so never amounted to intentional sabotage.

The behaviour is not just annoying, it's illegal in most countries, so it should never pass an ethics board.


                    • #50
                      Originally posted by User42 View Post
I understand what you mean (and I'm sorry for misunderstanding what you were saying). Though, if they chose Minix or any other project, I would have bet you $100 somebody would have said something along the lines of the following within the first page of comments: "those numbers are really high though, we are lucky the Linux kernel gets much much more visibility so the numbers there must be much lower... we are talking about Minix after all" (and it would have been repeated by dozens of people afterwards). The point was precisely to show what it is like in such a wide project with so much scrutiny. Again (6th time, not saying that for you specifically), I'm not implying this has been handled in the best way possible (actually far from it). My point is that studying the kernel itself, if done well, is still going to "waste" a bit of people's time on the spot but hopefully will give very important insight to make the process more resilient. The paper I (and another guy) cited a bit earlier is definitely going that way.
                      I'm sure your bet would be paid too, but visibility is irrelevant. Think of it like this:
If you wore an offensive shirt while part of an audience in a stadium of 100k people, nobody is going to notice you except the people who let you in and anyone you're sitting next to. You're in a public space where not only 100k people could be looking at you, but cameras with millions of other people could be looking at you too. And yet, nobody will notice, because you're just a small blip, and more importantly: you're not the reason people are there. The Linux kernel is no different. There are millions of eyes on the kernel but there are so many more millions of lines of code, most of which nobody has any reason to pay attention to. Your commits are seen by a handful of people and anyone who needs to directly work with them. Otherwise, they're effectively invisible and could remain that way.
With Minix, it's more like a stadium at a local high school, where maybe only a few hundred people show up. You're more likely to be spotted in the crowd, but you also don't need to interact with as many people to get a seat, making you less noticed. And, you're still not the reason why people are there.
                      But in either case, you could still opt for a bigger [than Minix] open-source project that isn't mission-critical, such as Blender or LibreOffice. You're not compromising a business' integrity by meddling with those, and they're easy to downgrade if necessary.
                      So whether you go big or small, the premise of the paper still applies. Therefore, you can still gain the insight you speak of without causing a problem or false alarm.
                      Going completely in the dark and suggesting things is nice but the shorter route definitely is to probe the state of kernel review directly.
                      When we're talking mission-critical software, you don't take the shortest route, you take the route of least downtime and loss, even if theoretical. As I said before, it doesn't matter what the intention was, you just don't experiment on production releases of major products.
                      There is no freakout if the "contract" with the few people in the know (from the kernel side) is well defined with actual deadlines for removal.
                      Agreed.
                      Last edited by schmidtbag; 21 April 2021, 11:31 AM.
