Originally posted by chuckula
University Banned From Contributing To Linux Kernel For Intentionally Inserting Bugs
Originally posted by User42: So you mean the point of the kernel is to be exploitable and exploited? They published a first paper about how they could get "malicious" code in there. Again, for the fourth time, I really think they screwed up, as there are other ways of handling things nicely. But if anything, they've shown the kernel has little to no resistance to "malicious" code, which anybody who is not bragging about it (i.e. publishing about it) could easily insert without being caught.
Half the reason this commit even went through is the sheer volume of commits. This is a monolithic kernel; it ain't simple stuff. It isn't reasonably possible to thoroughly check every single submission for safety and validity.
A much better experiment would be to propose an actual solution to preventing malicious code from being submitted, and then actually test if that system works while informing people like Greg about what's going to happen.
It's precisely because people depend on it that it is important to review the review process, study it, and improve it. If backdoors are revealed in the kernel a couple of times, those people depending on it won't depend on it for long. It's just like saying "let's not try to break AES because people depend on it".
I'm not even that against the concept of the paper. The problem is what they decided to target.
As for your AES example, if the goal is to break it as it is, that's fine, because that's how you prove it needs updates. That's not an apples to apples comparison though, because if it were then the AES source code would have to be meddled with.
-
Originally posted by Setif: Good research paper; it exposes a lot of issues in large open-source projects.
Instead of focusing on their deeds, they should focus on resolving those issues by introducing new, strict policies for contributing code to a very sensitive project like the Linux kernel.
Everyone will win, except Minnesota.
I'm sorry to say, but...
Who trusts Minnesota?
-
Originally posted by User42: It's just like saying "let's not try to break AES because people depend on it".
-
Originally posted by schmidtbag: Again: this paper could have been written on a different project that isn't so important.
Originally posted by schmidtbag: A much better experiment would be to propose an actual solution to preventing malicious code from being submitted, and then actually test whether that system works while informing people like Greg about what's going to happen.
There is no freakout if the "contract" with the few people in the know (from the kernel side) is well defined with actual deadlines for removal.
-
In the name of research, a person has introduced, or frankly verified, that malicious commits can indeed be introduced without much hassle into open-source projects (I mean projects where the source code is available to view without any restrictions and issues are resolved through dedication). I wonder how the open-source community will react at the 42nd IEEE Symposium on Security and Privacy (Oakland '21), a virtual conference in May 2021.
-
Originally posted by MadCatX: There is this thing called responsible disclosure.
Originally posted by MadCatX: potentially harmful
-
Did anybody even bother to read a little deeper, or is everyone just jumping on the 'bad, bad, bad' hype?
A number of reviewers are recommending leaving some of the commits untouched because they introduced no issues, or even ended up fixing real ones.
If a bunch of 'malicious' commits made by a professor and a handful of undergrad students fix problems in the kernel, that's blatant proof of how fragile and messed up the kernel has become.
-
Originally posted by om26er: Now imagine if the same contributions were made by a Chinese university...
This is important to be aware of when you have international students studying abroad: at times they will take the chance to do something their home universities would not allow at all, because it's illegal in their home countries.
The big question is whether this research was NSA-backed; if not, it's legally treason as well. The notion here that Greg or the kernel community should merely bring it up with a board of ethics is wrong. Intentional sabotage, even under US law, is illegal.
Sabotage is the act of hampering, deliberately subverting, or hurting the efforts of another. It is most often an issue in the context of military law, when a person attempts to thwart a war effort.
Basically, if the university's board of ethics signed off on this research, they need to be prosecuted. For research on fault detection by the open-source community to be legal, you would need the cooperation of the maintainer of that section of the Linux kernel, to make sure your test commits never made it into mainline, so there was never intentional sabotage.
This behaviour is not just annoying; it's illegal in most countries, so it should never pass an ethics board.
-
Originally posted by User42: I understand what you mean (and I'm sorry for misunderstanding what you were saying). Though, if they had chosen Minix or any other project, I would have bet you $100 somebody would have said something along the lines of the following within the first page of comments: "those numbers are really high though, we are lucky the Linux kernel gets much, much more visibility, so the numbers there must be much lower... we are talking about Minix after all" (and it would have been repeated by dozens of people afterwards). The point was precisely to show what it is like in such a wide project with so much scrutiny. Again (sixth time, not saying that for you specifically), I'm not implying this was handled in the best way possible (actually far from it). My point is that studying the kernel itself, if done well, is still going to "waste" a bit of people's time on the spot, but hopefully will give very important insight to make the process more resilient. The paper I (and another guy) cited a bit earlier is definitely going that way.
If you wore an offensive shirt while part of the audience in a stadium of 100k people, nobody would notice you except the people who let you in and anyone you're sitting next to. You're in a public space where not only could 100k people be looking at you, but cameras with millions of other viewers could be too. And yet, nobody will notice, because you're just a small blip, and more importantly: you're not the reason people are there. The Linux kernel is no different. There are millions of eyes on the kernel, but there are so many more millions of lines of code, most of which nobody has any reason to pay attention to. Your commits are seen by a handful of people and anyone who needs to work with them directly. Otherwise, they're effectively invisible and could remain that way.
With Minix, it's more like a stadium at a local high school, where maybe only a few hundred people show up. You're more likely to be spotted in the crowd, but you also don't need to interact with as many people to get a seat, making you less noticed. And you're still not the reason people are there.
But in either case, you could still opt for a bigger [than Minix] open-source project that isn't mission-critical, such as Blender or LibreOffice. You're not compromising a business' integrity by meddling with those, and they're easy to downgrade if necessary.
So whether you go big or small, the premise of the paper still applies. Therefore, you can still gain the insight you speak of without causing a problem or false alarm.
Going in completely blind and suggesting things is nice, but the shorter route is definitely to probe the state of kernel review directly.
There is no freakout if the "contract" with the few people in the know (from the kernel side) is well defined, with actual deadlines for removal.
Last edited by schmidtbag; 21 April 2021, 11:31 AM.