New Linux Kernel Vulnerability Exploited
Originally posted by Sergio:
These kinds of things... History has taught us again and again that software is inherently buggy (insecure?); there are simply too many 'variables' that it is virtually impossible to escape this reality. It doesn't matter how much effort is put into design, it doesn't matter whether it is Linux, Windows, Solaris, BSD, MINIX, Plan 9, AIX, or MULTICS, and it doesn't matter if it is 'direct' or managed code...
I think that shifting away from this (apparently) natural property of software in general requires something radical and essentially new. I hope to be able to see such a thing materialize.
Originally posted by GreatEmerald:
There already is something radical and new like that. It's called Hardened Gentoo, or the PaX kernel sources to be specific, which is what guards against buffer overflows like this one. That is, buffer overflows still happen due to poor code, but attackers can't use them, since they can't tell where the code they want executed is located, as it is constantly randomised in memory. It does come at the cost of some overhead, though.

When I said that something radical was needed, I was thinking more of a complete shift in the way we create computer programs (to give an example, functional instead of imperative systems programming).
Also, let's not forget that closed code can't be audited by outsiders. Open code tends to have its flaws documented to a much greater extent than closed code does. Turnaround time for flaws in open code is also much faster; for closed code, it can often take years.
Last edited by duby229; 15 May 2013, 06:56 PM.
Originally posted by Sergio:
When I said that something radical was needed, I was thinking more of a complete shift in the way we create computer programs (to give an example, functional instead of imperative systems programming).
I would say the software development process itself is a big part of it, since it determines the rate at which fixes come out and the frequency of regressions. There are still a lot of people out there who don't use a DVCS and couldn't write a unit test to save their lives. Just figure out what Java is doing, and then do the opposite.
A lot of it is also related to network security.
Other ideas:
Micro-Kernels and modularity (stability and resilience, at the cost of overhead)
P2P package repositories (eliminate vulnerable centralized servers, and speed downloads)
P2P DNS servers (ditto)
512-bit encryption, or higher (always going to be an arms race)
Quantum networks (theoretical still)
Static Analysis (should get better with advanced AI)
Browsers are also phasing out plugins as web standards improve. This will slow the spread of malware, but stupid people are always going to be around with 30 toolbars installed, acting as hosts for every kind of botnet.
Last edited by EmbraceUnity; 15 May 2013, 09:47 PM.
Originally posted by Sergio:
Of course; just as Microsoft's people are paid to look at Windows, or Oracle's at Solaris. Sure, the code is there and everyone can debug it, but if only people being paid are looking at it, how does it differ from the situation in the closed-source model?
Originally posted by Sergio:
Of course; just as Microsoft's people are paid to look at Windows, or Oracle's at Solaris. Sure, the code is there and everyone can debug it, but if only people being paid are looking at it, how does it differ from the situation in the closed-source model?
In the closed-source world, you can't just go around exploring things and trying to change things. That's basically what they say. I would guess that part of the reason they release bug fixes so late is not that they find the bugs themselves, but that they are told to search for them and only fix what is found.
In open source, EVERYBODY can look and poke. Everybody can submit a patch. People want to fix things to make things better. If you want to change something radically and do the work, it usually gets accepted (assuming it all makes sense, etc.).
Then there's academic review, with teachers using open source as a teaching tool; someone might bump into something there. And there are hobbyists wanting to learn and looking around.
All in all, it's safe to say that many eyes can't be bad.
Originally posted by EmbraceUnity:
512-bit encryption, or higher (always going to be an arms race).

It won't be an arms race forever: http://en.wikipedia.org/wiki/Limits_to_computation
Originally posted by LightBit:
We already have 512-bit encryption (RC4 1684-bit, RC6 2040-bit, Threefish 1024-bit, HPC 16384-bit), but nobody uses good and long keys.
It won't be an arms race forever: http://en.wikipedia.org/wiki/Limits_to_computation