This bug wouldn't have been found if it were proprietary. I wonder how many critical bugs are present in Windows, OS X or some other proprietary crap? Nobody knows.
Libgcrypt/GnuPG Hit By Critical Security Problem Since 1998
-
Originally posted by Sonadow:
Wonderful, an 18 year old vulnerability.
All the open source eyes in the world, and not a single one managed to catch this one until today.
Just have a look at the majority of major exploits and you will see their origin: nearly always in an API's deepest corners (usually implemented in C), well hidden within macro or pointer chaos.
So instead of piling feature on top of feature in this absurd release-new-features-to-impress-our-investors race, we should realize we need a stabler system with safer code that is well tested and extensively QA'd.
I would rather have a program released every two years with a limited set of bugs well known to its developers than one released every two months and full of silly yet dangerous bugs that are difficult to reproduce across platforms.
To be clear, this is just my own opinion and should not be taken as gospel.
-
Originally posted by Sonadow:
Wonderful, an 18 year old vulnerability.
All the open source eyes in the world, and not a single one managed to catch this one until today.
"An attacker who obtains 4640 bits from the RNG can trivially predict the next 160 bits of output"
Logic bugs are bitches to find in general; logic bugs in crypto code are the rockstar queen bitches of bugs, since they need GOOD cryptanalysts and a shitton of luck to be detected.
-
Originally posted by stephen82:
So instead of piling feature on top of feature in this absurd release-new-features-to-impress-our-investors race, we should realize we need a stabler system with safer code that is well tested and extensively QA'd.
Marketing types (standard company leadership) don't know and don't care, and sell the product to other marketing types that don't know and don't care.
-
RNG Whitening Bug Weakened All Versions of GPG http://qntra.net/2016/08/rng-whiteni...rsions-of-gpg/
"In effect, this means that no key created with GPG to date carries more than 580 bytes of effective entropy (e.g., all 4096-bit and above RSA keys have 'subkeys' which – we now find – mathematically relate, in a possibly-exploitable way, to the primary key.) ... And thus we find that, due to the staggeringly-braindamaged design of the protocol and of this implementation, GPG users who elected to use longer-than-default GPG keys ended up with smaller-than-default effective cryptographic strength."
-
Oh, the Phuctor guy. Ignore him; he has a history of spreading FUD about PGP being horribly broken and of accusing anyone who corrects him of being part of some elaborate cover-up (and in at least one case, of threatening to contact the university of the person who did so). If you're not familiar with Phuctor, it's a project analysing PGP keys for shared factors. He keeps releasing blog posts claiming to have broken a bunch of high-profile Debian and Linux developers' keys, when he's actually broken corrupted versions of their keys which are unusable because the self-signatures are broken. If pushed on it, he'll argue that some of the keys have valid self-signatures - the ones for [email protected] and the copy of the long-broken TI calculator keys that some joker uploaded no doubt do, but they're hardly interesting.
In this case, he appears to be misreading the announcement, which says that someone who sees the first 4640 bits of RNG output can predict the next 160 bits of output. Not every single output bit from then on - which is what would be necessary for his claim to be true - just those 160 bits. There's an explanation of exactly why this happens here: http://formal.iti.kit.edu/~klebanov/...-2016-6313.pdf In short, the mixing function incorrectly replaces each block with a hash of a part of the buffer that doesn't include that block, so the last block processed can be predicted by anyone who knows the entire rest of the buffer.
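The failure mode described above can be illustrated with a toy model. This is a deliberately simplified sketch, NOT Libgcrypt's actual mixing code: it models a hash-based pool whose mixing step never feeds the fresh block back into the hash input, so the last block becomes a pure function of output the attacker has already observed. The block size (20 bytes, SHA-1) and pool size (600 bytes, i.e. 4640 observed bits predicting the next 160) are borrowed from the advisory; everything else is made up for illustration.

```python
import hashlib
import os

BLOCK = 20    # SHA-1 digest size, as in the real mixing function
NBLOCKS = 30  # toy pool of 30 x 20 = 600 bytes, like Libgcrypt's

def flawed_mix(pool):
    """Toy model of the CVE-2016-6313 flaw (not Libgcrypt's real code).

    For all but the first blocks, the hash input window consists only of
    blocks that have already been mixed - the block being replaced is
    never part of its own hash input.  Since mixed blocks are what the
    RNG hands out, the final block is computable from earlier output."""
    new = []
    for i in range(NBLOCKS):
        if i < 2:
            data = pool[i] + pool[(i + 1) % NBLOCKS]  # fresh pool material
        else:
            # modelled bug: input drawn only from already-mixed blocks
            data = new[i - 1] + new[i - 2]
        new.append(hashlib.sha1(data).digest())
    return new

pool = [os.urandom(BLOCK) for _ in range(NBLOCKS)]
out = flawed_mix(pool)

# Attacker observes the first 29 blocks (580 bytes = 4640 bits) of output...
seen = out[:-1]

# ...and predicts the next 160 bits with no knowledge of the secret pool:
predicted = hashlib.sha1(seen[-1] + seen[-2]).digest()
assert predicted == out[-1]
```

Note that this also shows why the bug is not "all keys are broken": only the final block of each mixing pass is predictable, not every bit of output from then on.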