
FSF Talks Up Libreboot As New Coreboot Downstream


  • #16
    Originally posted by rstat1 View Post
    If you have something that "important", it shouldn't be in a position to be accessed by anyone who doesn't have physical access to the machine(s) it's on. If some gov. org wanted your crap that badly, open source firmware is not gonna stop them from getting it.
    Touché. I have to agree with that; it's not a very convincing example. But:

    Originally posted by rstat1 View Post
    So whoever said this was "free software fundamentalism" was perfectly right IMO, as open source != perfectly secure, as some of you people think it does. See the recent OpenSSL debacle for proof.
    Well, with that, though, I have to disagree. Where I was going with my argument is that it is *not* just about "free software fundamentalism". It's not just an ideological "oh look, my shiny free software stack"; it has practical implications. OpenSSL is a good example: yes, you're totally right, open source != perfectly secure. But again, if OpenSSL were a closed-source library everybody used, nobody could have fixed their systems after the bug was found unless the vendor released a patch, which the vendor might or might not do. And now there are people auditing the code more thoroughly; you can do that in the open. And anybody who really needs a secure SSL connection can pay people they trust to do it right.

    There will always be bugs and nothing is 100% secure, but having the code gives the users the means to deal with them as diligently as they want to.


    • #17
      It's good enough if you can force the NSA to use an exploit worth more than you are

      Originally posted by emerge-e-world View Post
      That is true, but at least you _can_ spend time and money to audit software, if you have access to the source code. If it's blob, you can just trust your vendor.

      You probably won't spend time and money on your media center in your living room or your gaming rig. But let's say, for example, you are a company developing bleeding-edge somethings you need to keep a trade secret for the time being. If it's a big enough deal, you'd probably be willing to spend money to make sure your system is secure from certain agencies that consider it "terrorism" that you're developing something their nation's companies do not have. If you can't get the source code, throwing an infinite amount of money and security experts at it still would not help a bit.

      Maybe, maybe not. You don't know. The NSA considered it worth it to mess with Cisco's hardware directly. Using custom firmware would actually be much sneakier and easier to implement. Of course, if you want to do it right, you just use every route you can to breach a system.
      But okay, let's say it is only bugs they would exploit. Great! Luckily, bugs can be fixed, software can be patched. Oh, you have an exploitable bug in a firmware blob/driver blob/proprietary software? Sorry, you're out of luck.
      OK, let's take this to the real world. Is anyone in a "high-value" target group like al-Qaeda or ISIS going to use a Cisco router now? OK, that means all Cisco exploits can be considered burned so far as efforts by the US to see what is happening in Syria and Iraq are concerned. A high value exploit is kind of like a bullet: use it and lose it. If the Cisco story had been broken by use of a Cisco exploit to arrest anarchists for running a comms server during a protest where windows were broken but nobody harmed, trash cans full of smashed and burned Cisco gear outside known ISIS facilities in Syria and Iraq would have the CIA screaming bloody murder. Therefore, the Cisco exploits may have been no exploits at all so far as that hypothetical comms server was concerned.

      An even bigger mistake for the NSA or FBI would be the use of a hardware or firmware backdoor in Intel processors just to decrypt a hard drive full of pictures of masked protesters, no matter how violent the protest. That would kill a large part of Intel's non-US business if the defense attorney were able to force the prosecution to produce the exploit in court, or if the activists discovered, defeated, and published the exploit with their own hacking skills. Even with Windows malware, the FBI is reported to be extremely reluctant to deploy any kind of CIPAV or other remote malware against known or suspected hackers.

      Mostly we see snitches and cellphone/iOS exploits at that level, rarely a Windows exploit, and never a Linux or firmware exploit in a court case concerning anarchists, Occupy, or even the ALF and ELF. It is so common for at least one person in any group in the US to have an iOS device or at least a cellphone that I suspect police agencies may not even bother to learn to attack the hardest target in a group (the Linux hacker) instead of the easiest (the iPhone owner). Getting people to leave iOS and Android devices at home and turn off cellphones is real security; worrying about BIOS keyloggers may be security theater unless you are at the Snowden level, an entirely different can of worms that requires never-networked machines for decrypting comms and separate computers running TAILS to pass the ciphertext. That's one of the best applications for Trisquel running over deblobbed Coreboot I can think of, and yes, I would expect to have to buy pre-selected, known-compatible hardware at a randomly chosen store using cash.

      On the other hand, if a personal enemy or a thief after banking info knows what hardware you have and stumbles onto an exploit, all bets are off. The same is true if the cops or FBI know both that an exploit exists and that someone else is about to publish it (or already has) anyway. In the latter scenario it's "use it while you can," like a cop having to eat a cream-filled donut before it spoils.

      Thus, we can apply the same analysis the cops and NSA do: low-hanging fruit. Get rid of iOS first and foremost if state-level actors are a concern. Get rid of Windows; most of the x86 exploits are against it. Hell, they are mostly Windows payloads delivered over exploits against Flash and Java when you really think about it.

      Next, isolate your data and your networks from each other: do not allow your ISP to run software on your computer, and never connect any computer with your filesystem on it directly to the Internet. Treat the router and the network as malicious; do not plug anything that connects directly to any ISP into a PCI, PCI-E, or FireWire socket. Never put sensitive data on a LAN unless you use encryption other than that provided by the router; treat that as automatically broken and read by your ISP, as the router has the keys. You may or may not be able to trust your software; you certainly cannot trust the network or the router. Remember, since router exploits are now well known, they can be used freely against you unless you have one with no known exploits.

      Make sure the network can't see anything worth reporting; think about low-value bulk data-mining programs here. Your ISP and your router can't monitor Tor traffic for source and destination, and they can't read the content of your HTTPS traffic whether or not the NSA can. Deny them the low-hanging fruit, eat it yourself!
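      The "treat the router as malicious" rule above boils down to doing your cryptography end to end rather than trusting the network layer. A minimal Python sketch of one half of that idea, using only the standard library (the `seal`/`unseal` names are hypothetical, and this covers only tamper detection; real confidentiality would need an actual encryption library on top):

```python
import hashlib
import hmac
import os

TAG_LEN = 32  # length in bytes of an HMAC-SHA256 tag

def seal(key: bytes, payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so tampering in transit is detectable."""
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def unseal(key: bytes, blob: bytes) -> bytes:
    """Verify the tag before trusting the payload; raise if it was modified."""
    payload, tag = blob[:-TAG_LEN], blob[-TAG_LEN:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking tag bytes through timing.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("payload was modified in transit")
    return payload

# The key must be shared out of band (e.g. on a USB stick), never via the router.
key = os.urandom(32)
blob = seal(key, b"design documents")
assert unseal(key, blob) == b"design documents"

# A malicious router flipping even a single bit is caught:
tampered = bytes([blob[0] ^ 1]) + blob[1:]
try:
    unseal(key, tampered)
except ValueError:
    print("tampering detected")
```

      The point is that the verification key never touches the router, so nothing the router (or the ISP behind it) does to bytes in flight can go unnoticed.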


      • #18
        Originally posted by michal View Post
        I don't have any guarantee that there is no backdoor in this computer unless I spend time and money on auditing the software.
        Scientific theories are backed up inductively but they aren't proven like math theorems, and I can't make sure the axioms mathematics is built upon are actually true. Why even bother giving my confidence to a bunch of rigorous academics, right? You might as well post your credentials and bank account information here, since you aren't secure anyway.

        I don't have any guarantee that there is no backdoor in this proprietary system unless I... Oh wait, I simply can't make sure and never will.

        Originally posted by michal View Post
        Anyway, there are still software bugs that can be used to exploit an OS, so I doubt any government or corporation would spend money on backdoors. It's cheaper to have a large collection of exploits.
        Haha, no damned way. Finding the vulnerability, writing an exploit, and hoping that no one else notices it and it doesn't get patched is in no way a cheaper and more effective backdoor than deliberately and covertly placing one in some ultra-popular piece of binary blobness.

        There's no need to excuse your submissive or careless attitude towards software control and software security with tired fallacies.


        • #19
          Originally posted by Drago View Post
          I just don't understand how some of the big vendors didn't take advantage of this recent NSA debacle. One could advertise laptops with Coreboot/Libreboot and Linux as NSA-free, and earn big money on people's fear.
          Because the big vendors you are referring to are in bed with the NSA, and many of them profit from your personal data. Before they can use free software operating systems and firmware as a selling point, they need to educate their consumers on the importance of free software; but then again, why would they do that when abusing uninformed users is more profitable?

          I know a few companies that did take advantage of the Snowden leaks. DuckDuckGo comes to mind. However, generally speaking these are very few and particular to the FOSS niche, not exactly what you would call "big vendors". Overall, the tech giants are losing because of the growing privacy debate, and so they try to minimize its impact.

          Originally posted by doom_Oo7 View Post
          Well, the next step is to make a laptop with that GPLv3 gpu

          For ethernet:
          For basic USB: (maybe something better exists?)

          And who needs superIO nowadays
          Sweet links mate.
          Did you hear about the Novena laptop, and OpenCores?