"Panthor" DRM Driver Coming Together For Newer Arm Mali GPUs


  • #21
    Originally posted by coder View Post
    There's a huge hole in your stance, because it might be stored in embedded NAND rather than ROM. That avoids externally loading it at runtime, but still enables it to be updated in case of bugs or feature enhancements.
    I'm not sure whether you're being tongue-in-cheek or serious, sorry.

    I don't care whether software is stored on replaceable or rewritable DVD-ROMs, EEPROMs, NANDs, NORs, UV-EPROMs, SSDs, HDDs, floppy disks, tapes, punched cards, or elsewhere. Either it can be updated (without a soldering iron) or it can't. If it can, it's software and I want it free (even in NAND). If it can't, then it's hardware and I want it correct and from a vendor as trustworthy as possible.

    I don't see a huge hole there. Maybe you mean the NAND may come prewritten from the factory. But if there's a means for software, with due privileges or keys, or even with the user turning some switch on, to update the firmware, then that firmware is software and must be free. If it's some kind of flash with WE strapped to false (normally it's negated, so pulled up), or some such, then it works like a ROM when I get it, and the firmware there is hardware. At least if it's soldered down. If it's socketed then for me it's software, just like on a floppy, but I could listen to someone arguing differently.

    In the end it depends on how expensive it is for the vendor to change the firmware. If it's cheaper than a recall, it's software; but if it's more or less the same work, with maybe a little less waste, then it might be considered hardware. If the vendor can change it and I can't, then the vendor has power over my hardware that I don't have after selling it to me, and I don't like that.



    • #22
      Originally posted by phoron View Post
      I don't see a huge hole there. Maybe you mean the NAND may come prewritten from factory.
      Let's say there's embedded NAND EEPROM that's written at the factory. You have no way of knowing whether it's that or true ROM, because there's no public spec on the device which tells you it's updatable, even though the proprietary driver might do just that.

      So, you could be happily using it, unaware that it's updatable, and then when you learn that it is, you suddenly have to stop using it. To me, that makes your position look tenuous.

      The only defensible position I see is requiring that hardware devices be fully-specified, to the extent that you almost want there to be some sort of penalty if there turns out to be a backdoor or undocumented feature.



      • #23
        Originally posted by coder View Post
        Let's say there's embedded NAND EEPROM that's written at the factory. You have no way of knowing whether it's that or true ROM, because there's no public spec on the device which tells you it's updatable, even though the proprietary driver might do just that.
        I don't use proprietary drivers. If I don't want proprietary firmware, why on earth would I want proprietary drivers?
        If a free driver did that, I'd just hope the community would notice. That's just what the team behind linux-libre does. I wish it were a bigger community, but still.
        The point is I try to take care of what software I put on my machine and where on the internet it gets updates from, and to check the policies of those repositories.
        It's all standard software provisioning; I don't see what the difficulty is. Yes, when your criteria are shared by a minority it's harder than when more people share them, but that does not make the reasoning false.

        So, you could be happily using it, unaware that it's updatable, and then when you learn that it is, you suddenly have to stop using it. To me, that makes your position look tenuous.
        I must be doing a lot of stupid things that I'm unaware of, but I argue about what I do when, and insofar as, I'm aware of what I'm doing.
        I'm not infallible, but that does not make me any less rational.

        The only defensible position I see is requiring that hardware devices be fully-specified, to the extent that you almost want there to be some sort of penalty if there turns out to be a backdoor or undocumented feature.
        Full specification is good, of course. But software is information too. So a reference implementation is part of a good specification. And free software and free access in general is good for transparency, allowing not only reading but experimentation, vulnerabilities research, etc.



        • #24
          Originally posted by phoron View Post
          I don't use proprietary drivers. If I don't want proprietary firmware, why on earth would I want proprietary drivers?
          I didn't say you were using them. You seemed concerned with the mere capability of the firmware to be updated, not just the actual fact of it happening.

          Originally posted by phoron View Post
          Full specification is good, of course. But software is information too. So a reference implementation is part of a good specification. And free software and free access in general is good for transparency, allowing not only reading but experimentation, vulnerabilities research, etc.
          If you're really so concerned about security, then I'd expect you'd also worry about backdoors and undocumented features.



          • #25
            Originally posted by coder View Post
            I didn't say you were using them. You seemed concerned with the mere capability of the firmware to be updated, not just the actual fact of it happening.
            I'm relatively lost trying to follow you. So I guess you got lost trying to follow me? I'm sorry, I didn't want that.

            I said something like: buying the hardware requires trusting the vendor as much as possible. I said all sorts of proprietary or unwanted firmware could be hidden in ROM (or wiring or chip logic or whatever) but I could not know it. Now you say I'm not counting on hidden rewritable media that could also contain firmware and could somehow be updated. Well, that is, or can become, the same problem as some nasty firmware in a hidden ROM (in fact one could claim the hidden flash is circuitry or chip logic or something I included generically in my previous posts, but no matter how clear I was, you could understand I was only speaking of non-replaceable firmware, which makes sense).

            I make decisions with the information I have. For hardware I can't verify much and must trust the vendor.
            For software I can build it myself, or cooperate in a like-minded community to check, choose, collect, build, and distribute free software.
            By provisioning only free software from communities that more or less share my goals and criteria, I should be able to avoid software that writes nasty versions of firmware to hidden flashes in my hardware.
            This includes not using proprietary OSes, bootloaders, applications, or drivers. But also not using free bootloaders, OSes, applications, or drivers that load proprietary firmware updates. And not using hardware that I can't use with that choice of software. Linux-libre is part of that.
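            (An illustrative sketch from my side, not something from the thread; the directory path and file names are hypothetical.) One minimal way to notice when provisioning touches firmware files behind your back is to snapshot their hashes and compare later, e.g. in Python:

```python
import hashlib
from pathlib import Path

def firmware_manifest(root: Path) -> dict[str, str]:
    """Map each regular file under `root` to its SHA-256 digest."""
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            manifest[str(path.relative_to(root))] = hashlib.sha256(
                path.read_bytes()
            ).hexdigest()
    return manifest

# Snapshot before and after an update, then diff the two manifests:
# before = firmware_manifest(Path("/lib/firmware"))
# ... run the update ...
# after = firmware_manifest(Path("/lib/firmware"))
# changed = {name for name in before if after.get(name) != before[name]}
```

            Anything that shows up as changed (or only in the later snapshot) is a blob the update touched, which you can then audit against the sources you trust.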

            This still leaves the problem of the hidden firmware being able to access networks and update itself on its own (OTA updates). I can try to monitor my network to avoid that, or I can imagine that if that happened to people often, someone would have noticed the network activity and blown the whistle.

            But it's a risk, yes. So what? I make my decisions in order to mitigate the risks I can assess. If I can't know about a risk, or if I can imagine it but still can't verify or prevent it, it doesn't affect my decisions.

            If you're really so concerned about security, then I'd expect you'd also worry about backdoors and undocumented features.
            Sure, and what about hidden explosives in hardware that get detonated when you type "gremlins" three times after 23:12 on a Friday?

            Everyone should be concerned about backdoors and antifeatures, yes. But you mitigate the risks you reasonably can, and live happily until the International Space Station's toilet falls on your head while you're out for a walk in the park, right?

            Or should the hypothetical existence of unknowable or unmitigable risks stop you from mitigating the risks you can mitigate?

            Now this all started with:

            But, if the firmware were stored in the SoC's internal ROM, then it would be?
            To answer explicitly:

            If the proprietary firmware was in a ROM (or in a hidden NAND), it would be usable with linux-libre, because linux-libre could not do anything about it.

            If it was in a ROM I would consider it hardware and I could use it, although I'd still prefer more open hardware if I can find it.

            If it was in a hidden NAND I might still use it, because I wouldn't know it was there, and I would do my best not to update some firmware I wouldn't know exists (with proprietary versions, at least; but how could I replace it with free versions without knowing it exists?). I don't know if the analogy is worth much, but it is similar to black money that you receive without knowing it's black money. You can't do much about it, even if it might eventually pose a legal risk to you. You just try to make sure you can prove you always used legal procedures, and hope to be safe.

            If it was in a known NAND, then I wouldn't like that it was proprietary. I could use it with linux-libre without updating that firmware with new proprietary versions, but I wouldn't use that hardware if I could find something else, or I'd try not using any hardware at all if that use is avoidable. But I could still use it if I needed it badly enough and wasn't aware of some known risk. I don't know. One does what one can in emergencies.



            • #26
              Originally posted by phoron View Post
              But it's a risk, yes. So what? I make my decisions in order to mitigate the risks I can assess. If I can't know about a risk, or if I can imagine it but still can't verify or prevent it, it doesn't affect my decisions.
              Yes, that's a reasonable attitude.

              Originally posted by phoron View Post
              If the proprietary firmware was in a ROM (or in a hidden NAND), it would be usable with linux-libre, because linux-libre could not do anything about it.

              If it was in a ROM I would consider it hardware and I could use it, although I'd still prefer more open hardware if I can find it.
              What I'd suggest (and this sounds similar to your perspective) is that instead of drawing a red line at loadable binary blobs, that hardware platforms should instead get a grade or some kind of score. That lets people balance the risks for their own tolerance.

              Ultimately, it'd be nice to have some FPGAs that are optimized for building CPUs and support an open source toolchain. They'd still be an order of magnitude slower than leading edge CPUs, but at least you could have total, top-to-bottom visibility. For the very most-sensitive applications, the performance tradeoff could be worthwhile. Plus, you might reserve part of the FPGA to use for hardwired acceleration of specific workloads.



              • #27
                Originally posted by coder View Post
                What I'd suggest (and this sounds similar to your perspective) is that instead of drawing a red line at loadable binary blobs, that hardware platforms should instead get a grade or some kind of score. That lets people balance the risks for their own tolerance.

                Sounds right. The only doubt is how many people would be working to build communities for each grade. I already feel like there are so few people with my criteria that vendors ignore us. If we further divide communities, it gets less workable. But on the other hand, we can all contribute to and take contributions from other communities, as long as everyone is clear about each one's goals, so it's not so terrible, and it's unavoidable anyway, because people are going to have different goals and criteria regardless.
                Ultimately, it'd be nice to have some FPGAs that are optimized for building CPUs and support an open source toolchain. They'd still be an order of magnitude slower than leading edge CPUs, but at least you could have total, top-to-bottom visibility. For the very most-sensitive applications, the performance tradeoff could be worthwhile. Plus, you might reserve part of the FPGA to use for hardwired acceleration of specific workloads.
                Not sure I understand. On one hand I'm not sure what would be an FPGA optimized for building CPUs. People already build CPUs and microcontrollers on existing FPGAs. More free software toolchains for FPGAs would be hugely welcome; they're very much needed, yes. On the other hand, even with FPGAs, hardware is hardware and software is software, and hardware is hard to verify. If you are "paranoid" and program your own FPGA, how do you know the FPGA vendor did not include a microcontroller that changes the program between when you load it into the FPGA and when the FPGA reconfigures itself, without you knowing? Or one that exfiltrates input and output data to somewhere, or...?

                It's paranoia all the way down, so you have to stop somewhere, and that's usually at the work you can't do yourself (alone or with peers) and have to buy from someone else.
                If you don't have machinery to build chips, you have to trust your chip vendor. You can inform yourself about vendors and choose them carefully, you can lobby your government to impose laws and inspections on what vendors can do (if you even share a government with some chip vendors), you can run information campaigns to build consumer pressure, whatever; but at the end of the day you have to critically trust what you buy, or not buy anything.

                Now some people trust proprietary software, so they buy hardware that requires proprietary software (firmware, driver, OS, bootloader, init blobs, or whatever). If you are going to run windows on it, why should you worry about one more piece of proprietary code?

                And there are people who insist on running free software, but only in part. I'm not even sure it is "the most part" by any definition, because the tendency seems to be more blobs everywhere, so that "most" keeps shrinking. I guess it works for them for now, and I guess they're happy that way; I just don't understand it fully. How is that supposed to evolve into something better, or...? My guess is they settle for pretending to have what they want, or at least some of what they want, without spending too much effort on it, because otherwise they'd have no time or energy left for the other things they want too. If so, that's human, but I feel like when you have no red line at all you can be dragged anywhere. But maybe they have a very coherent red line that they understand and I don't. It's their life, after all.

                I just wish there were hardware that those of us who want 100% free software (including loadable firmware) could use. One wish is not much, so I just buy carefully and buy little.



                • #28
                  Originally posted by phoron View Post
                  Not sure I understand. On one hand I'm not sure what would be an FPGA optimized for building CPUs. People already build CPUs and microcontrollers on existing FPGAs.
                  Well, you can have a FPGA that's just a sea of gates, but some of them have prefab multipliers and other higher-level building blocks. Also, an issue is how much SRAM & registers it has, and how those are distributed. I think there could be a way to have a FPGA that basically has a lot of the high-level building blocks you'd need for CPU cores and distributes them evenly enough that your main datapaths don't end up crisscrossing the die multiple times. If you'd then co-optimize your microarchitecture to suit the mix of resources the FPGA provides, you could end up with performance and efficiency that's half way in between a soft core and hard-wired CPU.

                  Originally posted by phoron View Post
                  even with FPGAs, hardware is hardware and software is software, and hardware is hard to verify.
                  You just have to make the hardware simple enough that it's really difficult to hide anything in there.

                  Originally posted by phoron View Post
                  If you are "paranoid" and program your own FPGA, how do you know the FPGA vendor did not include a microcontroller that changes the program between when you load it into the FPGA and when the FPGA reconfigures itself, without you knowing?
                  If they don't know what gate configuration you're loading on there, the amount of analysis they'd have to do to modify it in a way that implants usable backdoors would probably be computationally infeasible in realtime with a little microcontroller. Also, I'd imagine you could craft some timing or boundary-scan vectors to validate that the loaded gates match the configuration you expect.



                  • #29
                    Ok, this might be my last post in this thread. Not for anything against you. It's just that we're getting so hypothetical, in fields where I'm not such an expert, that I feel I could start saying nonsense at any time... and I have things to do, too... Thanks for the conversation.

                    Originally posted by coder View Post
                    Well, you can have a FPGA that's just a sea of gates, but some of them have prefab multipliers and other higher-level building blocks. Also, an issue is how much SRAM & registers it has, and how those are distributed. I think there could be a way to have a FPGA that basically has a lot of the high-level building blocks you'd need for CPU cores and distributes them evenly enough that your main datapaths don't end up crisscrossing the die multiple times. If you'd then co-optimize your microarchitecture to suit the mix of resources the FPGA provides, you could end up with performance and efficiency that's half way in between a soft core and hard-wired CPU.
                    I see. I don't know. I don't understand the state of the art and the tradeoffs well enough. You might be right.

                    You just have to make the hardware simple enough that it's really difficult to hide anything in there.
                    The point is you never know whether the hardware you got is as simple as advertised or includes extra complexity, if you're paranoid.
                    And in a world where CPUs can check cryptographic signatures before even having access to DRAM, firmware includes whole OSes with network stacks, and customers find DRM all right, it's hard to know how paranoid one should be...

                    If they don't know what gate configuration you're loading on there, the amount of analysis they'd have to do to modify it in a way that implants usable backdoors would probably be computationally infeasible in realtime with a little microcontroller. Also, I'd imagine you could craft some timing or boundary-scan vectors to validate that the loaded gates match the configuration you expect.
                    But you won't get very far alone. And if you work in public, then any attacker knows the possible published configurations to look for, and what to do when one is seen.
                    And the microcontroller might exfiltrate the configuration elsewhere, to more computing resources, and wait for instructions. Or, even without modifying your design, they could still eavesdrop on or alter the inputs or outputs... I don't think cat-and-mouse games end just by using FPGAs.

                    I don't know, it's all very sketchy and above my head. Maybe we're looking at the pointing finger instead of looking at the moon. I was just trying to convey the general idea that at a certain point you delegate trust to your supply chain; you can't do everything yourself. Maybe with FPGAs, or with better FPGAs, you could, I don't know, but it was just an example. It's not that different from the classic "Reflections on Trusting Trust", Ken Thompson's 1984 ACM Turing Award lecture (Schneider and Wheeler notwithstanding).

