Nouveau: NVIDIA's New Hardware Is "VERY Open-Source Unfriendly"


  • Luke
    replied
    GM's "we own your car" shit can be countered

    Originally posted by SystemCrasher View Post
    Well, if some service processor in a peripheral device decides to misbehave, it can do a lot of harm, too. If your HDD, CD, or flash drive/card returns WRONG data with foreign code instead of launching your cool app, that can be unexpected. If a device does a DMA transaction and hacks your OS, that could be unexpected, too (though an IOMMU on recent hardware makes it harder). All of this could fail in many ways, and it would be silly to underestimate the potential issues.

    And DRM is exactly one example of how hardware can be hostile to the user; it shows that you're not the real owner of the thing. It dares to live its own life and obeys its real masters, enforcing their will.

    But as you can guess, all this DRM/EULA folly is simply not going to stop at this point. It sets wonderful examples of how to fuck over your customers, and other companies are getting eager to do the very same after looking at it! [link - CAR MANUFACTURERS got the DMCA/DRM idea!] They're basically about to strip owners of their ownership rights, ha-ha! In fact, according to GM and Ford, you no longer own your car at all; it is a long-term... lease (!!!!) of a certain family of technologies instead. So you no longer have the right to change anything in "your" car, etc., you fucking guests. I can see they want the very same neat EULA terms proprietary SW companies have had. How predictable. Let one e-Parasite in, and the whole swarm will follow.
    First and most important is to refuse to buy cars that come with an EULA of any kind, and to refuse to lease them under any circumstances. Buy older cars, or for urban trips switch to your bicycle. Remember: any purchase of anything locked is a vote; always vote NO on copyrighted/patented/DRM'ed shit. This is just like refusing to pay for and plant Monsanto's seeds in your garden, folks!

    If the government or GM managed not only to make it illegal to do your own car work but also to enforce those laws, I would refuse to drive their cars. If older cars were banned or subjected to compulsory scrappage, I would refuse to drive at all. As of today, I refuse to drive any registered vehicle to any place I might need to deny having been, because of license plate readers. Yes, folks, you really DO have the right to keep the locked, DRM'ed, and monetized "internet of things", and every device that can connect to it, entirely out of your life. It's not as easy as installing NoScript and Ghostery in Firefox, but it's even more important. Learn to service older hardware! From cars to coffeemakers, the "razor and blades" marketing model spits out shit we really do need to boycott. If you have a car new enough to meet modern fuel economy and emissions needs, but which has no telemetry and whose computer can be read with a parts-store scanner, take care of it: rebuild the engine and transmission when they wear out, and if a crash destroys it, get another one like it. Do not use electronic toll roads at all, and get rid of your E-ZPass, as police can read these even on non-toll roads.

    If you have a car with a fancy networked entertainment system, or especially OnStar, that system needs to come out of the car to stop the manufacturer's or third-party servers from logging where you drive. On cars like the Nissan Leaf, there is a telemetry transmitter that reports your position at all times. The car would probably still work with it removed; if not, the antenna could be replaced with a "dummy load" surface-mount resistor on the same kind of connector, faking an antenna and keeping the signal too weak for Nissan's surveillance network to receive. It is illegal for Nissan to respond by programming the car not to run when it cannot reach their server, at least for now.

    Keep in mind that NHTSA had to introduce a regulation explicitly prohibiting carmakers from programming cars to refuse to start if scheduled service visits TO THE DEALER were missed or ignored. NHTSA's correct argument was that people would be killed when cars refused to start while they were escaping criminals or natural disasters. Assume, however, that this will not remain the case forever. Never buy any car that is locked to the dealer the way an iPhone is locked to iOS, and never drive if these become the only legal cars.

    If global warming does not bring the age of the automobile to an end, Google's self-driving cars might end up compulsory for safety reasons. These will be totally locked and DRM'ed, also for "safety" reasons. Do not be surprised if they serve ads on a telescreen that cannot be turned off! In that situation, you would have to get rid of your car and not replace it, except possibly with a motorcycle, moped, or whatever class of vehicle was exempt on the grounds that it cannot cause enough damage to others in a crash. Hell, I'd rather walk than drive a car with telemetry in it.



  • SystemCrasher
    replied
    Hey, device "owners", say bye-bye to your ownershit rights. For your cars, too!

    Originally posted by ossuser View Post
    If 100% auditable CPUs can do the job, I don't have any problem with that.

    But of course the subject is much wider than just the CPU and GPU.
    Well, if some service processor in a peripheral device decides to misbehave, it can do a lot of harm, too. If your HDD, CD, or flash drive/card returns WRONG data with foreign code instead of launching your cool app, that can be unexpected. If a device does a DMA transaction and hacks your OS, that could be unexpected, too (though an IOMMU on recent hardware makes it harder). All of this could fail in many ways, and it would be silly to underestimate the potential issues.

    And DRM is exactly one example of how hardware can be hostile to the user; it shows that you're not the real owner of the thing. It dares to live its own life and obeys its real masters, enforcing their will.

    But as you can guess, all this DRM/EULA folly is simply not going to stop at this point. It sets wonderful examples of how to fuck over your customers, and other companies are getting eager to do the very same after looking at it! [link - CAR MANUFACTURERS got the DMCA/DRM idea!] They're basically about to strip owners of their ownership rights, ha-ha! In fact, according to GM and Ford, you no longer own your car at all; it is a long-term... lease (!!!!) of a certain family of technologies instead. So you no longer have the right to change anything in "your" car, etc., you fucking guests. I can see they want the very same neat EULA terms proprietary SW companies have had. How predictable. Let one e-Parasite in, and the whole swarm will follow.



  • Luke
    replied
    I second the applause for AMD's open-source work, and the real risks are in the CPU

    Originally posted by ossuser View Post
    If 100% auditable CPUs can do the job, I don't have any problem with that.

    But of course the subject is much wider than just the CPU and GPU.
    As in everything that has a 'controller' (network, hard disk, chipset, etc.).

    But hey, GPUs and their manufacturers were the target here ;-)

    And let's be clear, I applaud AMD for taking their open-source initiative.
    It was AMD's open drivers that let me get away from having to shut down X for every entry of a flash drive's encryption passphrase.

    At first, in the summer of 2012, I accepted a performance hit of about 75% for this, but barely noticed it with what I run. I have not benchmarked Catalyst lately, but I now get performance equalling or exceeding what Catalyst could do on the same cards in 2012. FOSS games with fully open code usually are not that demanding: my big video-editing machines can play Scorched3D or 0 A.D., while closed games are barred for security reasons. The open games could go too in return for 100% auditability of the stack below them; I would then exile them to another machine which would not need to be encrypted at all. For AMD, the advantage of this would be obvious: two machines require two GPUs, two CPUs, and two chipsets. Intel and Nvidia are not getting that business, for "corporate" reasons concerning their business practices.

    I figure that commercial spyware of the kind commonly installed by smartphone vendors is unlikely even in closed drivers, but it cannot be totally ruled out. If Nvidia got caught having their driver email a Hushmail address with a GPU serial number and the name of each OpenGL application started, gamers would desert them in droves. Now consider putting the same exploit in firmware. Since 99% or more of buyers run the blob driver, this means coding your exploit to fit in a sub-10 KB firmware blob instead of having a spacious 50 MB driver blob to stuff it into. That is a lot of extra work to increase the take by less than 1%, and you still risk being caught by the first person to run Wireshark and a game at the same time.
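
    To put that bar in perspective: the "Wireshark and a game" check is trivial to approximate. The sketch below is purely illustrative and assumes Python with the third-party psutil package (the remote_endpoints helper is just illustrative naming, not anything a driver or Wireshark ships); it lists the remote endpoints a given process has open, which is the same question a packet capture answers.

    Code:
    # Illustrative sketch: list the remote network endpoints a process
    # (e.g. a running game) currently has open. Requires the psutil package.
    import sys
    import psutil

    def remote_endpoints(pid):
        """Return (ip, port, status) for each inet connection with a remote address."""
        proc = psutil.Process(pid)
        return [(c.raddr.ip, c.raddr.port, c.status)
                for c in proc.connections(kind="inet") if c.raddr]

    if __name__ == "__main__":
        game_pid = int(sys.argv[1])  # PID of the process to inspect
        for ip, port, status in remote_endpoints(game_pid):
            print(f"{ip}:{port} ({status})")  # anything unexpected here warrants a packet capture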

    Now let's consider law enforcement/NSA/CIA spyware, dropped in by an employee of the vendor who takes a payoff. The employee does not want to get caught, and the firmware is a harder place to put it. The offending blob could not easily log keyboard input by being part of the video packet parser or the GPU memory management unit. If it logs all screen output, it needs a place to store it, which means access to the hard drive, which means a hard drive driver, and so on. A screen logger won't net passphrases, online banking credentials, and the like, because everyone knows not to echo that stuff to the screen. This is not the best place to drop a keylogger unless you are just hiding an executable to be called by and run from something else; in other words, using the firmware blob only as disk storage. In that case, the executable probably runs from Windows and breaks when running Linux, and a cross-platform executable is bigger and harder to hide.

    Thus, I judge the closed drivers to have a far greater attack surface than the firmware. I also see the CPU and network firmware as the primary risks here. I would not want to drop any card with closed firmware (not just GPUs, anything) into a machine that had known-safe CPU, keyboard, and network firmware, as it would "decertify" that machine, but running an AMD card in a machine with unauditable CPU, hard drive, and especially network firmware adds little extra risk, since the big targets are already there.

    If you are encrypting information that could change the course of a war, or allow a corporate competitor to beat you to market with your own product, you certainly should not be using UEFI, and even the much smaller BIOS blobs are not trusted at that level. Even after you have Coreboot running, there is still the CPU firmware that Coreboot does not replace. The CPU firmware is therefore the ideal place to hide an utterly persistent exploit coded by a real expert who knows how to keep it slim and light. Thus, even the Coreboot machine can only be trusted not to have "always on" exploits of the kind Wireshark users will notice. That is no doubt the real reason the Guardian journalists communicating with Snowden used a randomly purchased machine, never connected to a network, for decryption.



  • ossuser
    replied
    Originally posted by nanonyme View Post
    Why bother with a GPU at all if you want hardware that is simpler to audit? Software rendering is perfectly adequate for a lot of use cases. People should just start making hardware again where the CPU controls *everything*; then you would only need to worry about auditing the CPU.
    If 100% auditable CPUs can do the job, I don't have any problem with that.

    But of course the subject is much wider than just the CPU and GPU.
    As in everything that has a 'controller' (network, hard disk, chipset, etc.).

    But hey, GPUs and their manufacturers were the target here ;-)

    And let's be clear, I applaud AMD for taking their open-source initiative.



  • nanonyme
    replied
    Originally posted by ossuser View Post
    Considering FPGA usage for GPUs, point taken, bridgman.

    Luke wants 100% auditable hardware and software.
    And he wants to be able to change things if they don't behave the way he wants.
    Things have to be open for that.

    You know, 100% auditable should be the norm, not the exception.
    For all users, and everywhere.

    A free driver stack or OS is great, but privacy/anonymity/security will only be a dream until the hardware is open as well.

    BTW: great thread, with lots of insights.
    Why bother with a GPU at all if you want hardware that is simpler to audit? Software rendering is perfectly adequate for a lot of use cases. People should just start making hardware again where the CPU controls *everything*; then you would only need to worry about auditing the CPU.
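
    For what it's worth, forcing software rendering on a Mesa stack is a one-variable change. The snippet below is only a sketch and assumes Mesa plus the glxinfo utility are installed; LIBGL_ALWAYS_SOFTWARE is Mesa's standard switch for its llvmpipe/softpipe rasterizers.

    Code:
    # Illustrative only: run a GL program with Mesa's software rasterizer
    # forced on, so the GPU is out of the rendering path entirely.
    import os
    import subprocess

    env = dict(os.environ, LIBGL_ALWAYS_SOFTWARE="1")       # Mesa: fall back to llvmpipe/softpipe
    subprocess.run(["glxinfo", "-B"], env=env, check=True)  # the reported renderer should be llvmpipe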



  • ossuser
    replied
    Originally posted by bridgman View Post
    Couple of things...

    1. We haven't seen a big market for systems with zero gaming capabilities, although there does seem to be a big chunk of the market that is happy with very limited gaming. That still translates into at least DX9-level graphics, simply because nobody bothers writing games for anything older. I guess there are Flash games, but I don't know how interesting they are to Linux users. Even DX9-level graphics was a stretch for the biggest FPGAs last time I looked.

    2. The "light graphics" market also tends to be the most cost-conscious, and the key difference between open HW on an FPGA and legacy closed HW is cost. Implementing the functionality of a $10-20 GPU would require spending much more on FPGA, again qualified with "last time I looked".

    Luke's point was that security-conscious users might see the extra cost of an all-open FPGA-based HW solution as money well spent, whereas the rest of the market you describe would just see it as "expensive and slow".
    Considering FPGA usage for GPU's, point taken, bridgman.

    Luke wants 100% auditable hardware and software.
    And he wants to be able to change things if they don't behave the way he wants.
    Things have to be open for that.

    You know, 100% auditable should be the norm, not the exception.
    For all users, and everywhere.

    A free driver stack or OS is great, but privacy/anonymity/security will only be a dream until the hardware is open as well.

    BTW: great thread, with lots of insights.



  • Luke
    replied
    Security costs in perspective

    Originally posted by bridgman View Post
    Couple of things...

    1. We haven't seen a big market for systems with zero gaming capabilities, although there does seem to be a big chunk of the market that is happy with very limited gaming. That still translates into at least DX9-level graphics, simply because nobody bothers writing games for anything older. I guess there are Flash games, but I don't know how interesting they are to Linux users. Even DX9-level graphics was a stretch for the biggest FPGAs last time I looked.

    2. The "light graphics" market also tends to be the most cost-conscious, and the key difference between open HW on an FPGA and legacy closed HW is cost. Implementing the functionality of a $10-20 GPU would require spending much more on FPGA, again qualified with "last time I looked".

    Luke's point was that security-conscious users might see the extra cost of an all-open FPGA-based HW solution as money well spent, whereas the rest of the market you describe would just see it as "expensive and slow".
    Right now most US police agencies have limited capability against encryption; even the Secret Service is said to use dictionary attacks. If in the future that turns into a reliance on hardware exploits (perhaps exploits inserted at their request), spending an extra $500, or even an extra $1,000, on auditable hardware that can be proven not to contain those exploits will pay for itself if a police raid nets the machine. A first-rate lawyer can charge as much as $500 for one billable hour, and if cracked encryption means the search warrant is followed by an arrest warrant, you will need many of those billable hours. Again, corporate executives facing bogus charges in Russia or China have exactly the same need for this security that political protesters and government whistleblowers like Edward Snowden do.

    If new hardware is out of the question, another option would be audit teams under NDA from mutually opposing countries and NGOs. They could audit a known and widely produced APU (plus a known board for it), while the NDA keeps the raw code out of the hands of Nvidia and Intel. If all the audit teams concur that there are no exploits, that APU should get a sales spike.



  • nanonyme
    replied
    Originally posted by curaga View Post
    @bridgman

    Déjà vu with that "open microcode for old cards", perhaps?

    @Luke

    Not enough money; all open GPU projects have failed.
    Well, at least I did not claim that providing it would magically spawn a group of maintainers. I doubt it would. I don't expect it would yield a direct benefit to AMD, just an academic benefit for the community.



  • Luke
    replied
    Possible governmental markets for auditable hardware?

    Originally posted by bridgman View Post
    Yeah, there is something familiar about all these discussions

    I thought Luke had an interesting point, though. The "open for the sake of being open" market hasn't really materialized, primarily because most of the target market wants to be able to game, etc., which means a fairly significant chunk of hardware is required.

    What Luke is suggesting, however, is a bit different - defining an "open for the sake of being 100% auditable" market where the target user is probably *not* looking to run games on the system, and where a lower level of performance would probably be sufficient.
    I would guess that no government and no military other than those of the US trusts US-made closed computing hardware or software after the Snowden revelations. The same goes for almost anyone else outside the US. It also raises the question of why the US government and US military would trust hardware made in other countries. The NSA is reputed to have its own fab for when it really counts, but smaller countries might find that rather expensive. Also, if you don't know there are no hooks for the NSA in closed code or secret gates, you don't know the same exploit isn't waiting to be discovered by the Chinese MSS or even an organized-crime hacking group.

    A fully open and auditable APU designed for ruggedness and durability could have a considerable market among anyone in the world who has reason to fear the security/spy services of anyone else in the world, including almost all governments, NGOs, human rights groups, political parties, and even corporate engineers worried about data theft in places like China. Suppose an engineer from AMD itself is going to China, carrying a laptop with just one chip on the board made in China, and on that chip is closed firmware and an unknown set of gates. Without auditability, that chip could contain keyloggers that crack the encryption, and the next thing you know, Chinese knockoffs of the next Radeon hit the shelves before the real one is even ready.

    That laptop does NOT need to be able to game! Surely the engineer has another machine for that job. That engineer's laptop needs to keep secrets safe, and there are exactly three ways to do this: fabbing and controlling everything yourself, being able to audit everything yourself, or having mutually opposing third parties audit it. The required graphics power is enough to play video, in the future probably at the native resolution of the screen, from or alongside a running web browser.



  • bridgman
    replied
    Originally posted by ossuser View Post
    The gaming market is a big market.

    But I also think most users don't run games, as they only have a computer to do some web browsing, mail, etc.

    The market you mention here would thus be your main users, wouldn't it?
    Couple of things...

    1. We haven't seen a big market for systems with zero gaming capabilities, although there does seem to be a big chunk of the market that is happy with very limited gaming. That still translates into at least DX9-level graphics, simply because nobody bothers writing games for anything older. I guess there are Flash games, but I don't know how interesting they are to Linux users. Even DX9-level graphics was a stretch for the biggest FPGAs last time I looked.

    2. The "light graphics" market also tends to be the most cost-conscious, and the key difference between open HW on an FPGA and legacy closed HW is cost. Implementing the functionality of a $10-20 GPU would require spending much more on FPGA, again qualified with "last time I looked".

    Luke's point was that security-conscious users might see the extra cost of an all-open FPGA-based HW solution as money well spent, whereas the rest of the market you describe would just see it as "expensive and slow".

