Rust-Written Apple DRM Linux Kernel Driver Renders First Cube


  • #81
    Originally posted by Dukenukemx View Post
    That's using Apple's Media Engine, which isn't going to matter once everyone switches over to AV1. Google and Netflix have already made the switch, but for now YouTube defaults to VP9. There's a reason I ignore certain benchmarks: they will soon be irrelevant. Also, that Media Engine won't work on Linux for a very long time. How long do you think it'll be before all those formats work on Linux through Apple's Media Engine? Get Apple to throw in some code or open-source their drivers so this process can go faster.
    Well in that case we can completely ignore all of your previous posts, because they are irrelevant due to testing gaming scenarios (i.e. running a game under Rosetta with another translation layer for DirectX/OpenGL, versus running natively). If you want to selectively discard benchmarks, it can be done both ways.

    Even the benchmark you are talking about is completely relevant, because it shows how energy efficient the chip is: it uses software video decoding, which is as close to an apples-to-apples comparison as you can get, since you are testing the exact same software on all platforms. The game benchmarks are not relevant because they compare apples to oranges; you would have to compare games compiled against Metal (Apple's graphics API) versus DirectX/OpenGL, and even then you are mainly benchmarking engines.

    Originally posted by Volta View Post

    I thought BSD zealots can't stand the fact that Apple doesn't give a shit about their favorite OS? And I didn't know Linux was a hardware producer. It seems Apple is for believers:

    https://discussions.apple.com/thread/252177766



    the answer:



    What's funny is that 212 people upvoted this question.
    Yes, let's quote random people from the internet, in a vacuum, who probably have no idea what they are talking about.
    Last edited by mdedetrich; 27 September 2022, 04:49 AM.



    • #82
      Originally posted by mdedetrich View Post
      Yes, let's quote random people from the internet, in a vacuum, who probably have no idea what they are talking about.
      Got it.. Apple users have no idea what they're talking about.

      And 600+ other users (so we have 800+ so far):

      https://discussions.apple.com/thread/252124336

      Unfortunately I'm also having this experience with my M1 MBP 16 Gb / 512 Gb. The whole computer feels sluggish and laggy even when performing simple tasks such as web browsing or opening and using simple built-in apps. This is very disappointing, especially that I ditched my Windows 10 laptop for the Mac experience based on my iPhone 12 Pro Max which is a very fluent device.
      I moved over from a PC and was looking forward to these "blazing speeds." Can only say am disappointed. Does not boot up particularly quick. Also has lag to open may apps. (mac book pro with M1)
      I've been using a MacBook Pro 13" M1 for over a week now, and for work doing coding for the web and dealing with large images in Sketch this computer is not up to the task. It hangs all the time, during simple things like saving files and opening a new browser tab. I don't know how much of this is a problem with Rosetta or Big Sur, or something more fundamentally wrong with the architecture.
      Last edited by Volta; 27 September 2022, 05:34 AM.



      • #83
        Originally posted by Volta View Post

        Got it.. Apple users have no idea what they're talking about.

        And 600+ other users (so we have 800+ so far):

        https://discussions.apple.com/thread/252124336




        I just checked the first link, and they are talking about a MacBook from 2017, which is half a decade ago and not the MacBooks we are discussing now with Apple's own ARM silicon (the Macs back then had Intel chips, so you can blame Intel for being slow). If you are going to trash-talk, at least put some effort into it; this is just sad and embarrassing, even for a troll.
        Last edited by mdedetrich; 27 September 2022, 06:21 AM.



        • #84
          Originally posted by karolherbst View Post

          try making the same evaluation running on battery
          Let's compare apples to apples, comparing the Costco laptop to the Apple M2 at the same price:
          Model         MSI Katana 15.6"           Apple M2 Base               Asus VivoBook 14.5" OLED
          Price         $1099                      $1199                       $1099
          CPU           i7-12700H                  M2 (8-core)                 i7-12700H
            Cinebench   1806 / 16745 (+5%/+90%)    1700 / 8538                 1806 / 16745 (+5%/+90%)
            Geekbench   1607 / 12103 (-10%/+50%)   1874 / 8853                 1607 / 12103 (-10%/+50%)
          RAM           16 GB                      8 GB                        12 GB
          Storage       512 GB NVMe                256 GB NVMe                 512 GB NVMe
          GPU           Nvidia 3060 Laptop         integrated (~35% faster     integrated
                        (Geekbench Vulkan 67264)   than M1; Geekbench Metal
                                                   ~28000, estimated)
          Connectivity  Bluetooth 5.2, WiFi 6      Bluetooth 5.0, WiFi 6       Bluetooth 5.2, WiFi 6E
          Screen        1080p IPS                  2560x1664 IPS               2880x1800 OLED
          The Apple costs the most of the three. It wins, by a little, in one single-core test, and loses massively in multicore. It has half (or two-thirds) the RAM, half the storage, and worse Bluetooth, for $100 more.

          It has better battery life, and a good screen.

          But the claim above was that the Apple offered killer performance for the cost. I don't call ~50-60% of the performance, for $100 more, with less RAM, a good value proposition.

          As usual, people hung up on Apple zealotry shift the goalposts: "but what about the screen", "but what about the battery", etc. If you wanted a really good screen, you'd get the VivoBook. If you really cared about performance, you wouldn't be spending $1200 on 8 GB of RAM.
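          The single/multicore deltas quoted in the table can be re-derived from the raw scores; a quick sketch (every number is taken from the table above, nothing else assumed):

```python
# Raw benchmark scores from the comparison table: (single-core, multi-core)
i7_12700h = {"cinebench": (1806, 16745), "geekbench": (1607, 12103)}
m2 = {"cinebench": (1700, 8538), "geekbench": (1874, 8853)}

for bench in i7_12700h:
    for label, i7, apple in zip(("single", "multi"), i7_12700h[bench], m2[bench]):
        # positive delta means the i7 laptop is ahead of the M2
        delta = (i7 - apple) / apple * 100
        print(f"{bench} {label}: i7 is {delta:+.0f}% vs M2")
```

          Run this way, the Cinebench figures roughly match the quoted +5%/+90%, but the Geekbench multicore gap works out to about +37% rather than the quoted +50% (and the single-core deficit to about -14%).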



          • #85
            Originally posted by Dukenukemx View Post
            Ok fine I'll respond to Apple's employee here. Firstly, I didn't say anything about everyone should be a gamer. All I keep saying is that Apple should be held accountable for not opening up their drivers or supporting Linux directly. Other people here claim that gaming doesn't matter and therefore the GPU doesn't matter. Forgetting that UI's, web browsers, CAD, photo editing, and basically anything with visuals will use 3D acceleration. I did say gaming is important because that's one of the top uses people have for computers. That and porn, which may disgust people but it is also correct.

            Also pointing out Apple winning one benchmark doesn't mean anything. Especially because again AV1 is now the future, so says Google, Netflix, and Meta. The moment these companies decide to force you to use AV1 is when you move onto the CPU to do the decoding. Quick Google shows 30% CPU load for AV1 decoding on Apple M1, depending on the video of course.
            You continue to spread lies... I have told you multiple times that I am not an Apple employee; I am on welfare.
            And I have never bought any Apple product in my life, which makes me as neutral as possible.

            Right, you did not say, word for word, that everyone should be a gamer, but all your arguments end up there.
            If someone is not a gamer, you come up with gaming benchmarks to prove the Apple M2 does not really have that much battery life.

            "Apple should be held accountable for not opening up their drivers or supporting Linux directly."

            That would be right if it were a closed system like the PlayStation 5 or the iPhone, but the Apple M1/M2 machines are open systems: everyone who buys one can do whatever they want with it, including porting Linux to it.

            "Forgetting that UI's, web browsers, CAD, photo editing, and basically anything with visuals will use 3D acceleration."

            Right, and right now these systems use llvmpipe for 3D rendering; it runs on the CPU instead of the GPU, and because the CPU is so fast, people already report that it is "good enough".

            But when people tell you it is good enough, you blame them and claim it is not good enough (because you are a gamer).
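            As an aside, whether a system is actually falling back to llvmpipe is easy to check from the OpenGL renderer string. A minimal sketch; the sample `glxinfo -B` output here is hardcoded for illustration, and on a real machine you would feed in the command's actual output:

```python
def is_software_renderer(glxinfo_output: str) -> bool:
    """Return True if Mesa reports a CPU rasterizer (llvmpipe/softpipe)."""
    for line in glxinfo_output.splitlines():
        if "OpenGL renderer string" in line:
            renderer = line.split(":", 1)[1].lower()
            return "llvmpipe" in renderer or "softpipe" in renderer
    return False

# Illustrative sample of what a system without a GPU driver might report
sample = "OpenGL renderer string: llvmpipe (LLVM 15.0.6, 128 bits)"
print(is_software_renderer(sample))  # True
```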

            "I did say gaming is important because that's one of the top uses people have for computers."

            Well, if you want to launch new CPU/GPU technology, gaming is a horrible market to enter; just watch Intel Arc struggling.

            Gaming is dominated by really evil monopolists, and I am talking about real monopolies, not isolated fake ones:

            evil Intel with the x86 ISA,
            evil Microsoft with Win32/DirectX,
            and evil Nvidia with CUDA, DLSS, and the rest of its proprietary stack.

            If you want to battle these monopolists, as Apple is trying to, gaming is the last market you want to enter, because all the games are closed-source x86 binaries built on DirectX, DLSS, and other Nvidia tech like PhysX.

            But we all know that even if Apple delivered perfect Linux support, you, my friend, would buy the notebook with the Intel 13900K and the Nvidia RTX 4000 GPU, and then praise DLSS 3.0's "fake frames" for saving 25% power consumption...

            People like you keep this evil cartel at its maximum power, that's for sure.

            Phantom circuit Sequence Reducer Dyslexia



            • #86
              Originally posted by mdedetrich View Post

              Well in that case we can completely ignore all of your previous posts, because they are irrelevant due to testing gaming scenarios (i.e. running a game under Rosetta with another translation layer for DirectX/OpenGL, versus running natively). If you want to selectively discard benchmarks, it can be done both ways.
              When has OpenGL ever been replaced and made outdated? Forget Apple, which can't hold onto a standard for more than 10 years; the rest of the world still uses OpenGL for the most part, including Windows. Video formats, on the other hand, come and go frequently: MPEG-1 was replaced by MPEG-2, then MPEG-4, then H.264, then H.265. And there's more: VP9 is the web standard, but now it seems AV1 will take its place. Fixed-function hardware is amazingly efficient but worthless the moment you stop using the format it was designed for, whereas graphics APIs can be replaced or even adapted to other APIs through a wrapper.

              I get that you're trying to catch me in a gotcha, but your example wasn't well thought out.
              Even the benchmark you are talking about is completely relevant, because it shows how energy efficient the chip is: it uses software video decoding, which is as close to an apples-to-apples comparison as you can get, since you are testing the exact same software on all platforms.
              YouTube video playback is 100% using Apple's Media Engine. YouTube currently serves VP9, unless you use an Android device, which can use AV1. Keep in mind other laptops accelerate VP9 as well; Apple's implementation is simply superior. The fact that Apple can draw far less power when there isn't much of a load is why this benchmark looks so good, but the test is still mostly exercising the Media Engine, not the ARM cores.
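              For reference, on the Linux side the set of formats a machine can decode in hardware is exposed through VA-API, so claims like this are checkable. A sketch of parsing `vainfo` output for decode (VLD) entrypoints; the sample output below is illustrative only, not from any particular machine:

```python
def decode_profiles(vainfo_output: str) -> set:
    """Collect VA-API profiles that expose a VLD (hardware decode) entrypoint."""
    profiles = set()
    for line in vainfo_output.splitlines():
        if "VAEntrypointVLD" in line and ":" in line:
            profiles.add(line.split(":")[0].strip())
    return profiles

sample = """\
VAProfileH264Main               : VAEntrypointVLD
VAProfileH264Main               : VAEntrypointEncSlice
VAProfileVP9Profile0            : VAEntrypointVLD
"""
profiles = decode_profiles(sample)
print("VP9 decode:", "VAProfileVP9Profile0" in profiles)  # True
print("AV1 decode:", "VAProfileAV1Profile0" in profiles)  # False on this sample
```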
              The game benchmarks are not relevant because they compare apples to oranges; you would have to compare games compiled against Metal (Apple's graphics API) versus DirectX/OpenGL, and even then you are mainly benchmarking engines.
              This has been Apple's problem since the PowerPC days and will likely never change. But again, your issue is the assumption that 3D graphics = gaming, which is not entirely true. If you use a web browser, video player, video editor, image editor, CAD, etc., you will want 3D acceleration. You're listening to qarium, who as far as I'm concerned is dealing with his raging insanity, probably because he agrees with your confirmation bias.


              I've said it before and I'll say it again: IF you WANT your Apple Silicon product to WORK 100% in Linux, then you NEED APPLE involved. There is no reason Apple can't just open-source their drivers so that Linux developers can adopt them. Instead of agreeing with me, you'd rather attack me and point out that I'm a gamer, as if that's a bad thing, forgetting that Lord Gaben's Valve is probably the best contributor to Linux and freely open-sources its work, unlike Apple. We even have people here chiming in to claim that macOS is the best, forgetting this is a LINUX forum. I can tell you that buying an Apple product and hoping a small group of dedicated people gets Linux working is not enough to make it usable. Anyone who has used a Linux port done this way can tell you it was a very unstable experience. You want to be at the point where developers are bug-fixing and optimizing, not still implementing support for hardware that doesn't work yet.

              Last edited by Dukenukemx; 27 September 2022, 09:00 PM.



              • #87
                Originally posted by Dukenukemx View Post
                When has OpenGL ever been replaced and then make outdated?
                OpenGL has been outdated for many years, and in practice it was always "NvidiaGL": OpenGL code was mainly written in a way that only runs bug-free on Nvidia hardware.

                That is the reason everyone switched to Mantle/Vulkan/DirectX 12/Metal/WebGPU.

                OpenGL is only supported for legacy reasons.

                Originally posted by Dukenukemx View Post
                Forget Apple which can't hold onto a standard for more than 10 years. The rest of the world still uses OpenGL for the most part, including Windows. Where as video formats come and go frequently. MPEG1 was replaced with MPEG2 then replaced with MPEG4, then h.264, h.265. But wait, there's more. VP9 is the web standard but now it seems AV1 will take it's place. Fixed function hardware is amazing at efficiency but also worthless when you don't use the format it was designed for.
                I am 100% sure that future versions of the Apple M1/M2 will have VP9 and AV1 decode and encode support; it's trivial, and even cheap €260 Rockchip systems have it.

                Originally posted by Dukenukemx View Post
                Where as graphic API's can be replaced or even use a wrapper to adapt them to other API's.
                That's a complete lie, and you have pushed it in the past.
                You can try to emulate one graphics API with another, but you always lose performance, often a lot of it,
                and on Apple hardware, if you strip out all the legacy transistors, you become very inefficient at emulating outdated stuff.

                3dfx went down because they were not able to support DirectX and OpenGL on top of their "Glide" API.

                Wine emulated DirectX 9/11 with OpenGL, and it was a pain in the ass, with maybe 50% of the performance: very bad.
                And there was also the NvidiaGL problem: the code mostly only ran well on Nvidia hardware.

                Mantle, DirectX 12, Metal, Vulkan, and WebGPU can be "wrapped" into one another, but only because they are nearly the same at a low level.

                You said in the past that game-console emulators ran faster on other hardware than on Apple's, and you painted that as a fair comparison, but it is not: if you remove all the fixed-function bits on the GPU silicon that exist only for legacy purposes, you end up emulating a lot of things in shader cores, which kills both performance and energy efficiency.

                Originally posted by Dukenukemx View Post

                This has been Apple's problem since the PowerPC days and will likely never change. But again the issue that you have is because 3D graphics=gaming, which is not entirely true. If you use a web browser, video player, video editor, image editor, CAD, and etc then you will want 3D acceleration. You're listening to qarium​ which as far as I'm concerned is dealing with his raging insanity. Probably because it agrees with your confirmation bias.
                That's not Apple's problem; that's a problem for all of us, because we all suffer under these evil monopolies (Apple, Intel, Microsoft).
                A monopoly results in bad products, high prices, and anti-consumer tech like Intel's SGX.

                You say it will "never change"; I say it is already changing: more and more games have open-source engines, and engines like Godot 4.0 are becoming popular. With an open-source engine, it is no problem to build a 64-bit ARM binary with native Metal/WebGPU support.

                "3D acceleration"

                People report that llvmpipe on the ARM cores is already fast enough; you ignore this point every time,
                just as you ignore all information that does not fit your biased world view.

                Originally posted by Dukenukemx View Post
                ​​
                I've said it before and I'll say it again. IF you WANT your Apple silicon based product to WORK 100% in Linux then you NEED APPLE involved. There is no reason Apple can't just open source their drivers so that Linux users can just adopt it.
                Wrong; you are a liar, plain and simple. I have told you multiple times:
                patent law and copyright law make it certain that Apple cannot simply open-source their drivers.
                The only safe route is clean-room reverse engineering.

                Look at AMD: after their 2007 announcement that they would make open-source drivers, they wrote them from scratch; they did not open-source the closed-source FGLRX driver, because they "couldn't".

                See Nvidia: they did not open-source their user-space driver, because they "can't".

                So Apple "can't" either, yet you, liar, claim it would be easy. AMD did not do it, Nvidia did not do it, and everyone else also went the clean-room reverse-engineering route.

                And by the way, Apple is already involved in what they can be involved in; that has been shown multiple times.

                But they cannot break patent law and copyright law, which is why Apple engineers will never touch the Linux code.

                Originally posted by Dukenukemx View Post
                ​​​
                Instead of agreeing with me, you'd rather attack me and point out that I'm a gamer as if that's a bad thing. Forgetting that Lord Gaben's Valve is probably the best contributor to Linux and freely open sources all their work, unlike Apple. We even have some people here who chime in to claim that Mac OSX is da best, forgetting this is a LINUX forum. I can tell you that buying an Apple product and hoping for a small amount of dedicated people working on this project to get Linux working is not enough to get it usable. If anyone who's ever used a port of Linux that was done like this you can bet it was a very unstable experience. You want to be at the point where they're bug fixing and optimizing, not implementing support for hardware that isn't working yet.
                People do not and cannot agree with you, because you are completely ignorant: you ignore patent law, you ignore copyright law, you ignore any fairness in tests or benchmarks, you refuse to use 64-bit ARM binaries and native graphics APIs like Metal/WebGPU to benchmark Apple hardware, and you tell people outright lies, for example that OpenGL is not outdated legacy cruft. Apple was right to remove OpenGL support from macOS long ago, having discovered that OpenGL was only an "NvidiaGL" Trojan horse.

                You also tell people lies, for example that I am an Apple employee, but I am not, and I have never bought any Apple product.

                Why should people agree with you?

                And you insult people for telling the truth: "qarium which as far as I'm concerned is dealing with his raging insanity".



                • #88
                  This thread has become a mess, and I think a lot of people who actually agree are yelling at each other.

                  As to the hardware-- yesterday when I made my laptop comparison chart I did get to dig into benchmarks around the M2. I think everyone can agree it's an impressive piece of kit on its own, competing with current-gen Intel / AMD chips. This is something that China has sought to do for years and has never pulled off, so the fact that Apple over a few years has leapfrogged other established SOC manufacturers like Samsung and Qualcomm is impressive.

                  And no one can say the battery life is not impressive either. But the contention originally made here was that the cost of a high-performance laptop makes Asahi on an M2 laptop a no-brainer. What I've seen in benchmarks says that you'd be getting substantially less performance in most price brackets with an Apple M2 versus a PC laptop. It is always hard to do a direct comparison, because Apple usually brings an interesting twist to it: a high-quality chassis, a mid/high-quality screen, interesting connectivity options, and pretty good battery life and performance.

                  But if you pick 2-3 particular target metrics (connectivity, performance, screen), you can usually get a PC laptop that focuses more tightly on those, with something like a 12900H / Nvidia 3070 Ti, at the same price point. Some people really do like the exact combination Apple brings to the table, and you might even be able to argue that under a reasonable scoring system the Apple would walk away with a higher aggregate score across all categories, but it's certainly not a clear win unless battery is your first objective and performance your second.

                  All of that to say that I think most of the M2 "naysayers" here are not denying the efficiency of the chip or its respectable performance, only that it is not the ultimate answer to computing that it is frequently touted as. Intel and AMD have laptop offerings that absolutely demolish the M2 in multicore benchmarks, and for use-cases that can use the compute arguing that you get double the battery life but take 4 times as long to do the task isn't going to be persuasive.

                  Originally posted by qarium View Post
                  Wrong; you are a liar, plain and simple. I have told you multiple times... patent law and copyright law make it certain that Apple cannot simply open-source their drivers.
                  The only safe route is clean-room reverse engineering.

                  The person you're responding to is sort of agreeing with you: because Apple is not disclosing their secrets, Asahi devs are having to reverse engineer things which means it will pretty much never have 100% support for the latest Apple chips.

                  Patent law does not stop Apple from disclosing their own patents. It does not even prevent Apple from licensing some of their tech to Linux. There are probably a lot of reasons Apple does not help Linux out, possibly because they're afraid that other patent trolls will come sniffing around their open-sourced secrets, possibly because there's very little actual return to Apple. They don't necessarily "want" people to get Macbooks to run Linux, because it either suggests that their own OS is not that great or it endorses an inferior experience (depending on perspective): neither are really great for Apple, whose OS+Hardware+Services full stack is what supposedly makes the brand valuable.

                  Sometimes the vendors (AMD, NVidia as you mention) come up with PR reasons they "can't" do it but it generally boils down to "we cannot make a good business case to the executives on why we should spend money and risk on open-sourcing this". At the end of the day that's what businesses care about. Intel has long made the decision that working very closely with FOSS devs is a good idea. NVidia, AMD, and Apple historically have not (though AMD has gotten much better).

                  So you both kind of agree: Asahi is never going to provide the full experience of Apple hardware because they will forever be chasing the tails of the newest Mac hardware. This isn't really unique to Apple though, Linux devs often have to write drivers for vendors who don't care.

                  Originally posted by qarium View Post
                  you ignore copyright law, you ignore any fairness in tests or benchmarks, you refuse to use 64-bit ARM binaries and native graphics APIs like Metal/WebGPU to benchmark Apple hardware
                  In my hardware comparison above I compared Geekbench Metal for Apple to Geekbench Vulkan for PC. Metal is apparently higher performance (AMD hardware gets ~30-50% higher scores in it), but current NVidia / AMD hardware like the NVidia 3060 laptop GPU still double M2's scores even with that slanted comparison.

                  It's more, in my opinion, that Apple isn't really served by enabling these benchmarks. They make a product whose selling point is not raw specs but the integration, the status, and the experience. Any time you encounter a vendor selling that kind of product, benchmarks are usually not going to help their cachet. The fact that Metal makes direct comparisons with PC hardware difficult is probably very convenient for Apple.
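                  To put rough numbers on that: if Metal really does score 30-50% higher than Vulkan on comparable hardware, the M2's Metal score can be deflated by that factor before comparing it against the 3060's Vulkan number. A back-of-the-envelope sketch; the 30-50% bias range is this post's own estimate, not a measured constant:

```python
vulkan_3060 = 67264   # Geekbench Vulkan, Nvidia 3060 Laptop (from the table above)
metal_m2 = 28000      # Geekbench Metal, Apple M2 (estimated, from the table above)

print(f"raw ratio: {vulkan_3060 / metal_m2:.1f}x")   # ~2.4x
for bias in (1.3, 1.5):  # assumed Metal-over-Vulkan scoring advantage
    adjusted = metal_m2 / bias
    print(f"bias {bias}: 3060 is {vulkan_3060 / adjusted:.1f}x the adjusted M2 score")
```

                  So even granting Metal the full assumed advantage, the gap only widens, from roughly 2.4x raw to 3.1-3.6x adjusted.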



                  • #89
                    Originally posted by ll1025 View Post
                    The person you're responding to is sort of agreeing with you: because Apple is not disclosing their secrets, Asahi devs are having to reverse engineer things which means it will pretty much never have 100% support for the latest Apple chips.
                    I am sorry, but as is evident, the person being responded to has no clue what they are talking about in any real sense and is just vomiting up content without understanding what is being done or said in it. Just because he happens to be "sort of" saying the right thing doesn't mean what he is saying has any value; as they say, a broken clock is right twice a day.

                    Originally posted by ll1025 View Post
                    Patent law does not stop Apple from disclosing their own patents. It does not even prevent Apple from licensing some of their tech to Linux. There are probably a lot of reasons Apple does not help Linux out, possibly because they're afraid that other patent trolls will come sniffing around their open-sourced secrets, possibly because there's very little actual return to Apple. They don't necessarily "want" people to get Macbooks to run Linux, because it either suggests that their own OS is not that great or it endorses an inferior experience (depending on perspective): neither are really great for Apple, whose OS+Hardware+Services full stack is what supposedly makes the brand valuable.

                    Sometimes the vendors (AMD, NVidia as you mention) come up with PR reasons they "can't" do it but it generally boils down to "we cannot make a good business case to the executives on why we should spend money and risk on open-sourcing this". At the end of the day that's what businesses care about. Intel has long made the decision that working very closely with FOSS devs is a good idea. NVidia, AMD, and Apple historically have not (though AMD has gotten much better).

                    So you both kind of agree: Asahi is never going to provide the full experience of Apple hardware because they will forever be chasing the tails of the newest Mac hardware. This isn't really unique to Apple though, Linux devs often have to write drivers for vendors who don't care.
                    Patents are not the correct thing to focus on here, but if there is external IP in an already existing product, it is effectively not feasible to open-source it; that much is definitely true. It's the same reason people saying NVidia should open-source their drivers are completely clueless: NVidia's drivers contain IP from the SGI (Silicon Graphics) days, and with how much the code has changed across a codebase literally millions of lines long, it is not practical to vet it all to make sure there are no legal risks in opening it. For NVidia, Apple, or anyone else to produce such a driver, they would have to do a completely clean-room implementation, which is a massive cost (both legally and financially) with very little monetary benefit for NVidia. The only reason AMD managed to pull this off is that they were almost completely bankrupt and had decided to completely redesign their architecture anyway, so it was more like "may as well do it, we're at this point anyways".

                    I'm not necessarily defending NVidia's closed drivers, or the company being shitty in general, but there is no point in deluding yourself (or others) with something that is completely unreasonable. There are even people claiming NVidia fully open-sourced their driver, in the vain hope of "yes, we were right and we managed to convince NVidia!", when in reality NVidia moved most of the logic into closed firmware on the GSP chip, and the only thing open-sourced is an interface to that chip (plus some wrapper code). This is also why NVidia's open-source in-kernel interface only works on newer GPU generations: it requires the GSP chip, which NVidia only added recently.
                    Last edited by mdedetrich; 28 September 2022, 08:56 AM.



                    • #90
                      Originally posted by mdedetrich View Post
                      I am sorry, but as is evident, the person being responded to has no clue what they are talking about in any real sense and is just vomiting up content without understanding what is being done or said in it. Just because he happens to be "sort of" saying the right thing doesn't mean what he is saying has any value; as they say, a broken clock is right twice a day.
                      All I've been saying is that Apple should be involved and held responsible, because they aren't. Open-sourcing their drivers would be a great move. Everything you disagree with me on consists of words you put in my mouth, not things I said directly.
                      Patents is not the correct thing to focus on here, but if there is external IP in an already existing product its effectively almost not feasible to open source, that is definitely true. Its the same reason why people are saying that NVidia should open source their drivers are completely clueless, the drivers that NVidia has contain IP from SG silicon days and with the amount the code has changed over with a codebase literally millions of lines long its not practical to vet through it to make sure there are no legal risks in opening it. In order for NVidia, or Apple or anyone else to make such a driver they would have to do a completely clean room implementation which is a massive cost (both legally and financially) with very monetary benefit for NVidia.
                      Yet Nvidia did open-source their drivers. How much monetary benefit does Nvidia get from that? Nvidia did it because they're often called out for it, including by Linus Torvalds. Peer pressure works.
                      The only reason that AMD managed to pull this is off is that they were almost completely bankrupt and they decided to completely redesign their architecture so it was more like "may as well do it, we are at this point anyways".
                      So wealthy companies like Apple and Nvidia don't do it because it doesn't make them money, and AMD did it because they didn't have money. Then where does Intel stand in all this? Their drivers are also open source. Broken clock indeed.
                      I don't necessarily agree with NVidia drivers being open source, or the company being shitty in general however there is no point in deluding yourself (or others) of something that is completely unreasonable. There are people that are even claiming NVidia fully open sourced their driver in some vain hope that "yes, we are right and we managed to convince NVidia!!!" but in reality NVidia put most of the logic into closed firmware on the GSP chip and the only thing being open sourced is an interface to that chip (plus some wrapper code). This is also why the nvidia open source in kernel interface only works on newer NVidia GPU generation cards, it requires that GSP chip which NVidia only added recently.
                      Nvidia are still jerks, but that's an upgrade from before. The Nouveau team is paying attention to Nvidia's open-source drivers and will learn from them. This is what we want from Apple, so that anyone working on this project has an easier time making faster progress. Apple are jerks too, so the most we can hope for is a move similar to Nvidia's.

                      https://www.phoronix.com/news/LPC-20...-Source-NVIDIA

