Asahi Linux Update Brings Experimental Apple M2 Support


  • #61
    Originally posted by Anux View Post
    I was referring to "They Live", a sci-fi trash movie from the 80s, but worth watching. Even Duke Nukem got some lines copied from that movie.
    Man, I know this movie; it's a really nice and good movie. It is not "trash".

    However, I didn't get that you meant the "They Live" sunglasses, but who cares...

    Yes, in the movie you put these glasses on and then you see the truth...
    Phantom circuit Sequence Reducer Dyslexia



    • #62
      Originally posted by hamishmb View Post
      Yes, they do a lot of stuff wrong too.

      The high read-write rate was caused by a software issue and was fixed IIRC. Also, they provide spare parts for some of their newer phones. They also need to provide spares for their older phones, but progress is progress.

      There are also a lot of reasons why I dislike Apple and macOS and iOS, I hasten to add, but I can see many positives in there too.

      Just because the hardware is fixed function doesn't mean the performance gain isn't there, either. If it does what people want it to do, does it well, and does it at a good price, then Apple have designed their chips well for their customer base, in my opinion.

      EDIT: Also going to note that, like any piece of silicon ever, there are going to be things that it's not so good at. That's fine, and quite normal. If you, e.g., need to run Windows, or Windows-specific software, this isn't for you, obviously. Especially as Wine is still not quite there, though I hear CrossOver does well on Apple Silicon. You can also run Vulkan with MoltenVK, and OpenGL on top of Zink on MoltenVK. I know it isn't perfect, and Apple shouldn't have ditched OpenGL, but every platform has its problems.

      Anyway, this is getting more flamey and further away from facts, so I'll exit the conversation here - this isn't contributing anything useful.
      The point is we only "see" with our eyes what is on the surface...

      What we cannot see is what happens in secret or stays hidden.

      Apple could already be paying all these Linux reverse-engineering developers under an NDA to keep them silent about such a deal.

      And as for all the negative stuff Apple does to make a profit and so on: it's the same as the Apple vs USB Type-C case in the EU, and now the EU forces Apple to ship USB Type-C iPhones and laptops...

      Many of Apple's bad habits could be against the law next year, just like this Type-C case.

      Phantom circuit Sequence Reducer Dyslexia



      • #63
        Originally posted by hamishmb View Post
        There are also a lot of reasons why I dislike Apple and macOS and iOS, I hasten to add, but I can see many positives in there too.
        For example?
        Just because the hardware is fixed function doesn't mean the performance gain isn't there, either. If it does what people want it to do, does it well, and does it at a good price, then Apple have designed their chips well for their customer base, in my opinion.
        I don't disagree with you here. What I'm pointing at is that people mistake the fixed-function hardware for Apple's amazing battery life. Go ahead and play a game on it and see how long it lasts on battery. Try compiling the Linux kernel and see how much power it consumes. And before anyone says video games aren't important...
        EDIT: Also going to note that, like any piece of silicon ever, there are going to be things that it's not so good at. That's fine, and quite normal. If you, e.g., need to run Windows, or Windows-specific software, this isn't for you, obviously. Especially as Wine is still not quite there, though I hear CrossOver does well on Apple Silicon. You can also run Vulkan with MoltenVK, and OpenGL on top of Zink on MoltenVK. I know it isn't perfect, and Apple shouldn't have ditched OpenGL, but every platform has its problems.

        Anyway, this is getting more flamey and further away from facts, so I'll exit the conversation here - this isn't contributing anything useful.
        How do I know the Apple apologists here don't run Linux as their daily OS? I visit the Wine subreddit and there are a lot of M1 users complaining. I find them everywhere, for nearly every game that has been released. Don't tell me "this isn't for you" when, as a Linux user, this very much applies to me as well. That's why we have DXVK, VKD3D, Proton, etc.: because we want to do what we want with our hardware. Apple has no reason for not supporting Vulkan and OpenGL. They're just driving everyone nuts and pushing developers to leave them behind. I use Linux Mint daily on my Ryzen 2700X with a Vega 56. What are you guys on, a MacBook with OS X? I'm all for a port of Linux to Apple hardware, but don't tell me how great it is or is going to be when you know damn well that without Apple's help it could take years before you can actually daily-drive Linux on it. I'm not praising Sony for a port of Linux onto the PS4 just because they didn't prevent it. We know Sony are assholes, but that's exactly why Linux on it is great. If Linux fully works on the M-series hardware, that situation is going to be no different. We get Linux working on a Windows laptop because we get help from AMD and Intel. Even Microsoft donates some code to Linux. Apple hasn't.



        • #64
          Originally posted by Dukenukemx View Post
          I don't disagree with you here. What I'm pointing at is that people confuse the fixed function hardware for Apple's amazing battery life.
          Man, I'm on Fedora 36 with a Threadripper 1920X and a Vega 64, and I often watch YouTube.
          That alone means the "fixed-function hardware" saves a ton of energy.
          There's a lot of stuff you do daily that fixed-function hardware can handle,
          and all of it adds up to amazing battery life.

          What's your problem? Just that the Apple M2 is not a general-purpose HPC compute cluster?

          Originally posted by Dukenukemx View Post
          Go ahead and play a game on it and see how long it lasts on battery.
          People do this with the Steam Deck, and they report 3-4 hours of battery life,
          and Apple users report much longer battery life even while gaming.
          Now you'll say it's 7/6 nm vs 5 nm and all the other excuses; fair enough.

          Some people claim it's only 5 hours on the Apple M2 in games, but even that is a big win, because the display is much larger on the Apple M2 and larger displays already need more energy for the backlight.
          At the same display size you could say 6 hours of gaming on the Apple hardware versus 3-4 hours on the AMD hardware.

          Originally posted by Dukenukemx View Post
          Try compiling the Linux kernel and see how much power it consumes.
          Just be honest for once: who buys a laptop and then compiles the Linux kernel on it as daily work?
          Linus Torvalds does it on a Threadripper workstation.

          Most notebook Linux users would SSH/VNC into their server or workstation and do the compile there.

          But hey, you're plainly and simply not being honest; you don't buy an Apple M2 notebook to compile the kernel.


          Originally posted by Dukenukemx View Post
          How do I know the Apple apologists here don't run Linux as their daily OS? I visit the Wine subreddit and there are a lot of M1 users complaining. I find them everywhere, for nearly every game that has been released. Don't tell me "this isn't for you" when, as a Linux user, this very much applies to me as well. That's why we have DXVK, VKD3D, Proton, etc.: because we want to do what we want with our hardware. Apple has no reason for not supporting Vulkan and OpenGL. They're just driving everyone nuts and pushing developers to leave them behind. I use Linux Mint daily on my Ryzen 2700X with a Vega 56. What are you guys on, a MacBook with OS X? I'm all for a port of Linux to Apple hardware, but don't tell me how great it is or is going to be when you know damn well that without Apple's help it could take years before you can actually daily-drive Linux on it. I'm not praising Sony for a port of Linux onto the PS4 just because they didn't prevent it. We know Sony are assholes, but that's exactly why Linux on it is great. If Linux fully works on the M-series hardware, that situation is going to be no different. We get Linux working on a Windows laptop because we get help from AMD and Intel. Even Microsoft donates some code to Linux. Apple hasn't.
          Running Vulkan and OpenGL over Metal with Zink and MoltenVK looks like a sane and good solution. That way the hardware doesn't need native drivers for these APIs.

          Just get the point: DXVK and VKD3D work over MoltenVK over Metal.

          "What are you guys on"

          I have a Gigaset GS290 with /e/OS and a Threadripper 1920X with a Vega 64.

          "without Apple's help"

          Just look at the USB Type-C case in the EU: Apple could be forced to support Linux by politics.
          They could also simply change their mind because of the positive PR/response.

          But one thing is clear: with negative people like you, they may well think it's better not to touch this toxic Linux community.
          Phantom circuit Sequence Reducer Dyslexia



          • #65
            Originally posted by qarium View Post

            Man, I'm on Fedora 36 with a Threadripper 1920X and a Vega 64, and I often watch YouTube.
            That alone means the "fixed-function hardware" saves a ton of energy.
            There's a lot of stuff you do daily that fixed-function hardware can handle,
            and all of it adds up to amazing battery life.

            What's your problem? Just that the Apple M2 is not a general-purpose HPC compute cluster?
            You're missing the point. I'm not against it, I'm just saying don't confuse ARM with ASICs. There are limitations that come inherently with ASICs as well, such as not being able to encode a new or unsupported video format.
            People do this with the Steam Deck, and they report 3-4 hours of battery life...
            Depends on the game, but Breath of the Wild gets at best 3 hours on the Nintendo Switch. The Switch has a 4,310 mAh battery vs 5,313 mAh on the Deck. Not a world of difference, plus you can emulate the Switch on the Deck.
            And Apple users report much longer battery life even while gaming...
            No they haven't. But this again largely depends on the game. For example, this person gets 2 hours and 40 minutes in World of Warcraft at max settings. He was able to get 10 hours, but he also lowered the settings all the way down and limited the frame rate. Though getting 35 fps in WoW is rather pathetic. Here's a gaming laptop also getting 10 hours with lowered settings. Here's another person running Fortnite and only getting 1 hour and 36 minutes.
            Now you'll say it's 7/6 nm vs 5 nm and all the other excuses; fair enough.
            These are valid, yes. AMD is going 5 nm for desktop this year, and next year they'll be on 4 nm for laptops. Intel bought a lot of TSMC 3 nm capacity, just like Apple, so expect Intel to have 3 nm parts next year as well.
            Just be honest for once: who buys a laptop and then compiles the Linux kernel on it as daily work?
            Linus Torvalds does it on a Threadripper workstation.

            Most notebook Linux users would SSH/VNC into their server or workstation and do the compile there.

            But hey, you're plainly and simply not being honest; you don't buy an Apple M2 notebook to compile the kernel.
            What you want to do can be done on a Chromebook. You're also avoiding my points. Use the CPU hard and it will eat the same power as an x86. If the point of owning an 8- or 10-core CPU with a 10+-core GPU is to remote into a workstation, then that's fiscally irresponsible, and also a poor excuse.
            Running Vulkan and OpenGL over Metal with Zink and MoltenVK looks like a sane and good solution. That way the hardware doesn't need native drivers for these APIs.

            Just get the point: DXVK and VKD3D work over MoltenVK over Metal.
            You're making excuses for Apple. They could and should make a native Vulkan driver. Given the choice, nobody would use Metal if Vulkan were available. And, as with Rosetta 2, you're losing performance with MoltenVK.
            Just look at the USB Type-C case in the EU: Apple could be forced to support Linux by politics.
            That was a good thing. I don't care which cable I use to charge my devices, so long as they're all the same cable.
            They could also simply change their mind because of the positive PR/response.
            I'm sensing a lot of potential.
            But one thing is clear: with negative people like you, they may well think it's better not to touch this toxic Linux community.
            So I'm toxic for pointing out how toxic Apple is? The irony. Apple is a bad company for so many reasons, even making Microsoft look like a saint. Not that Microsoft is pro open source or pro consumer rights, but Microsoft wishes they had what Apple has. The evil things Microsoft does are because they want to become Apple. When Microsoft released Windows for ARM they only let users get apps from the app store, just like on iOS. Now Microsoft is preventing Linux installs, just like Apple's T2 chip. Microsoft's Pluton is no different from Apple's T2 security chip, just that Microsoft is many years late to the party.

            Remember that you're defending a company who, again, doesn't use open standards and forces their proprietary standards. You support a company who has not contributed to Linux. You support a company whose employees jump off buildings because they're making your crap. This is also the same company that had a revolt from their employees in India over pay. You know, the same company who dodges taxes by moving money around the world. Defending Apple makes you no saint.



            • #66
              Thanks for sharing the things on power use, those look interesting. It's still significant that battery life is excellent for many everyday uses, however.

              People from any community/group can be toxic, and sadly a lot are from IT communities in general as far as I can tell - it's not unique to Linux/macOS/Windows/BSDs/etc but I agree that it isn't good.

              EDIT: Also, for the record, I mainly use Linux on a 3rd gen i7 (laptop) and a Ryzen-powered desktop. Windows runs in VMs, and I have a few Macs around for macOS.
              Last edited by hamishmb; 20 July 2022, 03:30 AM.



              • #67
                Originally posted by Dukenukemx View Post
                You're missing the point. I'm not against it, I'm just saying don't confuse ARM with ASICs. There are limitations that come inherently with ASICs as well, such as not being able to encode a new or unsupported video format.
                To be honest, you say you're not against it, but you sound like it.

                The problem of new codecs emerging that don't run on old ASICs is not a problem at all, and I can explain why. In the old offline world this would have been a problem, because a DVD or Blu-ray holds one movie in only one video format, and if your general-purpose hardware is too slow and your ASIC doesn't support the format, it simply doesn't play. In the era of Netflix and YouTube this is no longer true. Why? Simple: they detect which codecs your machine supports and then send you the video in the right codec.
                If your hardware only supports H.264, you get the H.264 stream.
                If your hardware supports H.264 and VP9, YouTube chooses to send you VP9 to save bandwidth.
                If your hardware also supports H.265 and you choose 4K resolution, Netflix sends you the right video data in H.265.
                If your hardware also supports AV1, these companies send you the right video data for that too.

                This means you claim there is a problem on the client side because the ASIC doesn't support a newly emerging codec, but all of this is handled server-side at Netflix and YouTube, and the clients always get the video data they need.
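
                For what it's worth, here's a minimal sketch of that client-side capability check using the browser's standard Media Capabilities API; the codec strings, resolution and bitrate below are illustrative placeholders, not YouTube's actual logic:

                ```typescript
                // Ask the browser which of these codecs this machine can decode
                // efficiently (ideally in hardware), then report the best candidate.
                async function pickBestCodec(): Promise<string> {
                  const candidates = [
                    { codec: "av01.0.08M.08", mime: "video/mp4" },  // AV1
                    { codec: "vp09.00.10.08", mime: "video/webm" }, // VP9
                    { codec: "avc1.640028", mime: "video/mp4" },    // H.264 fallback
                  ];

                  for (const c of candidates) {
                    const info = await navigator.mediaCapabilities.decodingInfo({
                      type: "media-source",
                      video: {
                        contentType: `${c.mime}; codecs="${c.codec}"`,
                        width: 1920,
                        height: 1080,
                        bitrate: 4_000_000,
                        framerate: 30,
                      },
                    });
                    // powerEfficient is usually true when a hardware (ASIC) decoder is used.
                    if (info.supported && info.powerEfficient) return c.codec;
                  }
                  return "avc1.640028"; // everything can decode H.264
                }
                ```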

                Originally posted by Dukenukemx View Post
                Depends on the game, but Breath of the Wild gets at best 3 hours on the Nintendo Switch. The Switch has a 4,310 mAh battery vs 5,313 mAh on the Deck. Not a world of difference, plus you can emulate the Switch on the Deck.
                If the games weren't legacy closed-source apps, I'm sure Valve would be better off using M2 hardware from Apple, but the problem is that these are all legacy x86 closed-source apps, so ARM is not an option.

                Originally posted by Dukenukemx View Post
                No they haven't. But this again largely depends on the game. For example, this person gets 2 hours and 40 minutes in World of Warcraft at max settings. He was able to get 10 hours, but he also lowered the settings all the way down and limited the frame rate. Though getting 35 fps in WoW is rather pathetic. Here's a gaming laptop also getting 10 hours with lowered settings. Here's another person running Fortnite and only getting 1 hour and 36 minutes.
                Well, gaming on a mobile device at "max settings" like on a desktop... who in the real world would ever do that?

                Originally posted by Dukenukemx View Post
                These are valid, yes. AMD is going 5 nm for desktop this year, and next year they'll be on 4 nm for laptops. Intel bought a lot of TSMC 3 nm capacity, just like Apple, so expect Intel to have 3 nm parts next year as well.
                Well, that's all true, but it also shows that they are two years behind Apple.

                Originally posted by Dukenukemx View Post
                What you want to do can be done on a Chromebook. You're also avoiding my points. Use the CPU hard and it will eat the same power as an x86. If the point of owning an 8- or 10-core CPU with a 10+-core GPU is to remote into a workstation, then that's fiscally irresponsible, and also a poor excuse.
                Yes, right. That's all correct, but I just showed you how developers do this:
                none of these competent people buys a small, light laptop and then compiles kernels on it on a regular basis.
                Of course you can do it on a Chromebook, but I think if you have the money, the Apple M2 is the better option.

                Originally posted by Dukenukemx View Post
                You're making excuses for Apple. They could and should make a native Vulkan driver. Given the choice, nobody would use Metal if Vulkan were available. And, as with Rosetta 2, you're losing performance with MoltenVK.
                That's not an excuse at all. In my view, Mantle, Vulkan, DX12, Metal and WebGPU are all nearly the same.
                It's not like OpenGL vs DirectX 11, where there was a huge difference.
                All these technologies (Mantle, Vulkan, DX12, Metal, WebGPU) historically descend from AMD's Mantle,
                and translation between them is simple.
                It's not like translating DirectX 11 to OpenGL, which is a hard task because of the big architectural differences.

                There is also a real possibility that WebGPU will replace all these standards in the future,
                because WebGPU is what they all agree on. And the difference between WebGPU and Metal is small; remember, WebGPU comes directly from Apple's work in this area.




                Originally posted by Dukenukemx View Post
                That was a good thing. I don't care which cable I use to charge my devices, so long as they're all the same cable.
                I'm sensing a lot of potential.
                Yes, in rare cases politics does something good.
                Like in the Type-C case, politics could force Apple to make a second operating system possible on Apple iPhones;
                they could force Apple to also support Linux on their hardware.

                Politics, when not occupied by cabal or deep-state members, could do a lot of good.

                Originally posted by Dukenukemx View Post
                So I'm toxic for pointing out how toxic Apple is? The irony. Apple is a bad company for so many reasons, even making Microsoft look like a saint. Not that Microsoft is pro open source or pro consumer rights, but Microsoft wishes they had what Apple has. The evil things Microsoft does are because they want to become Apple. When Microsoft released Windows for ARM they only let users get apps from the app store, just like on iOS. Now Microsoft is preventing Linux installs, just like Apple's T2 chip. Microsoft's Pluton is no different from Apple's T2 security chip, just that Microsoft is many years late to the party.
                Remember that you're defending a company who, again, doesn't use open standards and forces their proprietary standards. You support a company who has not contributed to Linux. You support a company whose employees jump off buildings because they're making your crap. This is also the same company that had a revolt from their employees in India over pay. You know, the same company who dodges taxes by moving money around the world. Defending Apple makes you no saint.
                You're not toxic because you tell the truth; you're toxic in how you handle the topic.
                I think even if you got a phone call from Apple making you technical director of a new all-in-on-Linux group, you would not stop being toxic.

                Dude, the last Apple product I ever touched was a Macintosh Performa 5200 from 1994... so I am not an Apple fanboy.

                And to be honest, I think Microsoft is more evil because of the monopolies they hold, and all the good stuff from Microsoft is not because of Microsoft but because of the antitrust regulations put on them.

                Apple in fact impacts me zero, because I simply don't own Apple products,

                but Microsoft hurts me a lot even though I don't have Windows installed: in every job you come into contact with Microsoft Office, and even if you do professional training in the computer sector, the final exam is taken on Microsoft Windows.

                Yes, right, Apple is evil and Microsoft is even more evil...

                but Apple could change and join the Linux movement at any time.
                Phantom circuit Sequence Reducer Dyslexia



                • #68
                  Originally posted by hamishmb View Post
                  Thanks for sharing the things on power use, those look interesting. It's still significant that battery life is excellent for many everyday uses, however.

                  People from any community/group can be toxic, and sadly a lot are from IT communities in general as far as I can tell - it's not unique to Linux/macOS/Windows/BSDs/etc but I agree that it isn't good.

                  EDIT: Also, for the record, I mainly use Linux on a 3rd gen i7 (laptop) and a Ryzen-powered desktop. Windows runs in VMs, and I have a few Macs around for macOS.
                  It's not telling the truth and pointing to facts that makes it toxic; it's how you handle the topic.

                  For example, acting like people like me are irrational Apple fanboys even though I don't own any Apple product,

                  or his attack on Apple's Metal because it isn't a standard, when in fact Apple is going all in on WebGPU as a standard, and WebGPU will replace standards like DX12 or Vulkan in the long run.

                  Many of these toxic people claim you're not allowed to talk about Apple M2 CPUs because Apple itself is evil, even if you've never bought any Apple product and are simply interested in the technology of the Apple M2 CPU.

                  Rockchip does the same as Apple; if you like this direction, you can buy Rockchip shares or Rockchip hardware. It's not as fast as the Apple M1/M2, but it's not x86, and it also puts a ton of ASICs next to the ARM cores.

                  Phantom circuit Sequence Reducer Dyslexia



                  • #69
                    Originally posted by qarium View Post

                    To be honest, you say you're not against it, but you sound like it.

                    The problem of new codecs emerging that don't run on old ASICs is not a problem at all, and I can explain why. In the old offline world this would have been a problem, because a DVD or Blu-ray holds one movie in only one video format, and if your general-purpose hardware is too slow and your ASIC doesn't support the format, it simply doesn't play. In the era of Netflix and YouTube this is no longer true. Why? Simple: they detect which codecs your machine supports and then send you the video in the right codec.
                    If your hardware only supports H.264, you get the H.264 stream.
                    If your hardware supports H.264 and VP9, YouTube chooses to send you VP9 to save bandwidth.
                    If your hardware also supports H.265 and you choose 4K resolution, Netflix sends you the right video data in H.265.
                    If your hardware also supports AV1, these companies send you the right video data for that too.

                    This means you claim there is a problem on the client side because the ASIC doesn't support a newly emerging codec, but all of this is handled server-side at Netflix and YouTube, and the clients always get the video data they need.
                    That's for streaming. I'm the kind of person who uses what's best for quality and file size. I still convert videos to Xvid because my car's flip-down display only supports that format. If Google decides that you must use VP10 or whatever, then you will use it.
                    If the games weren't legacy closed-source apps, I'm sure Valve would be better off using M2 hardware from Apple, but the problem is that these are all legacy x86 closed-source apps, so ARM is not an option.
                    Excuse me, but isn't software on ARM also legacy closed source unless opened by the developer? Isn't Mac OS X closed source? You aren't making sense, as I've already told you that the M2 drains its battery fast when playing demanding games. As fast as or faster than some x86 machines.
                    Well, gaming on a mobile device at "max settings" like on a desktop... who in the real world would ever do that?
                    Max settings, or any game where you barely get a playable frame rate, will have the same result for battery life. The only laptops that have this luxury are ones with discrete GPUs, and fairly modern ones at that. Also, WoW was getting 35 fps on the M1 regardless of what the person did, and 35 fps is pushing it for playable.
                    Well, that's all true, but it also shows that they are two years behind Apple.
                    Not gonna disagree with you here.
                    That's not an excuse at all. In my view, Mantle, Vulkan, DX12, Metal and WebGPU are all nearly the same.
                    It's not like OpenGL vs DirectX 11, where there was a huge difference.
                    All these technologies (Mantle, Vulkan, DX12, Metal, WebGPU) historically descend from AMD's Mantle,
                    and translation between them is simple.
                    It's not like translating DirectX 11 to OpenGL, which is a hard task because of the big architectural differences.
                    You can't use Vulkan on Metal without a wrapper. It doesn't matter how related they are, because the code you write for Metal can't be used for anything else. The purpose of Metal is to push developers to learn something that can only be used on Apple's platforms, and thus to pressure developers into making stuff exclusive to them. They're willing to bet they have so much market share and influence that it doesn't matter if you want Vulkan. Apple even banned MoltenVK for using a non-public interface.

                    This is not a new tactic; it's been tried by Sony. In the end, open standards win. Even Microsoft ported DX12 to Linux because the writing is on the wall. It's not open source, which makes it useless, but they know you need other platforms to support something for it to stay relevant.

                    There is also a real possibility that WebGPU will replace all these standards in the future,
                    because WebGPU is what they all agree on. And the difference between WebGPU and Metal is small; remember, WebGPU comes directly from Apple's work in this area.
                    WebGPU is a wrapper over other APIs and therefore, like MoltenVK, will lose some performance.
                    You're not toxic because you tell the truth; you're toxic in how you handle the topic.
                    Everyone wants to shoot the messenger.
                    I think even if you got a phone call from Apple making you technical director of a new all-in-on-Linux group, you would not stop being toxic.
                    You mean my opinion would be bought; then yes. As it stands, my opinion is my own.
                    but Apple could change and join the Linux movement at any time.
                    We would not be better off for it. Microsoft kinda did, and we all know there's going to be a version of Windows using the Linux kernel. I'm not going to think about what possibilities there are with Apple or Microsoft. Their past is more than enough for me to pass judgement. You cannot excuse what Apple has done with what Apple could do. They're evil and anti-consumer.



                    • #70
                      Originally posted by Dukenukemx View Post
                      That's for streaming. I'm the kind of person who uses what's best for quality and file size. I still convert videos to Xvid because my car's flip-down display only supports that format. If Google decides that you must use VP10 or whatever, then you will use it.
                      That you convert videos for the old flip-down display in your car is exactly the same thing Google does when it converts every uploaded video into multiple formats on the server side.
                      Google discovered that hard-drive space doesn't cost them much, but I/O bandwidth to consumers hurts them a lot. That's why, when you upload a video, they convert it to H.264 (for legacy hardware), VP9 (older, but compared to H.264 it still saves internet bandwidth) and AV1, and I think they even do other codecs like H.265 and so on. Every customer gets the stream that is best for them. Google's absolute favorite is AV1, because it saves them a lot of disk space: one AV1 file can serve multiple resolution streams at the same time, while H.264 is mostly kept at low resolution because it's very expensive to store it in 0.5K, 1K, 2K and 4K variants. AV1 is much better: one file serves all resolutions to the streaming customer.
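
                      For what it's worth, a rough sketch of that server-side selection, assuming the client advertises which codecs it can hardware-decode (the codec names, URLs and bitrates are hypothetical, not Google's actual implementation):

                      ```typescript
                      // Hypothetical server-side variant selection: the upload has been
                      // transcoded into several codecs; serve the most bandwidth-efficient
                      // one that the client says it can decode.
                      interface Variant { codec: string; url: string; kbps: number; }

                      // Preference order: most bandwidth-efficient first, H.264 as fallback.
                      const PREFERENCE = ["av1", "vp9", "h265", "h264"];

                      function pickVariant(available: Variant[], clientCodecs: string[]): Variant {
                        for (const codec of PREFERENCE) {
                          const match = available.find(
                            v => v.codec === codec && clientCodecs.includes(codec),
                          );
                          if (match) return match;
                        }
                        // H.264 is assumed to always be present as the universal fallback.
                        return available.find(v => v.codec === "h264")!;
                      }

                      // Example: a client that advertises AV1 support gets the smaller AV1 stream.
                      const variants: Variant[] = [
                        { codec: "av1", url: "/v/clip.av1.mp4", kbps: 1800 },
                        { codec: "vp9", url: "/v/clip.vp9.webm", kbps: 2400 },
                        { codec: "h264", url: "/v/clip.h264.mp4", kbps: 3500 },
                      ];
                      console.log(pickVariant(variants, ["av1", "h264"]).url); // "/v/clip.av1.mp4"
                      ```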

                      Originally posted by Dukenukemx View Post
                      Excuse me, but isn't software on ARM also legacy closed source unless opened by the developer? Isn't Mac OS X closed source?
                      In theory ARM is legacy too, and also closed source, but if you do market-share research, the ARM ecosystem tends to be more open source than x86.

                      Originally posted by Dukenukemx View Post
                      You aren't making sense, as I've already told you that the M2 drains its battery fast when playing demanding games. As fast as or faster than some x86 machines.
                      You have a misconception about what the benefit of ARM is compared to x86.
                      It's not battery drain while playing games: it may not save power in games, but their GPU uses tile-based rasterization, which really does save energy. That's not about the CPU, though.
                      There are many more metrics than just power consumption.

                      In my opinion the benefit of ARM over x86 is that the ARM general-purpose CPU cores (I'm not talking about the ASIC parts here, only the general-purpose CPU parts) save around 5% of the transistors.
                      Now you'll ask: 5% of transistors that consume power? And point out there's proof they don't consume power... Right, I'm talking about dead transistors that do nothing but are there for legacy-compatibility reasons.

                      Intel or AMD could do the same: just cut out the MMX and x87 floating-point units, SSE1 and even SSE2, and make an SSE3/AVX-only CPU. It looks like ARM is just faster at dropping legacy to save transistors.

                      Originally posted by Dukenukemx View Post
                      Max settings, or any game where you barely get a playable frame rate, will have the same result for battery life. The only laptops that have this luxury are ones with discrete GPUs, and fairly modern ones at that. Also, WoW was getting 35 fps on the M1 regardless of what the person did, and 35 fps is pushing it for playable.
                      Right... but Apple did not license AMD GPU tech like Intel did; instead they licensed Imagination Technologies' PowerVR tile-based rasterization, because that's the variant that uses the least energy compared to all the others.

                      Just ask yourself what Apple could do better than this: they use the smallest process node and the best IP they can license, and so on.

                      It's similar to S3TC texture VRAM compression: in the past that was the relevant patent for performant, power-saving graphics. Sure, the S3TC patent expired, but the Imagination Technologies PowerVR tile-based rasterization patents still exist, and AMD doesn't hold them, which means they can't use the technique.

                      The only technology Apple could still add here is something like FSR 3 from the Radeon RX 7000 series, i.e. a 4/8-bit machine-learning engine to perform a task similar to Nvidia's DLSS 2.3 with its 4/8-bit machine-learning hardware engine.

                      Originally posted by Dukenukemx View Post
                      You can't use Vulkan on Metal without a wrapper. It doesn't matter how related they are, because the code you write for Metal can't be used for anything else.
                      Yes, right, a wrapper, but it's a simple one compared to (OpenGL vs DirectX 11),
                      and your second point is wrong: you claim you cannot use Metal-style code with an open standard.
                      That's plainly and simply wrong, because Metal-style code is essentially the same as in WebGPU.

                      WebGPU is a zombie-like creature: it has the high-level language style of Metal, it also has Vulkan's SPIR-V (Standard Portable Intermediate Representation) bytecode, and then it compiles into machine code via ACO or LLVM.

                      The official way to develop for WebGPU is: you use the Metal-like high-level language, which is the same in WebGPU, then a first-pass compiler translates it to SPIR-V bytecode, and then that is compiled into a binary.

                      The not-so-official way is: you use any shading language you want, compile it to SPIR-V, and there is an official SPIR-V-to-Metal translation tool that produces easy-to-read high-level code.

                      Originally posted by Dukenukemx View Post
                      The purpose of Metal is to push developers to learn something that can only be used on Apple's platforms, and thus to pressure developers into making stuff exclusive to them.
                      Yes, this sounds right and sane, but it isn't.
                      All your babbling is about Metal and Metal only, without the knowledge that Apple works on WebGPU too.
                      How does WebGPU fit into your theory? Even Google and Microsoft support it.

                      Your theory also doesn't fit the fact that the high-level representation language is part of the WebGPU standard, which is then translated to SPIR-V bytecode and then compiled to a binary format.

                      If WebGPU becomes the de-facto standard of the internet, then combined with WebAssembly you can run this code everywhere. It's even a fact that with WebGPU you no longer need DirectX, Vulkan or Metal, and you don't need OpenGL anymore either.
                      The joke about WebGPU is this: you don't even need CUDA or OpenCL anymore,

                      because WebGPU can do compute too.
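
                      For what it's worth, here's a minimal sketch of a WebGPU compute pass as seen from the browser API; the WGSL shader and buffer sizes are illustrative, and it assumes a browser that exposes navigator.gpu (plus @webgpu/types for TypeScript):

                      ```typescript
                      // Double every element of an array on the GPU via a WebGPU compute shader.
                      async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
                        const adapter = await navigator.gpu.requestAdapter();
                        if (!adapter) throw new Error("WebGPU not available");
                        const device = await adapter.requestDevice();

                        // WGSL compute shader: data[i] = data[i] * 2
                        const shader = device.createShaderModule({
                          code: `
                            @group(0) @binding(0) var<storage, read_write> data: array<f32>;
                            @compute @workgroup_size(64)
                            fn main(@builtin(global_invocation_id) id: vec3<u32>) {
                              if (id.x < arrayLength(&data)) { data[id.x] = data[id.x] * 2.0; }
                            }`,
                        });

                        // Storage buffer holding the data on the GPU.
                        const buffer = device.createBuffer({
                          size: input.byteLength,
                          usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC | GPUBufferUsage.COPY_DST,
                        });
                        device.queue.writeBuffer(buffer, 0, input);

                        const pipeline = device.createComputePipeline({
                          layout: "auto",
                          compute: { module: shader, entryPoint: "main" },
                        });
                        const bindGroup = device.createBindGroup({
                          layout: pipeline.getBindGroupLayout(0),
                          entries: [{ binding: 0, resource: { buffer } }],
                        });

                        // Separate buffer the CPU is allowed to map and read back.
                        const readback = device.createBuffer({
                          size: input.byteLength,
                          usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
                        });

                        const encoder = device.createCommandEncoder();
                        const pass = encoder.beginComputePass();
                        pass.setPipeline(pipeline);
                        pass.setBindGroup(0, bindGroup);
                        pass.dispatchWorkgroups(Math.ceil(input.length / 64));
                        pass.end();
                        encoder.copyBufferToBuffer(buffer, 0, readback, 0, input.byteLength);
                        device.queue.submit([encoder.finish()]);

                        await readback.mapAsync(GPUMapMode.READ);
                        const result = new Float32Array(readback.getMappedRange().slice(0));
                        readback.unmap();
                        return result;
                      }
                      ```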

                      All your toxic anti-Apple babbling about how evil Metal is dissolves into thin air if you see WebGPU as the future standard for graphics APIs and GPU compute.

                      This means Apple is not enforcing a walled garden to "force developers to make stuff exclusive for them";
                      instead they push developers toward the best future we can get, with WebGPU,
                      and they need to push it to defeat CUDA, OpenCL, OpenGL, DirectX and so on.

                      WebGPU will be the one standard to defeat them all.

                      Originally posted by Dukenukemx View Post
                      They're willing to bet they have so much market share and influence that it doesn't matter if you want Vulkan. Apple even banned MoltenVK for using a non-public interface.
                      They did not ban MoltenVK in general; they banned the use of a non-public interface,
                      and to my knowledge MoltenVK was patched to no longer use that non-public interface.


                      Originally posted by Dukenukemx View Post
                      This is not a new tactic; it's been tried by Sony. In the end, open standards win. Even Microsoft ported DX12 to Linux because the writing is on the wall. It's not open source, which makes it useless, but they know you need other platforms to support something for it to stay relevant.

                      WebGPU is a wrapper over other APIs and therefore, like MoltenVK, will lose some performance.
                      I don't think you lose performance, because in WebGPU you translate the Metal-like high-level language to Vulkan's SPIR-V, and from that point on it's exactly the same as Vulkan, because Vulkan does the same.

                      Originally posted by Dukenukemx View Post
                      Everyone wants to shoot the messenger.
                      You mean my opinion would be bought; then yes. As it stands, my opinion is my own.
                      We would not be better off for it. Microsoft kinda did, and we all know there's going to be a version of Windows using the Linux kernel. I'm not going to think about what possibilities there are with Apple or Microsoft. Their past is more than enough for me to pass judgement. You cannot excuse what Apple has done with what Apple could do. They're evil and anti-consumer.
                      All these evil and anti-consumer people have an education problem: they don't know that they don't need to be evil and anti-consumer. They swallowed their own satanic evilness propaganda over time until they think it's true, but it's nothing more than an illusion.

                      The good players like IBM/Red Hat, Valve and AMD make more money than the bad, evil players.

                      Even this: my last Apple product was a Macintosh Performa 5200 from 1994. That alone is proof that Apple burns so many people that they never again buy any Apple product.

                      It would be much better for them to be on the good and right side of history, like Valve.

                      Do you know the Steve Jobs video about Internet Explorer from Microsoft?
                      Microsoft paid Apple money to integrate Internet Explorer into macOS,
                      and the reaction of the public and the audience was shocking.

                      Watch that part of the video...

                      If you are on the evil side they don't cheer you on, they boo you out.

                      Maybe Apple will change more than just the USB Type-C stuff in the near future.
                      Phantom circuit Sequence Reducer Dyslexia

