Steam For Linux In July Shows A 1.23% Marketshare, AMD CPUs Now More Common Than Intel On Linux


  • #51
    Originally posted by Dukenukemx View Post
    It's very obsolete. The M2 has been out for a while and the latest AMD laptop APUs are now in the 6000 series. Also, Notebookcheck is just comparing the official numbers, which doesn't really help. AMD's 6800U makes the M2 look pointless. The problem with benchmarks on the Apple M series is that nobody does them properly. The majority of reviews are done by Apple-only reviewers, who clearly have a bias. The ones that aren't Apple-only, like Linus Tech Tips, don't know how to benchmark, as they'll compare the Apple M's to a desktop Intel with an RTX 3090, components that are not designed for power efficiency. AMD uses 7nm for desktop parts and 6nm for their laptop parts, and you'll see Intel do the same next year with Intel's 7nm for desktop and TSMC's 3nm for laptops.
    This is a good review that actually did power tests on the M2.
    I've been using Linux for over a decade but never 100% left Windows, due to gaming. Earlier this year I completely dumped Windows in favor of Linux, and I can play my games with a bit of trouble, but it can be done. Elden Ring, for example, runs just fine on Linux while a macOS user on the M1/M2 cannot play it at all. Same goes for games like Cyberpunk 2077 and Halo Infinite: you can't get those kinds of games working on Apple M series hardware running macOS. Thanks to Lord Gaben and Valve's investments into Linux with Proton, ACO, DXVK, VKD3D, etc., we can play those games on Linux. Valve isn't putting that kind of effort into macOS and probably never will. We all agree that Windows is shit, and the move to Windows 11 has proven that Microsoft isn't about to start caring about your privacy or rights as a consumer. But Apple's macOS is actually worse, as Apple regularly shits on their customers and they actually like it. If Apple did care then they would have Vulkan, as well as open-source drivers for Linux. You know, like AMD and Intel have been doing for over a decade.
    Your opinion is based on outdated knowledge. In 5 years no one will care anymore whether an app runs on Windows or Linux or macOS or whatever, because nearly all apps will be WebAssembly... even the apps outside of the browser will be WebAssembly. Only high-performance-critical apps will be in Rust and native. Also, in 5 years no one will care anymore about the fight between Vulkan and DX12 and Metal and OpenGL and so on, because nearly everyone will use WebGPU in the browser and also WebGPU outside of the browser.

    Also, porting Vulkan to WebGPU is very easy, and porting Metal apps to WebGPU is also easy... the only one who is not so lucky is Microsoft with DX12...

    "I completely dumped Windows in favor of Linux and I can play my games with a bit of trouble but it can be done. Elden Ring for example runs just fine in Linux while a Mac OSX user on the M1/M2's cannot."

    In the near future this will all work on the M1/M2 too, just not as fast as native apps.
    The translation layers from Windows x86 to Linux on ARM or macOS on ARM are almost done.

    I'll tell you something: 5-7 years ago I told people Valve would do an all-AMD gaming computer, and people did not believe it.

    You will see Valve invest in the near future in WebAssembly and WebGPU, and also in Apple M1/M2 hardware.
    Valve wants to make money, and they cannot afford to lose all these Apple M1/M2 customers.
    Last edited by qarium; 04 August 2022, 03:54 PM.


    • #52
      Originally posted by Rabiator View Post
      Ah yes, I was behind the times concerning Apple.
      I found another review at TechSpot, this time comparing the AMD 6800U (among others) to the M2. Without further research, it seems that performance is similar enough that it should not be the deciding point, except in Adobe Premiere, where the M2 really outshines the AMD.
      If you want to edit video and only edit video, then Apple is the way to go. AMD and Intel should have been doing this long ago with better video encoding and decoding hardware.
      Right now I have two reasons left to keep a Windows partition around. It is Windows 7, at the risk of catching a virus, as opposed to the known spyware Windows 10. Windows 11 is completely out.
      • World of Tanks, but this may change with my upcoming PC upgrade. I literally have the parts here, just need to get around to installing them.
      • Various diagnosis tools, for which I don't know an equal Linux alternative yet (those are rarely used).
      I fix computers, and I generally insist that if you use Windows, you use Windows 10 until support is over, then move on to Windows 11. Yes, Windows sucks, but I don't wanna deal with an outdated Windows that doesn't have security updates and watch the machines get viruses. Linux is something I recommend when the hardware is really old, as Windows 10 drops support for features that you still have in Linux. For example, older Intel GPUs lose OpenGL in Windows while in Linux you have working OpenGL. Also, Intel GPUs from the Intel HD 3000 and up have Vulkan support in Linux while you don't get it in Windows. Vulkan is a big enough deal to warrant the move to Linux on its own. I'm also pretty good at making Linux Mint look like Windows, from the boot animation to the system sounds. If a person coming from Windows is going to use Linux, I want them to not have to struggle with the transition.
      About Apple, their walled-garden approach in general is enough to turn me off. How much they fuck their customers over on top of that is irrelevant to me. Nice image in your post though.
      Technology-wise, I'm not following their software development, but how much performance they can squeeze out of their ARM derivatives is still interesting.
      When Apple moves to ARMv9 is when things get interesting. That's when Apple silicon gets SVE2, the nearest instruction set to Intel's AVX-512. That's great for emulators like RPCS3 and Yuzu, whose developers have explained the huge benefits you can get from AVX-512.
      Last edited by Dukenukemx; 04 August 2022, 04:31 PM.



      • #53
        Originally posted by qarium View Post
        Your opinion is based on outdated knowledge. In 5 years no one will care anymore whether an app runs on Windows or Linux or macOS or whatever, because nearly all apps will be WebAssembly... even the apps outside of the browser will be WebAssembly. Only high-performance-critical apps will be in Rust and native. Also, in 5 years no one will care anymore about the fight between Vulkan and DX12 and Metal and OpenGL and so on, because nearly everyone will use WebGPU in the browser and also WebGPU outside of the browser.

        I'll tell you something: 5-7 years ago I told people Valve would do an all-AMD gaming computer, and people did not believe it.

        You will see Valve invest in the near future in WebAssembly and WebGPU, and also in Apple M1/M2 hardware.
        Valve wants to make money, and they cannot afford to lose all these Apple M1/M2 customers.
        Looking at the WebGPU page at Chrome Developers, its status is still very much pre-alpha. Now, I think Google is capable of both pushing WebGPU into the market and doing a decent job of developing it. But it seems premature to predict a near-total switch of the software world to WebGPU.

        About Valve's all-AMD gaming computer, the surprise was mainly in AMD coming back with Zen / RDNA. Some form of x86-based gaming computer was an obvious enough idea, but 5-7 years ago I'd have bet on some Intel/Nvidia combo.

        Linux as the OS is not much of a surprise. Gabe Newell said years ago that he expected Microsoft to use its app store to push out the competition in Windows software. That might have been the main reason for Valve to build Proton into Steam, and generally try to lessen its dependence on Microsoft.

        In the future, I think Valve will have to keep supporting the trends in (Windows) games development. They are not big enough to pull a Sony or Nintendo and create their own games ecosystem. But when comparing the importance of Linux and Mac for Valve, I think Linux wins. Mac has more market share, but still below 2% of players on Steam. Linux, as the OS of the Steam Deck, is an increasingly vital part of Valve's business.
        Last edited by Rabiator; 04 August 2022, 05:50 PM.



        • #54
          Originally posted by qarium View Post

          Your opinion is based on outdated knowledge. In 5 years no one will care anymore whether an app runs on Windows or Linux or macOS or whatever, because nearly all apps will be WebAssembly... even the apps outside of the browser will be WebAssembly. Only high-performance-critical apps will be in Rust and native. Also, in 5 years no one will care anymore about the fight between Vulkan and DX12 and Metal and OpenGL and so on, because nearly everyone will use WebGPU in the browser and also WebGPU outside of the browser.
          You are not very good at making predictions. In 5 years we'll be doing the same as we've been doing. If we wanted to move to the web browser, we would have done so already. As much as people predicted the cloud was the future, it really isn't. WebGPU will never take off because you're asking developers to yet again learn a new API. Until enough people are using Linux, the gaming world will continue to use DX12. Proton is the bridge that will allow people to move to Linux and give developers a reason to think about a native Linux port using Vulkan.


          Also, porting Vulkan to WebGPU is very easy, and porting Metal apps to WebGPU is also easy... the only one who is not so lucky is Microsoft with DX12...
          Just using Vulkan is even easier. DX12 isn't needed since Windows can already run Vulkan. You know which OS can't?
          In the near future this will all work on the M1/M2 too, just not as fast as native apps.
          The translation layers from Windows x86 to Linux on ARM or macOS on ARM are almost done.
          No they won't. In 5 years we Linux users will still be running Windows games through Proton, because we'll need a lot more market share to get developers to consider porting. Indie developers port like crazy to Linux because they don't care about wasting resources, but AAA developers do. Not to forget the support burden once the port is done.
          I'll tell you something: 5-7 years ago I told people Valve would do an all-AMD gaming computer, and people did not believe it.
          Valve may yet do an Intel machine as well. Remember that 5-7 years ago people wrote off Steam Machines as a failure and expected Valve to go back to Windows, but Valve was hard at work on things like ACO for AMD and Vulkan for Intel. Basically, Valve wanted to cover all their bases because, unlike Apple, they can go to Intel or AMD anytime they want while keeping 100% compatibility. AMD makes sense because of power efficiency and cost.
          You will see Valve invest in the near future in WebAssembly and WebGPU, and also in Apple M1/M2 hardware.
          Valve wants to make money, and they cannot afford to lose all these Apple M1/M2 customers.
          No they won't. Valve spent a lot of money trying to get as much performance as possible out of games running on Linux. What you're assuming is that Valve will go WebAssembly and WebGPU and lose performance because of the Apple M1/M2. Apple shit the bed with ARM when it comes to gaming because there are more Intel Macs in the wild than M1/M2-based Macs, meaning developers have a nightmare supporting what is essentially 10% of the total computer market. Not to forget the lack of Vulkan support.

          You wanna know what'll happen in 5 years with Apple? They'll have admitted that going it alone in making an SoC was a mistake, as by then AMD and Intel will have caught up in power efficiency. Not only that, but CPU performance will be outclassed by AMD and Intel as well. GPU performance will continue to be a joke, to the point that Apple will probably be using AMD GPUs instead of their own in higher-end models. It'll get so bad that they may ask Nvidia to make ARM-based SoCs, because the cost of making their huge monolithic SoCs is probably killing their net profits. Eventually Apple's venture into ARM will be a failure, as nobody is going to want to deal with ARM when x86 does a better job and is more open. So either Apple improves their x86 compatibility performance even further, or goes straight x86 and goes back to Intel.

          The reason I can say these things is that it has happened in the past with PowerPC. x86 is old and extremely dated compared to PowerPC and ARM, but it doesn't matter when AMD and Intel keep putting in R&D and engineering the crap out of it. They can afford to do it because, again, they have 90% of the computer market to profit from, while also having the server market. Also, the M2 256GB SSD models cut costs by using only one chip for the SSD, which cuts performance in half. That's an odd thing to do when your products are already much more expensive compared to your x86 competitors. The truth is Apple's silicon is expensive for Apple to make, because the chips are big, which means defects are more likely to occur, and they're using 5nm, which nobody else uses yet. Not to forget that Apple has to pay engineers to make a product that both AMD and Intel have spent decades engineering. Eventually the cost of making their own SoCs to try and compete with AMD, Intel, and Nvidia is going to be too high. Even Qualcomm has a much larger market share for their SoCs, even though compared to Apple's they're not as good. And Qualcomm has gone to AMD for their future products to include AMD's Radeon graphics. That shouldn't be a shock to anyone, considering Qualcomm's Adreno graphics was bought from ATI and the word Adreno is just Radeon with the letters rearranged.



          • #55
            Originally posted by Rabiator View Post
            Looking at the WebGPU page at Chrome Developers, its status is still very much pre-alpha. Now, I think Google is capable of both pushing WebGPU into the market and doing a decent job of developing it. But it seems premature to predict a near-total switch of the software world to WebGPU.
            Who cares that it is pre-alpha? It is already clear today that WebGPU will suck up every piece of GPU graphics and compute software from all directions.

            Just watch what kinds of companies are all in on WebGPU... it's not only Google...

            Originally posted by Rabiator View Post
            About Valve's all-AMD gaming computer, the surprise was mainly in AMD coming back with Zen / RDNA. Some form of x86-based gaming computer was an obvious enough idea, but 5-7 years ago I'd have bet on some Intel/Nvidia combo.
            For me it was always clear that Intel and Nvidia are companies on the evil side, which means they were never an option.

            Originally posted by Rabiator View Post
            Linux as the OS is not much of a surprise. Gabe Newell said years ago that he expected Microsoft to use its app store to push out the competition in Windows software. That might have been the main reason for Valve to build Proton into Steam, and generally try to lessen its dependence on Microsoft.

            In the future, I think Valve will have to keep supporting the trends in (Windows) games development. They are not big enough to pull a Sony or Nintendo and create their own games ecosystem. But when comparing the importance of Linux and Mac for Valve, I think Linux wins. Mac has more market share, but still below 2% of players on Steam. Linux, as the OS of the Steam Deck, is an increasingly vital part of Valve's business.
            "the trends in (Windows) games development"

            I think it will never again happen that any trend is on Microsoft's side.

            In the future we will see all game engines go all in on WebAssembly+WebGPU, even outside of the browser.


            • #56
              The M1 will win in many performance/power metrics; it's a highly optimized ARM core on a modern 5nm node plus a very wide memory bus.
              Most ARM cores are more energy efficient than the large x86 cores on most chips, as this was a design goal of the ARM architecture but not of x86. Although they are pretty comparable nowadays, x86 still has to support a lot of legacy crap, which needs chip space plus some power.

              The 5nm TSMC node gives a huge advantage to the M1 chip as well.
              Just wait till we see Zen 4 with quad-channel DDR5 plus a large on-die cache on TSMC's N5P later this year.



              • #57
                Originally posted by Dukenukemx View Post
                You are not very good at making predictions.
                You can check the forum: I wrote that the Nvidia driver was going open source a long, long time before Nvidia did it.
                I even predicted that, instead of a tiny firmware like AMD's, Nvidia would go the big-fat-firmware route, and I predicted that many years before it happened.

                I predicted Valve would go with all-AMD hardware for CPU and GPU a long time before Valve did it.

                Originally posted by Dukenukemx View Post
                In 5 years we'll be doing the same as we've been doing. If we wanted to move to the web browser, we would have done so already.
                Oh please, stop the bullshit talk: you can do WebAssembly+WebGPU outside of the web browser. Your argument "If we wanted to move to the web browser, we would have done so already" is no argument at all, because I did not say we would do it all in the web browser.

                They just use the web browser to establish the standard; after that you can do it outside of the browser too.

                Originally posted by Dukenukemx View Post
                WebGPU will never take off because you're asking developers to yet again learn a new API. Until enough people are using Linux, the gaming world will continue to use DX12. Proton is the bridge that will allow people to move to Linux and give developers a reason to think about a native Linux port using Vulkan.
                You clearly have no clue at all.

                If you are a Vulkan developer and you want to switch to WebGPU, the switch is super easy because it is the same at the bytecode level. Stop lying and accept that it is the same.

                If you are a Metal developer and you want to switch to WebGPU, you just use your high-level Metal language and compile it to WebGPU bytecode. Very easy.

                The only people who need to do more than this are the DX12 developers, because DX12 uses its own high-level shading language, and at the low level there are differences down to the point where the bytecode gets compiled to the GPU's assembly... at that point DX12 is the same as Vulkan and Metal.
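                To make the comparison concrete, here is a minimal sketch of a WebGPU compute dispatch in TypeScript, assuming a browser build that already exposes navigator.gpu (still gated behind a flag in most browsers as of this thread, and typed via the @webgpu/types package). The doubling shader is an invented example; note that WebGPU consumes WGSL text here, so a Vulkan (SPIR-V) or Metal (MSL) port translates its shaders into WGSL, e.g. with tools along the lines of naga or Tint.

                async function doubleOnGpu(): Promise<void> {
                  // Adapter/device setup: the WebGPU analogue of picking a
                  // VkPhysicalDevice and creating a VkDevice in Vulkan.
                  const adapter = await navigator.gpu.requestAdapter();
                  if (!adapter) throw new Error("WebGPU not available");
                  const device = await adapter.requestDevice();

                  // The shader is WGSL text, not SPIR-V bytecode; this is
                  // the part a Vulkan or Metal port has to translate.
                  const module = device.createShaderModule({
                    code: `
                      @group(0) @binding(0) var<storage, read_write> data: array<f32>;
                      @compute @workgroup_size(64)
                      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
                        data[id.x] = data[id.x] * 2.0;
                      }`,
                  });

                  const pipeline = device.createComputePipeline({
                    layout: "auto",
                    compute: { module, entryPoint: "main" },
                  });
                  // Buffer creation, bind groups, and the dispatch itself are
                  // omitted; they mirror Vulkan's descriptor-set and
                  // command-encoder model closely, which is why Vulkan ports
                  // are comparatively mechanical.
                  void pipeline;
                }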

                Originally posted by Dukenukemx View Post
                Just using Vulkan is even easier. DX12 isn't needed since Windows can already run Vulkan. You know which OS can't?
                Really, no one cares. Just using Vulkan WILL NOT HAPPEN, no matter how often you say it.


                Originally posted by Dukenukemx View Post
                No they won't. In 5 years we Linux users will still be running Windows games through Proton, because we'll need a lot more market share to get developers to consider porting. Indie developers port like crazy to Linux because they don't care about wasting resources, but AAA developers do. Not to forget the support burden once the port is done.
                You think Linux needs market share? WRONG. You think Vulkan needs market share? WRONG.

                WebAssembly+WebGPU already has all the market share we need:

                Microsoft Edge supports it.
                Mozilla Firefox supports it.
                Google Chrome supports it.
                Linux supports it outside of the browser.
                Windows supports it outside of the browser.
                The Sony PlayStation 5 supports it.
                The Xbox supports it.

                Future projects will plainly and simply be ported to WebAssembly+WebGPU, and then the game runs on every platform.
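                In practice, "runs on every platform" reduces to a runtime capability check like the sketch below, in TypeScript using only standard browser globals; the three result labels are invented for the example, and in 2022 the WebGPU half of the check still fails outside flagged builds.

                function detectGpuStack(): "wasm+webgpu" | "wasm-only" | "unsupported" {
                  // WebAssembly has shipped in all major engines since ~2017.
                  const hasWasm =
                    typeof WebAssembly === "object" &&
                    typeof WebAssembly.instantiate === "function";
                  // navigator.gpu is the WebGPU entry point; it is simply
                  // absent in browsers that have not enabled the API.
                  const hasWebGpu =
                    typeof navigator !== "undefined" && "gpu" in navigator;
                  if (hasWasm && hasWebGpu) return "wasm+webgpu";
                  if (hasWasm) return "wasm-only";
                  return "unsupported";
                }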

                Originally posted by Dukenukemx View Post
                Valve may yet do an Intel machine as well. Remember that 5-7 years ago people wrote off Steam Machines as a failure and expected Valve to go back to Windows, but Valve was hard at work on things like ACO for AMD and Vulkan for Intel. Basically, Valve wanted to cover all their bases because, unlike Apple, they can go to Intel or AMD anytime they want while keeping 100% compatibility. AMD makes sense because of power efficiency and cost.
                Well, it is easy to say that Valve will also go with Intel hardware. Maybe they will also support Intel hardware, but I am very sure Valve will not sell Intel hardware, and there are multiple reasons for this. Intel Arc GPUs are the loser:
                a 150mm² Intel Arc GPU die loses against a 100mm² AMD GPU die.
                On the CPU side Intel does better on desktop, but only because of massive power consumption.
                AMD is king if you want x86_64, because of compatibility, and also if you want performance per watt for mobile devices like the Steam Deck.

                There are other reasons why Valve will not sell Intel hardware. Valve also watched the connection between Apple and Intel... and Valve can learn one thing from this: before Valve will touch any Intel hardware, they will go to Apple or Rockchip or Ampere or Samsung and go with a custom ARM SoC.

                Rockchip is good, it is Linux-focused; also, Samsung could be a better option for gaming than Apple because they have an AMD Radeon RDNA2+ license for the GPU part.

                Believe me, no sane person at Valve will ever touch Intel... Apple did it, and it was no good deal.

                Originally posted by Dukenukemx View Post
                No they won't. Valve spent a lot of money trying to get as much performance as possible out of games running on Linux. What you're assuming is that Valve will go WebAssembly and WebGPU and lose performance because of the Apple M1/M2. Apple shit the bed with ARM when it comes to gaming because there are more Intel Macs in the wild than M1/M2-based Macs, meaning developers have a nightmare supporting what is essentially 10% of the total computer market. Not to forget the lack of Vulkan support.
                Valve will do it, that's clear. Just to give you a hint: all the smart TVs are Rockchip ARM SoCs with Linux running WebAssembly+WebGPU...
                ARM CPUs are a much bigger market than just Apple M1/M2 notebooks.
                And the software stack to dynamically translate x86 to ARM is almost ready on all platforms.

                "What you're assuming is that Valve will go WebAssembly and WebGPU and lose performance"

                You clearly have no clue... there are cases where WebAssembly is faster than "native".
                The reason for this is that closed-source native code becomes old, and I mean very old: there is 20-30 year old x86 code around running on Windows 11 on modern hardware.
                In such cases even Java can be faster than native, and so can WebAssembly, because it allows the operating system side to recompile for newer hardware; because of this the code can become faster than the old native code.

                WebAssembly+WebGPU is not designed to lose performance... compared to old native closed-source code it will be faster.
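                The claim rests on load-time compilation, sketched below in TypeScript; "module.wasm" is a placeholder name, and the point is only that the engine compiles the portable bytecode for whatever CPU it lands on at instantiation time, instead of shipping machine code frozen years ago.

                async function loadModule(): Promise<WebAssembly.Exports> {
                  // instantiateStreaming compiles the portable .wasm bytecode
                  // for the host CPU as it downloads; a newer CPU gets newer
                  // codegen for free, unlike a frozen native binary.
                  const { instance } = await WebAssembly.instantiateStreaming(
                    fetch("module.wasm"), // placeholder URL
                    {} // import object; empty for this self-contained sketch
                  );
                  return instance.exports;
                }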

                Originally posted by Dukenukemx View Post
                You wanna know what'll happen in 5 years with Apple? They'll have admitted that going it alone in making an SoC was a mistake, as by then AMD and Intel will have caught up in power efficiency. Not only that, but CPU performance will be outclassed by AMD and Intel as well.
                In 5 years Apple will be on a 2nm IBM "3-layer 3D" gate-all-around, high-NA (0.55 NA) node,
                which will let Apple ship 60-billion-transistor SoCs with stacked 3D cache on the mother die.
                The 2nm node will push the clock speeds of these chips to 4.5 GHz from today's 3.2 GHz.
                Also, in 5 years Apple will have entered the Linux server, Linux workstation, and Linux embedded markets with their M1/M2 SoCs.

                Originally posted by Dukenukemx View Post
                GPU performance will continue to be a joke, to the point that Apple will probably be using AMD GPUs instead of their own in higher-end models. It'll get so bad that they may ask Nvidia to make ARM-based SoCs, because the cost of making their huge monolithic SoCs is probably killing their net profits. Eventually Apple's venture into ARM will be a failure, as nobody is going to want to deal with ARM when x86 does a better job and is more open. So either Apple improves their x86 compatibility performance even further, or goes straight x86 and goes back to Intel.
                On the GPU side, Samsung already has an AMD RDNA2 license for ARM SoCs,
                so I really would not wonder if Apple gets an AMD RDNA3+ license for their high-end models.
                But on the other side, for low-end and middle-ground, what Apple already has is fine, so maybe they only need the AMD license for the high-end models.

                And be sure an Apple ARM CPU on a 2nm node with 60 billion transistors at 4.5 GHz will run x86 binaries fast.

                Originally posted by Dukenukemx View Post
                The reason I can say these things is because it has happened in the past with PowerPC. X86 is old and extremely dated compared to PowerPC and ARM but it doesn't matter when AMD and Intel keep putting in R&D and engineer the crap out of it. They can afford to do it because again they have 90% of the computer market to profit from, while also having the server market. Also the M2 256GB SSD models have cut costs by using only one chip for an SSD, which cuts performance in half. Which is an odd thing to do when your products are already much more expensive compared to your x86 competitors. The truth is Apple's silicon is expensive for Apple to make, and this is because the chips are big which means defects are more likely to occur, and they're using 5nm which nobody else uses yet.
                apple is already on 4nm TSMC node. and has already ordered TSMC 3nm node.
                and TSMC and apple have already a technology contract for IBM 2nm node...
                Apple plans to launch a series of Macs with M2 chips based on TSMC's 4nm process later this year, according to Taiwanese publication DigiTimes....

                Apple is preparing its faster M2 Pro and M2 Max for a variety of products, and here is everything you need to know about the two chipsets

                IBM (NYSE: IBM) today unveiled a breakthrough in semiconductor design and process with the development of the world's first chip announced with 2 nanometer (nm) nanosheet technology....

                The new chips would offer a much higher performance while using less power. It could also help ease the chip shortage.


                This means your dream of "x86-64 + engineer the crap out of it" will not happen.

                It looks more like this will all become a horror show for you and people like you.

                Right now you can buy 10nm Intel CPUs, and soon you can buy 7nm Intel CPUs; sure, Intel will manufacture mobile chips on TSMC's 5nm, and they already produce Arc on TSMC's 6nm.

                But soon they will need to compete with 4nm and 3nm chips, and in less than 5 years, 2nm chips.

                This shit will explode in Intel's face: 60 billion transistors at 4.5 GHz on a single SoC chip...

                Originally posted by Dukenukemx View Post
                Not to forget that Apple has to pay engineers to make a product that both AMD and Intel have spent decades engineering. Eventually the cost of making their own SoCs to try and compete with AMD, Intel, and Nvidia is going to be too high. Even Qualcomm has a much larger market share for their SoCs, even though compared to Apple's they're not as good. And Qualcomm has gone to AMD for their future products to include AMD's Radeon graphics. That shouldn't be a shock to anyone, considering Qualcomm's Adreno graphics was bought from ATI and the word Adreno is just Radeon with the letters rearranged.
                I really don't get why Apple wouldn't just license an AMD GPU for their high-end parts.

                Or do you think Apple would not get the license? Even Samsung got an RDNA2+ Radeon license.
                And it is clear that Apple does not need an AMD license for their low-end and middle-ground hardware;
                they only need it for the high end.

                I think Apple will avoid Intel and Nvidia at all costs, but they would be fine licensing RDNA3+ for their high-end parts.


                • #58
                  Quoting several posts that I'm collecting into one reply...
                  Originally posted by qarium View Post
                  For me it was always clear that Intel and Nvidia are companies on the evil side, which means they were never an option.

                  "the trends in (Windows) games development"
                  I think it will never again happen that any trend is on Microsoft's side.

                  In the future we will see all game engines go all in on WebAssembly+WebGPU, even outside of the browser.
                  The world does not always do what we want. My comment was about what I guess might happen, not about what I'd like to see.
                  5-7 years ago, AMD was not in shape to displace Intel and Nvidia technology-wise.

                  I agree that the trends are moving away from Microsoft, but it is a slow process. I'd be surprised if they lose the top position in (desktop) market share within the next five years, meaning software developers would lose lots of income by ignoring Windows. In the meantime, at least Wine and Proton improve steadily, so we can enjoy a lot of software made for Windows on Linux too.

                  Originally posted by qarium View Post
                  You clearly have no clue... there are cases where WebAssembly is faster than "native".
                  The reason for this is that closed-source native code becomes old, and I mean very old: there is 20-30 year old x86 code around running on Windows 11 on modern hardware.
                  In such cases even Java can be faster than native, and so can WebAssembly, because it allows the operating system side to recompile for newer hardware; because of this the code can become faster than the old native code.
                  [...]
                  Or do you think Apple would not get the license? Even Samsung got an RDNA2+ Radeon license.
                  And it is clear that Apple does not need an AMD license for their low-end and middle-ground hardware;
                  they only need it for the high end.

                  I think Apple will avoid Intel and Nvidia at all costs, but they would be fine licensing RDNA3+ for their high-end parts.
                  That ancient closed-source native code is usually that old because the company in question does not want to rewrite it. Even if it would be cheaper long-term, management frequently does not see it that way. I worked at places that are very conservative about this, sometimes with good reason. If you have to re-do a bunch of certifications in safety-critical fields, a rewrite becomes even more expensive.
                  Edit: And sometimes the old code is so messed up that it is next to unmaintainable. Then the developers are happy as well if they can just leave it alone.

                  About Apple switching graphics architectures, that would be a non-trivial effort at the hardware level. Now Apple is relatively open to such moves, but even then I doubt they would do it unless they fall behind. Maybe with RDNA3+, but it would have to actually beat what Apple has in the pipeline at that point.
                  Last edited by Rabiator; 05 August 2022, 06:19 AM.



                  • #59
                    Originally posted by qarium View Post

                    You can check the forum: I wrote that the Nvidia driver was going open source a long, long time before Nvidia did it.
                    I even predicted that, instead of a tiny firmware like AMD's, Nvidia would go the big-fat-firmware route, and I predicted that many years before it happened.

                    I predicted Valve would go with all-AMD hardware for CPU and GPU a long time before Valve did it.
                    Nvidia going open source should not be a shock to anyone, including the hackers that held Nvidia hostage to do that very thing.

                    You clearly have no clue at all.

                    If you are a Vulkan developer and you want to switch to WebGPU, the switch is super easy because it is the same at the bytecode level. Stop lying and accept that it is the same.

                    If you are a Metal developer and you want to switch to WebGPU, you just use your high-level Metal language and compile it to WebGPU bytecode. Very easy.
                    Or we could have Apple adopt Vulkan, and nobody has to deal with WebGPU ever. Less work, less overhead.
                    Really, no one cares. Just using Vulkan WILL NOT HAPPEN, no matter how often you say it.
                    Too late, everyone already uses Vulkan. The only one who doesn't is Apple, and their customers are the ones who lose in the end.

                    You think Linux needs market share? WRONG. You think Vulkan needs market share? WRONG.

                    WebAssembly+WebGPU already has all the market share we need:

                    Microsoft Edge supports it.
                    Mozilla Firefox supports it.
                    Google Chrome supports it.
                    Linux supports it outside of the browser.
                    Windows supports it outside of the browser.
                    The Sony PlayStation 5 supports it.
                    The Xbox supports it.

                    Future projects will plainly and simply be ported to WebAssembly+WebGPU, and then the game runs on every platform.
                    How many games do you see using it? Forget games, what about applications? How is a game that's nearly 100GB going to work in a web browser? Also, isn't WebAssembly 32-bit? Fallout 3 has problems with a 4GB limit, let alone modern games not built in 2008. It also needs a virtual machine? None of this sounds fast for gaming. I know they're working on 64-bit but it's clearly not ready at all.
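                    The 32-bit point checks out: wasm32 linear memory is counted in 64 KiB pages and tops out at 65,536 pages, i.e. 4 GiB, until the memory64 proposal ships. A small TypeScript sketch of that ceiling:

                    // wasm32 memory is measured in 64 KiB pages; 65536 pages
                    // is the 4 GiB ceiling of a 32-bit address space (the
                    // memory64 proposal lifts it, but it is not ready yet).
                    const memory = new WebAssembly.Memory({
                      initial: 256,   // 256 pages = 16 MiB up front
                      maximum: 65536, // 65536 pages = 4 GiB, the wasm32 cap
                    });
                    console.log(memory.buffer.byteLength / 65536, "pages");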
                    Well, it is easy to say that Valve will also go with Intel hardware. Maybe they will also support Intel hardware, but I am very sure Valve will not sell Intel hardware, and there are multiple reasons for this. Intel Arc GPUs are the loser:
                    a 150mm² Intel Arc GPU die loses against a 100mm² AMD GPU die.
                    On the CPU side Intel does better on desktop, but only because of massive power consumption.
                    AMD is king if you want x86_64, because of compatibility, and also if you want performance per watt for mobile devices like the Steam Deck.
                    Valve isn't Apple; they don't have to stick with AMD or Intel. Maybe next year Intel Arc is a beast and Intel is selling them for a song. The point is Valve can switch between AMD and Intel and can do so without issue. That doesn't mean Valve will or won't; it just means the option is there.
                    There are other reasons why Valve will not sell Intel hardware. Valve also watched the connection between Apple and Intel... and Valve can learn one thing from this: before Valve will touch any Intel hardware, they will go to Apple or Rockchip or Ampere or Samsung and go with a custom ARM SoC.
                    Why would Valve hurt themselves by going ARM? It's bad enough getting games working on Linux with x86, but now you think Valve will add ARM and the need to emulate x86 on it?
                    Rockchip is good, it is Linux-focused; also, Samsung could be a better option for gaming than Apple because they have an AMD Radeon RDNA2+ license for the GPU part.
                    Rockchip is usually found in cheap Chinese no-name-brand devices using Mali graphics. Not my top choice for gaming.
                    Believe me, no sane person at Valve will ever touch Intel... Apple did it, and it was no good deal.
                    Apple left Intel for a number of reasons. The top one was manufacturing, as Intel wasn't doing anything to improve it. Second was the fact that Intel was falling behind AMD. Third is that Apple can make their own CPUs and save a bunch of money doing so, as Apple doesn't like to depend on other manufacturers to make their stuff. They already make their own chips for their mobile devices, so it's natural that Apple would do it for their laptops.
                    Valve will do it, that's clear. Just to give you a hint: all the smart TVs are Rockchip ARM SoCs with Linux running WebAssembly+WebGPU...
                    ARM CPUs are a much bigger market than just Apple M1/M2 notebooks.
                    And the software stack to dynamically translate x86 to ARM is almost ready on all platforms.
                    That's a lot of wishful thinking. There are a lot of ARM devices, but most of them don't have enough storage to install Cyberpunk 2077, let alone run it. Maybe indie games, but games like Among Us are most likely already ported to them. Maybe if Apple discovered the SD card slot and allowed for expandable storage.
                    "What you're assuming is that Valve will go Webassembly and WebGPU to lose performance"

                    you clearly have no clue... there are cases where WebAssembly is faster than "native"
                    the reason for this is closed source native code become old and i mean very old there are 20-30 years old x86 code around run on windows 11 in modern hardware.
                    is such cases even java can be faster than native and also WebAssembly because it allows to recompile for newer hardware on the operating system side because of this the code can become faster than the old native code.

                    Webassembly+WebGPU is not designed to lose performance ... compared to old native closed source it will be faster.
                    You're making a senario where you assume all code is old and therefore WebAssembly+WebGPU will be better automatically. Minecraft is built on Java but eventually they made a Bedrock version for Windows that runs C++, and they did it for a reason because it runs faster than Java. You can get better performance on Java Edition with mods like Sodium but that's just the nature of code.

                    In 5 years Apple will be on a 2nm IBM "3-layer 3D" gate-all-around, high-NA (0.55 NA) node,
                    which will let Apple ship 60-billion-transistor SoCs with stacked 3D cache on the mother die.
                    The 2nm node will push the clock speeds of these chips to 4.5 GHz from today's 3.2 GHz.
                    Also, in 5 years Apple will have entered the Linux server, Linux workstation, and Linux embedded markets with their M1/M2 SoCs.
                    I don't know about 5 years, but I do know that by next year Intel is going to give Apple a huge problem. Remember that Intel bought 3nm capacity from TSMC aggressively, probably to deny Apple as much 3nm as possible, as Intel paid the highest price for it. Intel is no longer limiting themselves to their own manufacturing. AMD will be on 5nm by the end of this year, and on 4nm by next year for mobile parts. Apple's manufacturing advantage is going to be gone.
                    Last edited by Dukenukemx; 05 August 2022, 10:11 AM.



                    • #60
                      Originally posted by Dukenukemx View Post
                      Why would Valve hurt themselves by going ARM? It's bad enough getting games working on Linux with x86, but now you think Valve will add ARM and the need to emulate x86 on it?
                      There are 2 reasons why Valve will do it. First, the ARM market share (smart TVs, smartphones, car computers, and other stuff) is already too big to ignore. Remember, I am not talking about desktop or notebook market share; you can ignore ARM on desktop and notebook, but not if you just count chip sales, no matter what the end product is...

                      The second reason is: a future mobile Steam Deck could get better battery life if they go with ARM instead of x86...
                      Oh, and yes, I know you disagree on this point. But who cares.

                      Originally posted by Dukenukemx View Post
                      Rockchip is usually found in cheap Chinese no-name-brand devices using Mali graphics. Not my top choice for gaming.
                      If you are from a rich country it does not sound good, but worldwide it is different for many poor countries.

                      I would never have thought to think any good of any ARM device, but the newest Rockchip SoCs are really, really good and even have features the Apple M1/M2 does not have, like AV1 encode and decode.

                      Right now Rockchip is not as fast as the Apple M1/M2, but it is still an impressive development and much faster than the Raspberry Pi 4.

                      Originally posted by Dukenukemx View Post
                      Apple left Intel for a number of reasons. The top one was manufacturing, as Intel wasn't doing anything to improve it. Second was the fact that Intel was falling behind AMD. Third is that Apple can make their own CPUs and save a bunch of money doing so, as Apple doesn't like to depend on other manufacturers to make their stuff. They already make their own chips for their mobile devices, so it's natural that Apple would do it for their laptops.
                      And why should Valve not follow Apple?

                      Originally posted by Dukenukemx View Post
                      That's a lot of wishful thinking. There are a lot of ARM devices, but most of them don't have enough storage to install Cyberpunk 2077, let alone run it. Maybe indie games, but games like Among Us are most likely already ported to them. Maybe if Apple discovered the SD card slot and allowed for expandable storage.
                      Most of the games on Steam are not like Cyberpunk 2077 with 100GB and more; instead they are small games like Valheim.

                      Originally posted by Dukenukemx View Post
                      You're constructing a scenario where you assume all code is old and therefore WebAssembly+WebGPU will automatically be better. Minecraft is built on Java, but eventually they made a Bedrock version for Windows in C++, and they did it for a reason: it runs faster than Java. You can get better performance in Java Edition with mods like Sodium, but that's just the nature of code.
                      Well, you claimed WebAssembly+WebGPU is slow, but there are many cases where it is faster, and that's a fact.

                      Originally posted by Dukenukemx View Post
                      I don't know about 5 years, but I do know that by next year Intel is going to give Apple a huge problem. Remember that Intel bought 3nm capacity from TSMC aggressively, probably to deny Apple as much 3nm as possible, as Intel paid the highest price for it. Intel is no longer limiting themselves to their own manufacturing. AMD will be on 5nm by the end of this year, and on 4nm by next year for mobile parts. Apple's manufacturing advantage is going to be gone.
                      You just don't get it: Intel sells 10nm CPUs and Apple already produces 4nm chips.

                      By the time Intel is on 3nm, Apple will be on a 2nm or even 1nm node...
