Steam For Linux In July Shows A 1.23% Marketshare, AMD CPUs Now More Common Than Intel On Linux


  • qarium
    replied
    Originally posted by Dukenukemx View Post
    Most of those devices aren't powerful enough to run AAA games. Old games or Indie games, maybe. Even then, nobody is going to let Valve install Steam and lose a cut of the profits from their own store. These are walled gardens for a reason, and that reason is to profit from the sale of apps through their store.
    Valve makes a lot of money on non-AAA games... it does not need AAA games to make money.
    You also have this walled-garden stuff wrong. Why? Simply because Valve has its own walled garden with the Steam Deck too.
    That makes it a this-for-that deal: Valve lets them have a store on the Steam Deck, and in return they let Valve put the Steam store on their devices. Not a big deal.

    Originally posted by Dukenukemx View Post
    I won't disagree but that really depends on who's making the ARM chip. Nvidia, sure because Nvidia knows how to make a GPU. Rockchip with Mali graphics is a no. Apple without Vulkan is a no. Not to forget you still need to emulate x86 because the overwhelming vast majority of games are running x86.
    That's all more flexible than you think... for example, Rockchip could license RDNA3+. And just like with the ACO compiler for AMD hardware, Valve could plainly and simply write a Vulkan driver for Apple hardware.
    It also looks like emulating x86 is no problem... yes, you lose performance, but if great market success brings new games compiled for ARM, you could still win on battery life.

    Originally posted by Dukenukemx View Post
    You can still buy cheap x86 machines to play games. A lot of people do this already. If you're from a really poor country then you're gaming on your smartphone. This is why the mobile market is so big: most people in the world are poor, and everyone does have a smartphone. iOS and Android are not going to pave the way for ARM gaming. Like I said, Linux on x86 still has lots of problems.
    But technically speaking, those are all ARM CPU gamers... on their smartphones ;-)


    Originally posted by Dukenukemx View Post
    Apple dumped Nvidia due to the issues from the early 2010's where the Nvidia chips failed and left Apple as well as many other manufacturers to deal with those problems. Those manufacturers came back to Nvidia, but not Apple. It does nobody any good to limit themselves to one manufacturer as price gouging will occur. Intel has a history of being assholes.
    I do not see any reason to do business with assholes like Intel or Nvidia...

    Originally posted by Dukenukemx View Post
    If a developer of an Indie game isn't porting, then it's due to difficulty. Steam's advantage is that they make it easy to bring your game. Why do you think Valve backed Vulkan?
    Show me a game doing this.
    I do not know of any game doing it yet, because it is still a new, upcoming technology.

    Originally posted by Dukenukemx View Post
    I'm telling you what's happening, not what may happen. Intel bought the shit out of 3nm and by next year will have chips running 3nm. Apple's M series isn't on 4nm. The Apple M2 should be on 3nm but TSMC ran into problems and it was delayed. AMD will release Zen4 with RDNA3 on 5nm soon, while next year their mobile parts will be on 4nm.
    The Apple M2 is not on 4nm or 3nm, right, but the M2 Max will be on 4nm.

    Intel was only able to buy any TSMC 3nm node capacity because Apple instead bought into the IBM-based 2nm...

    You think Intel will be able to be on a better node than Apple, but I think this is wrong. By the time Intel is on 3nm, Apple will be on 2nm...



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    There are 2 reasons why Valve will do it: first, the ARM market share (smart TVs, smartphones, car computers and other stuff) is already too big to ignore. Remember, I am not talking about desktop or notebook market share. You can ignore ARM on desktop and notebook, but just count chip sales, no matter what the end product is...
    Most of those devices aren't powerful enough to run AAA games. Old games or Indie games, maybe. Even then, nobody is going to let Valve install Steam and lose a cut of the profits from their own store. These are walled gardens for a reason, and that reason is to profit from the sale of apps through their store.
    The second reason is: a future mobile Steam Deck could get better battery life if they go with ARM instead of x86...
    Oh, and yes, I know you disagree on this point. But who cares.
    I won't disagree but that really depends on who's making the ARM chip. Nvidia, sure because Nvidia knows how to make a GPU. Rockchip with Mali graphics is a no. Apple without Vulkan is a no. Not to forget you still need to emulate x86 because the overwhelming vast majority of games are running x86.
    If you are from a rich country it does not sound good, but worldwide it is different for many poor countries.

    I never thought I would think well of any ARM device, but the newest Rockchip SoCs are really, really good and even have features the Apple M1/M2 does not have, like AV1 encode and decode...

    Right now Rockchip is not as fast as the Apple M1/M2, but it is still an impressive development and much faster than the Raspberry Pi 4...
    You can still buy cheap x86 machines to play games. A lot of people do this already. If you're from a really poor country then you're gaming on your smartphone. This is why the mobile market is so big: most people in the world are poor, and everyone does have a smartphone. iOS and Android are not going to pave the way for ARM gaming. Like I said, Linux on x86 still has lots of problems.


    And why should Valve not follow Apple?
    Apple dumped Nvidia due to the issues from the early 2010's where the Nvidia chips failed and left Apple as well as many other manufacturers to deal with those problems. Those manufacturers came back to Nvidia, but not Apple. It does nobody any good to limit themselves to one manufacturer as price gouging will occur. Intel has a history of being assholes.


    Most of the games on Steam are not like Cyberpunk 2077 with 100GB and more; instead they are small games like Valheim...
    If a developer of an Indie game isn't porting, then it's due to difficulty. Steam's advantage is that they make it easy to bring your game. Why do you think Valve backed Vulkan?
    Well, you claimed WebAssembly+WebGPU is slow, but there are many cases where it is faster, and that's a fact.
    Show me a game doing this.
    You just don't get it: Intel sells 10nm CPUs and Apple already produces 4nm chips.

    The moment Intel is on 3nm, Apple will be on a 2nm or even 1nm node...
    I'm telling you what's happening, not what may happen. Intel bought the shit out of 3nm and by next year will have chips running 3nm. Apple's M series isn't on 4nm. The Apple M2 should be on 3nm but TSMC ran into problems and it was delayed. AMD will release Zen4 with RDNA3 on 5nm soon, while next year their mobile parts will be on 4nm.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    Why would Valve hurt themselves by going ARM? It's bad enough getting games working on linux with x86 but now you think Valve will add ARM and the need to emulate x86 on it?
    There are 2 reasons why Valve will do it: first, the ARM market share (smart TVs, smartphones, car computers and other stuff) is already too big to ignore. Remember, I am not talking about desktop or notebook market share. You can ignore ARM on desktop and notebook, but just count chip sales, no matter what the end product is...

    The second reason is: a future mobile Steam Deck could get better battery life if they go with ARM instead of x86...
    Oh, and yes, I know you disagree on this point. But who cares.

    Originally posted by Dukenukemx View Post
    Rockchip is usually found on cheap Chinese no name brand devices using Mali graphics. Not my top choice for gaming.
    If you are from a rich country it does not sound good, but worldwide it is different for many poor countries.

    I never thought I would think well of any ARM device, but the newest Rockchip SoCs are really, really good and even have features the Apple M1/M2 does not have, like AV1 encode and decode...

    Right now Rockchip is not as fast as the Apple M1/M2, but it is still an impressive development and much faster than the Raspberry Pi 4...

    Originally posted by Dukenukemx View Post
    Apple left Intel for a number of reasons. Top one was manufacturing as Intel wasn't doing anything to improve it. Second was the fact that Intel was falling behind AMD. Third is that Apple can make their own CPU's and save a bunch of money doing so, as Apple doesn't like to depend on other manufacturers to make their stuff. They already make them for their mobile devices, so it's natural that Apple would do it for their laptops.
    And why should Valve not follow Apple?

    Originally posted by Dukenukemx View Post
    That's a lot of wishful thinking. There's a lot of ARM devices but most of them don't have enough storage to install Cyberpunk 2077, let alone run it. Maybe Indie games, but most likely games like Among Us are already ported to them. Maybe if Apple discovered the SD card slot and allowed for expandable storage.
    Most of the games on Steam are not like Cyberpunk 2077 with 100GB and more; instead they are small games like Valheim...

    Originally posted by Dukenukemx View Post
    You're making a scenario where you assume all code is old and therefore WebAssembly+WebGPU will be better automatically. Minecraft is built on Java, but eventually they made a Bedrock version for Windows that runs C++, and they did it for a reason: because it runs faster than Java. You can get better performance on Java Edition with mods like Sodium, but that's just the nature of code.
    Well, you claimed WebAssembly+WebGPU is slow, but there are many cases where it is faster, and that's a fact.

    Originally posted by Dukenukemx View Post
    I don't know about 5 years but I do know that by next year Intel is going to give Apple a huge problem. Remember that Intel bought 3nm from TSMC aggressively, probably to deny Apple from as much 3nm as possible, as Intel paid the highest price for it. Intel is no longer limiting themselves to their own manufacturing. AMD will be 5nm by the end of this year, and will be 4nm by next year for mobile parts. Apple's manufacturing advantage is going to be gone.
    You just don't get it: Intel sells 10nm CPUs and Apple already produces 4nm chips.

    The moment Intel is on 3nm, Apple will be on a 2nm or even 1nm node...



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    You can check the forum: I wrote about the Nvidia driver going open source a long, long time before Nvidia did it.
    I even predicted that, instead of a tiny firmware like AMD's, Nvidia would go the big-fat-firmware route, and I predicted that many years before it happened.

    I also predicted Valve would go with all-AMD hardware for CPU and GPU a long time before Valve did it.
    Nvidia going open source should not be a shock to anyone. Including the hackers that held Nvidia hostage to do that very thing.

    You clearly have no clue at all.

    If you are a Vulkan developer and you want to switch to WebGPU, the switch is super easy because it is the same at the bytecode level. Stop lying and accept that it is the same.

    If you are a Metal developer and you want to switch to WebGPU, you just use your high-level Metal shading language and compile it to WebGPU bytecode. Very easy.
    Or we could have Apple adopt Vulkan and nobody has to deal with WebGPU ever. Less work, less overhead.
    Really, no one cares. Just using Vulkan WILL NOT HAPPEN, no matter how often you say it.
    Too late, everyone already does use Vulkan. The only one who doesn't is Apple and their customers are the ones who lose in the end.

    You think Linux needs market share: WRONG. You think Vulkan needs market share: WRONG.

    WebAssembly+WebGPU already has all the market share we need.

    Microsoft Edge supports it.
    Mozilla Firefox supports it.
    Google Chrome supports it.
    Linux supports it outside of the browser.
    Windows supports it outside of the browser.
    The Sony PlayStation 5 supports it.
    The Xbox supports it.

    Future projects will plainly and simply be ported to WebAssembly+WebGPU, and then the game runs on every platform.
    How many games do you see using it? Forget games, what about applications? How is a game that's nearly 100GB going to work in a web browser? Also, isn't WebAssembly 32-bit? Fallout 3 has problems with a 4GB limit, let alone modern games not built in 2008. It also needs a virtual machine? None of these sound fast for gaming. I know they're working on 64-bit but it's clearly not ready at all.
    Well, it is easy to say that Valve will also go with Intel hardware. Maybe they will also support Intel hardware, but I am very sure Valve will not sell Intel hardware, and there are multiple reasons for this. Intel Arc GPUs are the "loser":
    Intel Arc GPUs are the worst; a 150mm² Arc GPU die loses against a 100mm² AMD GPU die.
    On the CPU side Intel does better on the desktop, but only because of massive electrical power consumption.
    AMD is king if you want x86_64, because of compatibility and also because you want performance per watt for mobile devices like the Steam Deck.
    Valve isn't Apple in that they have to stick with AMD or Intel. Maybe next year Intel ARC is a beast and Intel is selling them for a song. The point is Valve can switch between AMD and Intel and can do so without issue. That doesn't mean Valve will or won't, it just means the option is there.
    There are other reasons why Valve will not sell Intel hardware. Valve also watched the connection between Apple and Intel... and Valve can learn one thing from this: before Valve touches any Intel hardware, they will go to Apple or Rockchip or Ampere or Samsung and go with a custom ARM SoC chip.
    Why would Valve hurt themselves by going ARM? It's bad enough getting games working on linux with x86 but now you think Valve will add ARM and the need to emulate x86 on it?
    Rockchip is good; it is Linux-focused. Also, Samsung could be a better option for gaming than Apple because they have an AMD Radeon RDNA2+ license for the GPU part.
    Rockchip is usually found on cheap Chinese no name brand devices using Mali graphics. Not my top choice for gaming.
    Believe me, no sane person at Valve will ever touch Intel... Apple did it, and it was not a good deal.
    Apple left Intel for a number of reasons. Top one was manufacturing as Intel wasn't doing anything to improve it. Second was the fact that Intel was falling behind AMD. Third is that Apple can make their own CPU's and save a bunch of money doing so, as Apple doesn't like to depend on other manufacturers to make their stuff. They already make them for their mobile devices, so it's natural that Apple would do it for their laptops.
    Valve will do it, that's clear. Just to give you a hint: all the smart TVs are Rockchip ARM SoCs with Linux running WebAssembly+WebGPU...
    ARM CPUs are a much bigger market than just Apple M1/M2 notebooks.
    And the software stack to dynamically translate x86 to ARM is almost ready on all platforms.
    That's a lot of wishful thinking. There's a lot of ARM devices but most of them don't have enough storage to install Cyberpunk 2077, let alone run it. Maybe Indie games, but most likely games like Among Us are already ported to them. Maybe if Apple discovered the SD card slot and allowed for expandable storage.
    "What you're assuming is that Valve will go Webassembly and WebGPU to lose performance"

    You clearly have no clue... there are cases where WebAssembly is faster than "native".
    The reason for this is that closed-source native code becomes old, and I mean very old; there is 20-30-year-old x86 code around running on Windows 11 on modern hardware.
    In such cases even Java can be faster than native, and so can WebAssembly, because it allows recompiling for newer hardware on the operating-system side; because of this, the code can become faster than the old native code.

    WebAssembly+WebGPU is not designed to lose performance... compared to old native closed-source code it will be faster.
    You're making a scenario where you assume all code is old and therefore WebAssembly+WebGPU will be better automatically. Minecraft is built on Java, but eventually they made a Bedrock version for Windows that runs C++, and they did it for a reason: because it runs faster than Java. You can get better performance on Java Edition with mods like Sodium, but that's just the nature of code.

    In 5 years Apple will be on a 2nm IBM "3-layer 3D", "gate-all-around", High-NA (0.55 NA) node,
    which will make Apple able to ship 60-billion-transistor SoC chips with stacked 3D cache on the mother die.
    The 2nm node will push the clock speeds of these chips to 4.5 GHz, from 3.2 GHz now.
    Also, in 5 years Apple will have entered the Linux server, Linux workstation and Linux embedded markets with their M1/M2 SoC chips.
    I don't know about 5 years but I do know that by next year Intel is going to give Apple a huge problem. Remember that Intel bought 3nm from TSMC aggressively, probably to deny Apple from as much 3nm as possible, as Intel paid the highest price for it. Intel is no longer limiting themselves to their own manufacturing. AMD will be 5nm by the end of this year, and will be 4nm by next year for mobile parts. Apple's manufacturing advantage is going to be gone.
    Last edited by Dukenukemx; 05 August 2022, 10:11 AM.



  • Rabiator
    replied
    Quoting several posts that I'm collecting into one reply...
    Originally posted by qarium View Post
    For me it was always clear that Intel and Nvidia are companies on the evil side, which means they were never an option.

    "the trends in (Windows) games development"
    i think this will never happen again that any trend is on microsoft's side.

    in the future we will see all game engines go all in on Webassembly+WebGPU even outside of the browser.
    The world does not always do what we want. My comment was about what I guess might happen, not about what I'd like to see.
    5-7 years ago AMD was not in shape to displace Intel and Nvidia technology-wise.

    I agree that the trends are moving away from Microsoft, but it is a slow process. I'd be surprised if they lose the top position in (desktop) market share within the next five years, meaning software developers would lose a lot of income by ignoring Windows. In the meantime, at least Wine and Proton improve steadily, so we can enjoy a lot of software made for Windows on Linux too.

    Originally posted by qarium View Post
    You clearly have no clue... there are cases where WebAssembly is faster than "native".
    The reason for this is that closed-source native code becomes old, and I mean very old; there is 20-30-year-old x86 code around running on Windows 11 on modern hardware.
    In such cases even Java can be faster than native, and so can WebAssembly, because it allows recompiling for newer hardware on the operating-system side; because of this, the code can become faster than the old native code.
    [...]
    Or do you think Apple will not get the license? Even Samsung got an RDNA2+ Radeon license.
    And it is clear that Apple does not need an AMD license for their low-end and mid-range hardware.
    They only need it for the high end.

    I think Apple will avoid Intel and Nvidia at all costs, but they would be fine licensing RDNA3+ for their high-end parts.
    That ancient closed-source native code is usually that old because the company in question does not want to rewrite it. Even if it would be cheaper long-term, management frequently does not see it that way. I have worked at places that are very conservative in this regard. Sometimes with good reason: if you have to re-do a bunch of certifications in safety-critical fields, a rewrite becomes even more expensive.
    Edit: And sometimes the old code is so messed up that it is next to unmaintainable. Then the developers are happy as well if they can just leave it alone.

    About Apple switching graphics architectures, that would be a non-trivial effort at the hardware level. Now Apple is relatively open to such moves, but even then I doubt they would do it unless they fall behind. Maybe with RDNA3+, but it would have to actually beat what Apple has in the pipeline at that point.
    Last edited by Rabiator; 05 August 2022, 06:19 AM.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    You are not very good at making predictions.
    You can check the forum: I wrote about the Nvidia driver going open source a long, long time before Nvidia did it.
    I even predicted that, instead of a tiny firmware like AMD's, Nvidia would go the big-fat-firmware route, and I predicted that many years before it happened.

    I also predicted Valve would go with all-AMD hardware for CPU and GPU a long time before Valve did it.

    Originally posted by Dukenukemx View Post
    In 5 years we'll be doing the same as we've been doing. If we wanted to move to the web browser, we would have done so already.
    Oh please, stop the bullshit talk: you can do WebAssembly+WebGPU outside of the web browser.
    Your argument, "If we wanted to move to the web browser we would have done so already", is no argument at all,
    because I did not say we would do it because we now want to do it all in the web browser.

    They just use the web browser to establish the standard; after that you can do it outside of the browser too.

    Originally posted by Dukenukemx View Post
    WebGPU will never take off because you're asking developers to yet again learn a new API. Until enough people are using Linux, the gaming world will continue to use DX12. Proton is that bridge that will allow people to move onto Linux and give developers a reason to think about a native Linux port using Vulkan.
    You clearly have no clue at all.

    If you are a Vulkan developer and you want to switch to WebGPU, the switch is super easy because it is the same at the bytecode level. Stop lying and accept that it is the same.

    If you are a Metal developer and you want to switch to WebGPU, you just use your high-level Metal shading language and compile it to WebGPU bytecode. Very easy.

    The only people who need to do more than this are the DX12 developers, because DX12 uses its own high-level shading language and there are also differences at the low level, up to the point where the bytecode gets compiled to the GPU's assembly... at that point DX12 is the same as Vulkan and Metal (see the sketch below).
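
    To make the comparison a bit more concrete, here is a minimal sketch of what WebGPU looks like from the JavaScript/TypeScript side: a tiny WGSL compute shader plus the pipeline setup. This is only an illustration of the API shape, not anything from Valve or from this thread; the shader, the buffer size and the names (doubleIt, runDoubleIt) are invented for the example, and it assumes a runtime with WebGPU enabled and the WebGPU type definitions available.

    // Minimal WebGPU compute sketch (illustrative; `doubleIt` and the buffer
    // layout are made up for this example).
    const wgsl = `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;

      @compute @workgroup_size(64)
      fn doubleIt(@builtin(global_invocation_id) id: vec3<u32>) {
        data[id.x] = data[id.x] * 2.0;   // double every element in place
      }
    `;

    async function runDoubleIt(): Promise<void> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU is not available in this runtime");
      const device = await adapter.requestDevice();

      // Storage buffer holding 64 floats, visible to the shader above.
      const buffer = device.createBuffer({
        size: 64 * 4,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
      });

      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: {
          module: device.createShaderModule({ code: wgsl }),
          entryPoint: "doubleIt",
        },
      });

      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer } }],
      });

      // Record and submit one compute pass: 1 workgroup x 64 invocations.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(1);
      pass.end();
      device.queue.submit([encoder.finish()]);
    }

    A Vulkan developer will recognize the shape immediately (shader module, pipeline, bind group, command encoder, queue submit), which is the sense in which the switch is comparatively small; strictly speaking the WebGPU spec consumes WGSL source rather than SPIR-V bytecode, but the two map onto each other closely and converters exist in both directions.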

    Originally posted by Dukenukemx View Post
    Just using Vulkan is even easier. DX12 isn't needed since Windows can already run Vulkan. You know an OS that can't?
    Really, no one cares. Just using Vulkan WILL NOT HAPPEN, no matter how often you say it.


    Originally posted by Dukenukemx View Post
    No they won't. In 5 years we Linux users will still be running Windows games through Proton, because we'll need a lot more market share to get developers to consider porting. Indie developers port like crazy to Linux because they don't care about wasting resources, but AAA developers do. Not to forget the support once the port is done.
    You think Linux needs market share: WRONG. You think Vulkan needs market share: WRONG.

    WebAssembly+WebGPU already has all the market share we need.

    Microsoft Edge supports it.
    Mozilla Firefox supports it.
    Google Chrome supports it.
    Linux supports it outside of the browser.
    Windows supports it outside of the browser.
    The Sony PlayStation 5 supports it.
    The Xbox supports it.

    Future projects will plainly and simply be ported to WebAssembly+WebGPU, and then the game runs on every platform (see the sketch below).
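
    As a rough illustration of the "port once, run everywhere" idea, here is a minimal sketch of loading a WebAssembly module from JavaScript/TypeScript. The file name game.wasm, the imported log function and the exported update function are hypothetical names made up for the example, not taken from any real game.

    // Illustrative only: `game.wasm`, `log` and `update` are invented names.
    // The same .wasm bytes can be loaded by any runtime that implements the
    // standard WebAssembly JS API (browsers, Node.js, Deno, ...).
    async function loadGame(): Promise<void> {
      const imports = {
        env: {
          // Host function the module is assumed to import for logging.
          log: (value: number) => console.log("wasm says:", value),
        },
      };

      // Streams, compiles and instantiates the module in one step.
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("game.wasm"),
        imports,
      );

      // Call a hypothetical exported per-frame function.
      const update = instance.exports.update as (dt: number) => void;
      update(1 / 60);
    }

    Outside the browser the loading step goes through that runtime's own WebAssembly API, but the compiled module itself stays the same; whether enough engines actually ship WebAssembly+WebGPU backends is exactly what this thread is arguing about.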

    Originally posted by Dukenukemx View Post
    Valve may yet still do an Intel as well. Remember that 5-7 years ago people wrote off Steam machines as a failure and expected Valve to go back to Windows, but Valve was hard at work getting things like ACO for AMD and Vulkan for Intel. Basically Valve wanted to cover all their bases because unlike Apple, they can go to Intel and AMD anytime they want while keeping 100% compatibility. AMD makes sense because of power efficiency and cost.
    Well, it is easy to say that Valve will also go with Intel hardware. Maybe they will also support Intel hardware, but I am very sure Valve will not sell Intel hardware, and there are multiple reasons for this. Intel Arc GPUs are the "loser":
    Intel Arc GPUs are the worst; a 150mm² Arc GPU die loses against a 100mm² AMD GPU die.
    On the CPU side Intel does better on the desktop, but only because of massive electrical power consumption.
    AMD is king if you want x86_64, because of compatibility and also because you want performance per watt for mobile devices like the Steam Deck.

    There are other reasons why Valve will not sell Intel hardware. Valve also watched the connection between Apple and Intel... and Valve can learn one thing from this: before Valve touches any Intel hardware, they will go to Apple or Rockchip or Ampere or Samsung and go with a custom ARM SoC chip.

    Rockchip is good; it is Linux-focused. Also, Samsung could be a better option for gaming than Apple because they have an AMD Radeon RDNA2+ license for the GPU part.

    Believe me, no sane person at Valve will ever touch Intel... Apple did it, and it was not a good deal.

    Originally posted by Dukenukemx View Post
    No they won't. Valve spent a lot of money trying to get as much performance out of games running in Linux. What you're assuming is that Valve will go Webassembly and WebGPU to lose performance because of Apple M1/M2. Apple shit the bed with ARM when it comes to gaming because there's more Intel Macs in the wild than M1/M2 based Macs. Meaning developers have a nightmare in dealing with support for what is essentially 10% of the total computer market. Not to forget no Vulkan support.
    Valve will do it, that's clear. Just to give you a hint: all the smart TVs are Rockchip ARM SoCs with Linux running WebAssembly+WebGPU...
    ARM CPUs are a much bigger market than just Apple M1/M2 notebooks.
    And the software stack to dynamically translate x86 to ARM is almost ready on all platforms.

    "What you're assuming is that Valve will go Webassembly and WebGPU to lose performance"

    You clearly have no clue... there are cases where WebAssembly is faster than "native".
    The reason for this is that closed-source native code becomes old, and I mean very old; there is 20-30-year-old x86 code around running on Windows 11 on modern hardware.
    In such cases even Java can be faster than native, and so can WebAssembly, because it allows recompiling for newer hardware on the operating-system side; because of this, the code can become faster than the old native code.

    WebAssembly+WebGPU is not designed to lose performance... compared to old native closed-source code it will be faster.

    Originally posted by Dukenukemx View Post
    You wanna know what'll happen in 5 years with Apple? They would have admitted going alone in making a SoC was a mistake as by then AMD and Intel would have caught up in power efficiency. Not only that but CPU performance would be outclassed by AMD and Intel as well.
    In 5 years Apple will be on a 2nm IBM "3-layer 3D", "gate-all-around", High-NA (0.55 NA) node,
    which will make Apple able to ship 60-billion-transistor SoC chips with stacked 3D cache on the mother die.
    The 2nm node will push the clock speeds of these chips to 4.5 GHz, from 3.2 GHz now.
    Also, in 5 years Apple will have entered the Linux server, Linux workstation and Linux embedded markets with their M1/M2 SoC chips.

    Originally posted by Dukenukemx View Post
    GPU performance would continue to be a joke to the point that Apple would probably be using AMD GPU's instead of their own in higher-end models. It'll get so bad that they may ask Nvidia to make ARM-based SoC's because the cost of making their huge monolithic SoC's is probably killing their net profits. Eventually Apple's venture into ARM will be a failure, as nobody is going to want to deal with ARM when x86 does a better job and is more open. So either Apple upgrades their x86 compatibility performance even further, or goes straight x86 and goes back to Intel.
    On the GPU side, Samsung already has an AMD RDNA2 license for ARM SoCs,
    so I really would not be surprised if Apple got an AMD RDNA3+ license for their high-end models.
    But on the other side, for low-end and mid-range, what Apple already has is fine, so they may only need the AMD license for the high-end models.

    And be sure, an Apple ARM CPU on a 2nm node with 60 billion transistors at 4.5 GHz will run x86 binaries fast.

    Originally posted by Dukenukemx View Post
    The reason I can say these things is because it has happened in the past with PowerPC. X86 is old and extremely dated compared to PowerPC and ARM but it doesn't matter when AMD and Intel keep putting in R&D and engineer the crap out of it. They can afford to do it because again they have 90% of the computer market to profit from, while also having the server market. Also the M2 256GB SSD models have cut costs by using only one chip for an SSD, which cuts performance in half. Which is an odd thing to do when your products are already much more expensive compared to your x86 competitors. The truth is Apple's silicon is expensive for Apple to make, and this is because the chips are big which means defects are more likely to occur, and they're using 5nm which nobody else uses yet.
    Apple is already on a 4nm TSMC node and has already ordered the TSMC 3nm node.
    And TSMC and Apple already have a technology contract for the IBM 2nm node...
    https://www.macrumors.com/2022/03/10...c-4nm-process/
    https://wccftech.com/roundup/apple-m...u-should-know/
    https://newsroom.ibm.com/2021-05-06-...Semiconductors
    https://www.zmescience.com/science/n...tion-02082022/

    This means your dreams of "x86-64 + engineer the crap out of it" will not happen.

    It looks more like this will all become a horror show for you and people like you.

    Right now you can buy 10nm Intel CPUs, soon you can buy 7nm Intel CPUs, and sure, Intel will manufacture mobile chips on TSMC 5nm; they already produce Arc on TSMC 6nm.

    But soon they will need to compete with 4nm and 3nm chips, and in less than 5 years with 2nm chips.

    This shit will explode in Intel's face: 60 billion transistors at 4.5 GHz on a single SoC chip...

    Originally posted by Dukenukemx View Post
    Not to forget that Apple has to pay engineers to make a product that both AMD and Intel have spent decades engineering. Eventually the cost of making their own SoC's to try and compete with AMD, Intel, and Nvidia is going to be too high. Even Qualcomm has a much larger market share for their SoC's even though compared to Apple's it's not as good. Qualcomm has gone to AMD for their future products to include AMD's Radeon graphics. Shouldn't be a shock to anyone considering Qualcomm's Adreno graphics was bought from ATI and the word Adreno is just Radeon with rearranged letters.
    I really don't get it: why wouldn't Apple just license an AMD GPU for their high-end parts?

    Or do you think Apple will not get the license? Even Samsung got an RDNA2+ Radeon license.
    And it is clear that Apple does not need an AMD license for their low-end and mid-range hardware.
    They only need it for the high end.

    I think Apple will avoid Intel and Nvidia at all costs, but they would be fine licensing RDNA3+ for their high-end parts.



  • Spacefish
    replied
    The M1 will win in many performance/power metrics; it's a highly optimized ARM core on a modern 5nm node plus a very wide memory bus.
    Most ARM cores are more energy efficient than the large x86 cores on most chips, as this was a design goal of the ARM architecture, unlike x86. Although they are pretty comparable nowadays, x86 still has to support a lot of legacy crap, which costs chip space and some power.

    The 5nm TSMC node gives a huge advantage to the M1 chip as well.
    Just wait till we see Zen 4 with quad-channel DDR5 and a large on-die cache on TSMC's N5P later this year.



  • qarium
    replied
    Originally posted by Rabiator View Post
    Looking at the WebGPU page at ChromeDevelopers, their status is still very much pre-alpha. Now I think Google is capable of both pushing WebGPU into the market and doing a decent job in developing it. But it seems premature to predict a near-total switch of the software world to WebGPU.
    Who cares that it is pre-alpha? It is already clear today that WebGPU will pull GPU graphics and compute software in from all directions.

    Just watch which companies are all-in on WebGPU... it's not only Google...

    Originally posted by Rabiator View Post
    About Valve's all AMD gaming computer, the surprise was mainly in AMD coming back with Zen / RDNA. Some form of x86 based gaming computer was an obvious enough idea, but 5-7 years ago I'd have bet on some Intel/Nvidia combo.
    For me it was always clear that Intel and Nvidia are companies on the evil side, which means they were never an option.

    Originally posted by Rabiator View Post
    Linux as OS is not much of a surprise. Gabe Newell said years ago that he expected Microsoft to use its app store to push out the competition in Windows software. That might have been the main reason for Valve to build Proton into Steam, and generally try to lessen its dependence on Microsoft.

    In the future, I think Valve will have to keep supporting the trends in (Windows) games development. They are not big enough to pull a Sony or Nintendo and create their own games ecosystem. But when comparing the importance of Linux and Mac for Valve, I think Linux wins. Mac has more market share, but still below 2% of players on Steam. Linux as OS of the Steam Deck is an increasingly vital part of Valve's business.
    "the trends in (Windows) games development"

    i think this will never happen again that any trend is on microsoft's side.

    in the future we will see all game engines go all in on Webassembly+WebGPU even outside of the browser.



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    Your opinion is based on outdated knowledge. In 5 years no one will care anymore whether an app runs on Windows or Linux or macOS or whatever, because nearly all apps will be WebAssembly... even the apps outside of the browser will be WebAssembly. Only highly performance-critical apps will be in Rust and native code... Also, in 5 years no one will care anymore about the fight between Vulkan, DX12, Metal, OpenGL and so on, because nearly everyone will use WebGPU in the browser and also WebGPU outside of the browser.
    You are not very good at making predictions. In 5 years we'll be doing the same as we've been doing. If we wanted to move to the web browser, we would have done so already. As much as people predicted the cloud was the future, it really isn't. WebGPU will never take off because you're asking developers to yet again learn a new API. Until enough people are using Linux, the gaming world will continue to use DX12. Proton is that bridge that will allow people to move onto Linux and give developers a reason to think about a native Linux port using Vulkan.


    Also, porting Vulkan to WebGPU is very easy, and porting Metal apps to WebGPU is also easy... the only one who is not so lucky is Microsoft with DX12...
    Just using Vulkan is even easier. DX12 isn't needed since Windows can already run Vulkan. You know an OS that can't?
    In the near future this will all work on the M1/M2 too, but not as fast as native apps.
    The translation layers from Windows x86 to Linux on ARM or macOS on ARM are almost done.
    No they won't. In 5 years we Linux users will still be running Windows games through Proton, because we'll need a lot more market share to get developers to consider porting. Indie developers port like crazy to Linux because they don't care about wasting resources, but AAA developers do. Not to forget the support once the port is done.
    I'll tell you something: when I told people 5-7 years ago that Valve would do an all-AMD gaming computer, people did not believe it.
    Valve may yet still do an Intel as well. Remember that 5-7 years ago people wrote off Steam machines as a failure and expected Valve to go back to Windows, but Valve was hard at work getting things like ACO for AMD and Vulkan for Intel. Basically Valve wanted to cover all their bases because unlike Apple, they can go to Intel and AMD anytime they want while keeping 100% compatibility. AMD makes sense because of power efficiency and cost.
    You will see Valve invest in the near future in WebAssembly and WebGPU and also in Apple M1/M2 hardware.
    Valve wants to make money, and they cannot afford to lose all those Apple M1/M2 customers.
    No they won't. Valve spent a lot of money trying to get as much performance out of games running in Linux. What you're assuming is that Valve will go Webassembly and WebGPU to lose performance because of Apple M1/M2. Apple shit the bed with ARM when it comes to gaming because there's more Intel Macs in the wild than M1/M2 based Macs. Meaning developers have a nightmare in dealing with support for what is essentially 10% of the total computer market. Not to forget no Vulkan support.

    You wanna know what'll happen in 5 years with Apple? They would have admitted going alone in making a SoC was a mistake as by then AMD and Intel would have caught up in power efficiency. Not only that but CPU performance would be outclassed by AMD and Intel as well. GPU performance would continue to be a joke to the point that Apple would probably be using AMD GPU's instead of their own in higher-end models. It'll get so bad that they may ask Nvidia to make ARM-based SoC's because the cost of making their huge monolithic SoC's is probably killing their net profits. Eventually Apple's venture into ARM will be a failure, as nobody is going to want to deal with ARM when x86 does a better job and is more open. So either Apple upgrades their x86 compatibility performance even further, or goes straight x86 and goes back to Intel.

    The reason I can say these things is because it has happened in the past with PowerPC. X86 is old and extremely dated compared to PowerPC and ARM, but it doesn't matter when AMD and Intel keep putting in R&D and engineer the crap out of it. They can afford to do it because again they have 90% of the computer market to profit from, while also having the server market. Also the M2 256GB SSD models have cut costs by using only one chip for an SSD, which cuts performance in half. Which is an odd thing to do when your products are already much more expensive compared to your x86 competitors. The truth is Apple's silicon is expensive for Apple to make, and this is because the chips are big, which means defects are more likely to occur, and they're using 5nm which nobody else uses yet. Not to forget that Apple has to pay engineers to make a product that both AMD and Intel have spent decades engineering. Eventually the cost of making their own SoC's to try and compete with AMD, Intel, and Nvidia is going to be too high. Even Qualcomm has a much larger market share for their SoC's even though compared to Apple's it's not as good. Qualcomm has gone to AMD for their future products to include AMD's Radeon graphics. Shouldn't be a shock to anyone considering Qualcomm's Adreno graphics was bought from ATI and the word Adreno is just Radeon with rearranged letters.



  • Rabiator
    replied
    Originally posted by qarium View Post
    Your opinion is based on outdated knowledge. In 5 years no one will care anymore whether an app runs on Windows or Linux or macOS or whatever, because nearly all apps will be WebAssembly... even the apps outside of the browser will be WebAssembly. Only highly performance-critical apps will be in Rust and native code... Also, in 5 years no one will care anymore about the fight between Vulkan, DX12, Metal, OpenGL and so on, because nearly everyone will use WebGPU in the browser and also WebGPU outside of the browser.

    I'll tell you something: when I told people 5-7 years ago that Valve would do an all-AMD gaming computer, people did not believe it.

    You will see Valve invest in the near future in WebAssembly and WebGPU and also in Apple M1/M2 hardware.
    Valve wants to make money, and they cannot afford to lose all those Apple M1/M2 customers.
    Looking at the WebGPU page at ChromeDevelopers, their status is still very much pre-alpha. Now I think Google is capable of both pushing WebGPU into the market and doing a decent job in developing it. But it seems premature to predict a near-total switch of the software world to WebGPU.

    About Valve's all AMD gaming computer, the surprise was mainly in AMD coming back with Zen / RDNA. Some form of x86 based gaming computer was an obvious enough idea, but 5-7 years ago I'd have bet on some Intel/Nvidia combo.

    Linux as OS is not much of a surprise. Gabe Newell said years ago that he expected Microsoft to use its app store to push out the competition in Windows software. That might have been the main reason for Valve to build Proton into Steam, and generally try to lessen its dependence on Microsoft.

    In the future, I think Valve will have to keep supporting the trends in (Windows) games development. They are not big enough to pull a Sony or Nintendo and create their own games ecosystem. But when comparing the importance of Linux and Mac for Valve, I think Linux wins. Mac has more market share, but still below 2% of players on Steam. Linux as OS of the Steam Deck is an increasingly vital part of Valve's business.
    Last edited by Rabiator; 04 August 2022, 05:50 PM.

