Steam For Linux In July Shows A 1.23% Marketshare, AMD CPUs Now More Common Than Intel On Linux


  • #71
    Originally posted by qarium View Post
    the PS5 has raytracing support and the GTX 1060 does not have it.
    That matters very little when hardly any games use it. Ray tracing is also so expensive that Nvidia had to invent DLSS just to make games playable with it turned on. Then Apple implemented it in Metal 3 and calls it MetalFX Upscaling because they want to be part of the cool kids. AMD's FSR is more useful since it works on any GPU. Ray tracing may eventually become a requirement for games, but right now a GTX 1060 is fine.
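    As a rough illustration of why those upscalers help (my own sketch; the 4K target and the 50% internal scale are assumed values, not any particular DLSS or FSR preset): ray-traced shading cost scales roughly with the pixels actually rendered, so a lower internal resolution cuts most of it.

    ```c
    /* Rough pixel-count sketch: why temporal upscalers (DLSS/FSR-style) recover
     * frame rate.  The 4K output and half-per-axis internal resolution are
     * illustrative assumptions, not settings from any specific game. */
    #include <stdio.h>

    int main(void) {
        long target_w = 3840, target_h = 2160;   /* assumed 4K output */
        long internal_w = target_w / 2;          /* assumed 50% scale per axis */
        long internal_h = target_h / 2;

        long target_px   = target_w * target_h;
        long internal_px = internal_w * internal_h;

        printf("output pixels:   %ld\n", target_px);
        printf("rendered pixels: %ld (%.0f%% of output)\n",
               internal_px, 100.0 * internal_px / target_px);
        /* Half resolution per axis means only a quarter of the pixels are
         * ray traced and shaded; the upscaler reconstructs the rest. */
        return 0;
    }
    ```
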
    no one cares if the 1060 has similar or higher FPS on outdated games.
    I hardly consider Borderlands 3 and Devil May Cry 5 outdated games.
    also the PS5 has 16 GB of VRAM and cheap cards like the 1060 do not have 16 GB of VRAM, which means they may put up high FPS numbers, but it is impossible to hold all the necessary textures in VRAM.
    16GB of total RAM, not just GPU VRAM. Developers choose how much of that 16GB gets used as VRAM. A desktop PC with 16GB of DDR4 plus a GTX 1060 with 6GB has more memory than a PS5, and will perform better. Keep in mind that CPUs care less about bandwidth than about latency, which is why we use DDR rather than GDDR; GDDR trades latency for bandwidth. If you've ever tweaked your Threadripper's memory, you'll know that lower latency improves performance more than raw clock speed. The PS5 uses GDDR6 for everything to save money, but in exchange CPU performance suffers. Apple's memory configuration is similar, except Apple deals with the latency by moving the memory physically closer to the SoC. Apple's latency is still higher than DDR, though. How does Apple fix that? The same way AMD does: more cache. A system with DDR for the CPU and GDDR or HBM for the GPU is still the best arrangement.
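    To put that latency-versus-bandwidth trade-off in rough numbers (my own illustrative figures, not benchmarks of any specific DDR4 or GDDR6 part), the time for one memory request is roughly latency plus size divided by bandwidth:

    ```c
    /* Back-of-envelope sketch with assumed round numbers: latency dominates
     * small CPU-style accesses, bandwidth dominates large GPU-style streams. */
    #include <stdio.h>

    static double request_ns(double latency_ns, double size_bytes, double gb_per_s) {
        return latency_ns + size_bytes / gb_per_s;  /* 1 GB/s == 1 byte/ns */
    }

    int main(void) {
        /* assumed illustrative figures, not measurements */
        double ddr_lat = 80.0,   ddr_bw  = 50.0;    /* ns, GB/s */
        double gddr_lat = 250.0, gddr_bw = 448.0;

        double line    = 64.0;                 /* one cache line: CPU-style random access */
        double texture = 4.0 * 1024 * 1024;    /* a 4 MB streamed block: GPU-style */

        printf("64 B miss:   DDR %.0f ns vs GDDR %.0f ns (latency term dominates)\n",
               request_ns(ddr_lat, line, ddr_bw), request_ns(gddr_lat, line, gddr_bw));
        printf("4 MB stream: DDR %.0f us vs GDDR %.0f us (bandwidth term dominates)\n",
               request_ns(ddr_lat, texture, ddr_bw) / 1000.0,
               request_ns(gddr_lat, texture, gddr_bw) / 1000.0);
        return 0;
    }
    ```
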
    this means it is bullshit bingo to tell people the 1060 gets high FPS, because the two do not display the same result.
    Gamers Nexus adjusted the settings in these games to match the PS5's graphics settings. He explained it very well.
    well i do play on my Threadripper 1920x...
    I'm not saying you can't, just that it isn't meant for gaming. A Ryzen 5800X3D will get better gaming performance than a Threadripper. An Intel 12900K will perform better in games than a Threadripper.
    ok dude, then I spend 1 billion dollars on a Commodore 64 game developed in 2022...
    and I am 100% sure no one will call it a AAA game, no matter what I do and no matter that I spend 1 billion dollars on it.
    Nobody would spend 1 billion on a Commodore 64 game because nobody uses it. Most games today are built for average hardware. A Threadripper CPU with a Vega 64 is not the average. This is why e-sports games focus heavily on lower-end hardware, and so does World of Warcraft.
    well, your argument is, let's say, some kind of lost-in-space argument... because you make an anti-ARM argument and your proof is an x86_64 computer with an AMD R9 Fury GPU...
    so it's completely pointless really.
    The point was that if x86 machines are having trouble playing modern games on Linux, ARM is going to be worse. You really don't want people to have difficulty playing their games, because again, people move along the path of least resistance.
    maybe they do not make money on the 399-dollar version, but if you look at the sales statistics, the most expensive model is the one they sell the most. So who cares if they do not make money on the 399 version, if they sell 4-10 of the more expensive versions for every cheap one and also make a lot of money on Steam sales?

    "They make money that way"

    you are serious??? they make money? that's impossible because we know they lose money lol
    Not everything is done to make money directly. The Steam Deck is meant to maintain Steam's presence in the market, just like Android is one of many tools Google uses to maintain its search engine's presence. Consoles have historically sold at cost or at a loss because the money is made back on game sales. And yes, Valve wanted people to go for the more expensive model, much like Apple wanted MacBook M2 buyers to go for larger SSDs by crippling the 256GB model. It's called an incentive.
    it looks like it does much more than that... it looks like they break 2 monopolies at the same time: Intel, because it is AMD hardware, and also Microsoft's Windows...
    By going AMD only you don't break a monopoly, you just create another one, though with a few players it's technically called an oligopoly. Microsoft is still a monopoly even though they consider Linux competition.
    the sales are so high that we can say for sure the Microsoft Windows monopoly is finished.
    Windows will fail when Microsoft tries to charge a monthly fee for a basic feature in Windows or screws up badly in some other way. Otherwise they aren't going anywhere for a long time.
    Nvidia supports OpenCL??? You're joking, right? Nvidia only did OpenCL 1.1 and nothing more. 2.0 came out and Nvidia did not support it, 2.1 came out and Nvidia did not support it, then 3.0 came out and Nvidia did "support" it, but only because you can claim 3.0 support while implementing nothing beyond 1.1...
    in other words OpenCL is dead and Nvidia does not support it.
    You know Apple helped create OpenCL, and they gave up on it. Also, I didn't say Nvidia aren't assholes, but they do support OpenCL 3.0. Obviously Nvidia was pushing CUDA, but like all proprietary standards, it will eventually fail.
    on the other side, many people on this website report that Nvidia is forcing people into the CUDA ecosystem because they have a de facto monopoly in compute...

    "Apple will force you to ARM"

    I do not feel forced at all?
    If you want to use Mac OSX then you must use ARM. You do have the choice to go with any other manufacturer, but if you have an Apple Watch, an iPhone, and other Apple devices, then you will feel like you broke something by not buying Apple. Just like you can buy AMD if you don't want to use CUDA.
    you become more and more professional in your rhetoric; now you try to scare me away from Apple with arguments like "Apple products come in limited colors..."
    do you really think I care about colors at all?
    You didn't get the joke. The point is that Apple doesn't give you any real choice except color.
    why do you freak out?... I do not want MacOSX... and I would not want it even if they open-sourced it.
    Open source means we can learn from it and bring code over to Linux.
    why do you freak out?... I told you WebGPU is the relevant standard.
    It will never go beyond a web browser, if even.
    Last edited by Dukenukemx; 09 August 2022, 02:52 PM.



    • #72
      Originally posted by Dukenukemx View Post
      That matters very little when hardly any games use it. Ray tracing is also so expensive that Nvidia had to invent DLSS just to make games playable with it turned on. Then Apple implemented it in Metal 3 and calls it MetalFX Upscaling because they want to be part of the cool kids. AMD's FSR is more useful since it works on any GPU. Ray tracing may eventually become a requirement for games, but right now a GTX 1060 is fine. [...] 16GB of total RAM, not just GPU VRAM.
      no, of course the GTX 1060 is not fine; it does not have the matrix (tensor) cores to accelerate DLSS
      also, your babbling that the PS5 does not have 16 GB of VRAM: of course it is "shared"; you can write software that uses 16 GB for textures, then you have no space for anything else, but you still can.
      this GTX 1060 only has 3 GB of VRAM; it cannot even display what the PS5 can display. By this fact alone all the FPS numbers are fake, because you will see ugly placeholder textures, or no textures at all when you walk into a new area.
      there are already games that use ray tracing on the PlayStation 5
      most games and engines are waiting until FSR 1.0 and FSR 2.0 land in the game engines; then they can enable ray tracing together with FSR to get the FPS they want combined with ray-tracing effects.

      "but right now a GTX 1060 is fine"

      dude, you just fool yourself... it looks more and more like you are not even able to talk about the "Apple M2",
      plain and simple, because you do not even understand the necessary differences between a PlayStation 5 and a GTX 1060. You only talk about high FPS... but only fools think FPS is all that matters...

      if one system has high FPS but only 3 GB of VRAM, forcing fewer textures, lower-resolution textures, placeholder textures, or no textures at all at some point in the game, and another system has lower FPS but maximum VRAM and maximum texture resolution and so on, then logically the slower system is better.

      same effect with the Apple M2... you compare it to a GTX 1060 or even a GTX 1660 or whatever, which only has a small amount of VRAM... while the Apple M1/M2 has 16/24 GB of shared RAM, meaning VRAM... and the Apple M2 Max has even more, and so on...

      then the Apple solution is much better even if it has lower FPS, because by your logic you could buy a 10-year-old card with only 1 GB of VRAM, cripple the texture resolution down to nothing, and claim you have more FPS...

      what a bullshit show you do here.
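      For a sense of scale on the texture/VRAM point, here is my own rough sketch; the texture format, the mipmap factor, and the VRAM budgets (including the figure used for how much of the PS5's 16 GB a game can actually use) are assumed, illustrative values:

      ```c
      /* Rough texture-memory sketch: why VRAM capacity limits texture quality
       * independently of FPS.  Assumes an uncompressed RGBA8 format and a
       * BC7-compressed format (1 byte per texel); ~1.33x covers a mipmap chain. */
      #include <stdio.h>

      static double mib(double bytes) { return bytes / (1024.0 * 1024.0); }

      int main(void) {
          double w = 4096, h = 4096;            /* one assumed "4K" texture */
          double rgba8 = w * h * 4.0 * 1.33;    /* uncompressed + mipmaps */
          double bc7   = w * h * 1.0 * 1.33;    /* BC7 + mipmaps */

          printf("one 4096x4096 texture: %.0f MiB RGBA8, %.0f MiB BC7\n",
                 mib(rgba8), mib(bc7));

          /* how many such BC7 textures fit in a given VRAM budget, ignoring
           * framebuffers, geometry, etc. -- purely illustrative budgets */
          double budgets[] = { 3.0, 6.0, 12.5 }; /* GiB: 1060 3GB, 1060 6GB, assumed PS5 game share */
          for (int i = 0; i < 3; i++)
              printf("%4.1f GiB budget -> ~%.0f such textures\n",
                     budgets[i], budgets[i] * 1024.0 / mib(bc7));
          return 0;
      }
      ```
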

      Originally posted by Dukenukemx View Post
      I hardly consider Borderlands 3 and Devil May Cry 5 outdated games.
      16GB of total RAM, not just GPU VRAM. Developers choose how much of that 16GB gets used as VRAM. A desktop PC with 16GB of DDR4 plus a GTX 1060 with 6GB has more memory than a PS5, and will perform better. Keep in mind that CPUs care less about bandwidth than about latency, which is why we use DDR rather than GDDR; GDDR trades latency for bandwidth. If you've ever tweaked your Threadripper's memory, you'll know that lower latency improves performance more than raw clock speed. The PS5 uses GDDR6 for everything to save money, but in exchange CPU performance suffers. Apple's memory configuration is similar, except Apple deals with the latency by moving the memory physically closer to the SoC. Apple's latency is still higher than DDR, though. How does Apple fix that? The same way AMD does: more cache. A system with DDR for the CPU and GDDR or HBM for the GPU is still the best arrangement.
      "A desktop PC with 16GB of DDR4 with a GTX 1060 with 6GB is more than a PS5"

      dude... honestly, computer tech does not work the way you claim here.
      my computer has 128 GB of RAM and my Vega 64 has 8 GB of VRAM, which is more than what you describe...
      but the tech plain and simple does not work like this.

      the PlayStation 5 can hold more textures than my Vega 64, even though if you count FPS my Vega 64 is faster.
      it is faster in FPS, but that is meaningless because the Vega 64 cannot even display the same quality.
      to claim it is faster is bullshit bingo.

      same for the Apple M1/M2: it can hold far more textures in VRAM than your claimed 6 GB VRAM card or my Vega 64...

      "and will perform better."

      YOU ARE A FOOL... it cannot even display the same quality, which means it cannot perform better.
      it can only get more FPS if the settings are so low that you lose quality.
      really man, you are a fool...

      "Keep in mind that CPU's don't care so much for bandwidth as much as latency, which is why we use DDR and not GDDR, as GDDR exchanges latency for bandwidth"

      this is wrong for modern CPUs; it was only true in the single-core, dual-core, and quad-core era.
      if you have many cores, meaning 64 cores or more, the need for low latency fades away and all you need is bandwidth,
      because everything that is latency-sensitive is served from the L2 and L3 caches...

      "Having a system with DDR for CPU and GDDR or HBM for GPU is still best."

      in your outdated world, of course. On my Threadripper system with ECC and the maximum amount of RAM (128/256 GB), the system can only handle standard RAM at 3200 MHz, and if I put a 2990WX in my system, tests show latency becomes less and less relevant because the 32 cores need to feed their caches, which means it needs bandwidth...

      also, if you look at low-core-count CPUs like the 5800X3D, it is clear RAM speed barely matters at all.
      the difference between the fastest and slowest RAM on the 5800X3D is very small.
      and best latency vs. worst latency also barely matters on the 5800X3D.

      this means your opinion is already obsolete; future high-core-count CPUs with stacked 3D cache will only need massive bandwidth...
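      A back-of-envelope sketch of that core-count argument (my own numbers; the per-core streaming rate and the channel figures are assumed round values, not benchmarks):

      ```c
      /* If each core streams data at some modest rate, aggregate demand grows
       * linearly with core count while total DRAM bandwidth is fixed by the
       * channel configuration, so high core counts become bandwidth-bound. */
      #include <stdio.h>

      int main(void) {
          double per_core_gbps = 2.0;                 /* assumed streaming demand per core */
          double channel_gbps  = 25.6;                /* one DDR4-3200 channel, theoretical peak */
          double quad_channel  = 4.0 * channel_gbps;  /* e.g. a Threadripper-style setup */

          int cores[] = { 4, 8, 16, 32, 64 };
          for (int i = 0; i < 5; i++) {
              double demand = cores[i] * per_core_gbps;
              printf("%2d cores: need ~%5.1f GB/s, quad-channel supplies %.1f GB/s -> %s\n",
                     cores[i], demand, quad_channel,
                     demand <= quad_channel ? "fits" : "bandwidth-bound");
          }
          return 0;
      }
      ```
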

      Originally posted by Dukenukemx View Post
      Gamers Nexus adjusted the settings in these games to match the PS5's graphics settings. He explained it very well.
      I already figured out that on this topic you are a complete fool. A 3 GB or 6 GB VRAM card cannot display the same result as a PlayStation 5, which means all your FPS numbers are biased and fake.

      in the near future there will be AMD FSR 2.0 games together with ray tracing and high texture resolutions... your idea that you can compete with a years-old GTX 1060 is just a fool's dream.

      Originally posted by Dukenukemx View Post
      I'm not saying you can't, just that it isn't meant for gaming. A Ryzen 5800X3D will get better gaming performance than a Threadripper. An Intel 12900K will perform better in games than a Threadripper.
      first, it depends on what kind of games you play and what you do in those games. For example, do you play Arma 3 and build your own scenarios with 1000-5000 AI-driven soldiers in large battle formations? That brings any multi-core CPU to its knees...

      for casual gaming like God of War or other stuff casual gamers play, then yes, a 5800X3D is better.
      dude, I am saving money for a 2990WX to combine with my 128 GB of RAM; I am sure Gentoo will be happy.

      Originally posted by Dukenukemx View Post
      Nobody would spend 1 billion on a Commodore 64 game because nobody uses it. Most games today are built for average hardware. A Threadripper CPU with a Vega 64 is not the average. This is why e-sports games focus heavily on lower-end hardware, and so does World of Warcraft.
      I already figured out you are a fool: you claimed that if I spend more money on a game then it will be AAA...
      now you claim my 1-billion-dollar Commodore 64 game is not AAA... hell, you are a fool...

      "A Threadripper CPU with a Vega64 is not the average"

      wrong, I bought it in 2017... today it is the average... many people have a 12-core CPU like the AMD 5900X,
      and many people have an AMD Radeon 6600 XT, which is only about 8% faster than a Vega 64...

      Originally posted by Dukenukemx View Post
      The point was that if x86 machines are having trouble playing modern games on Linux, ARM is going to be worse. You really don't want people to have difficulty playing their games, because again, people move along the path of least resistance.
      Not everything is done to make money directly. The Steam Deck is meant to maintain Steam's presence in the market, just like Android is one of many tools Google uses to maintain its search engine's presence. Consoles have historically sold at cost or at a loss because the money is made back on game sales. And yes, Valve wanted people to go for the more expensive model, much like Apple wanted MacBook M2 buyers to go for larger SSDs by crippling the 256GB model. It's called an incentive.
      I really don't get the point; is it against the law to make money, or even to make more money?

      Originally posted by Dukenukemx View Post
      By going AMD only you don't break a monopoly, you just create another one, though with a few players it's technically called an oligopoly. Microsoft is still a monopoly even though they consider Linux competition.
      Windows will fail when Microsoft tries to charge a monthly fee for a basic feature in Windows or screws up badly in some other way. Otherwise they aren't going anywhere for a long time.
      it's the tipping point of a Cambrian explosion... if Microsoft goes down, it will go fast instead of slow.

      and we have already passed the point of no return for Microsoft.

      Originally posted by Dukenukemx View Post
      You know Apple helped create OpenCL, and they gave up on it. Also, I didn't say Nvidia aren't assholes, but they do support OpenCL 3.0. Obviously Nvidia was pushing CUDA, but like all proprietary standards, it will eventually fail.
      Apple historically made a mistake with OpenCL, and the mistake was that they tailored the language to the Nvidia 8800 GTX... AMD needed a long time to add the same kind of cache buffers the 8800 had, and because of that OpenCL performance was very bad for a long time.
      Nvidia only supports OpenCL 3.0 because as soon as you support 1.1 you can claim 3.0 support, even if you do not support a single feature beyond 1.1.
      in fact, Nvidia sabotaged OpenCL... to push people toward CUDA
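      For anyone who wants to check what a driver actually reports, here is a minimal sketch (my own, not from either poster) using the standard OpenCL device queries: CL_DEVICE_VERSION is the platform version a vendor advertises, while CL_DEVICE_OPENCL_C_VERSION shows the OpenCL C level the compiler guarantees, and since OpenCL 3.0 makes the 2.x-era features optional, the advertised version alone does not say much.

      ```c
      /* Query the advertised OpenCL version vs. the guaranteed OpenCL C level.
       * Build with: gcc query.c -lOpenCL */
      #define CL_TARGET_OPENCL_VERSION 300
      #include <CL/cl.h>
      #include <stdio.h>

      int main(void) {
          cl_platform_id platform;
          cl_device_id device;
          char name[256], dev_version[256], c_version[256];

          if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
              clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
              fprintf(stderr, "no OpenCL GPU found\n");
              return 1;
          }
          clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
          clGetDeviceInfo(device, CL_DEVICE_VERSION, sizeof(dev_version), dev_version, NULL);
          clGetDeviceInfo(device, CL_DEVICE_OPENCL_C_VERSION, sizeof(c_version), c_version, NULL);

          /* A device may report "OpenCL 3.0" here while the C version below is
           * still stuck at an older level, which is the loophole described above. */
          printf("%s\n  device version:   %s\n  OpenCL C version: %s\n",
                 name, dev_version, c_version);
          return 0;
      }
      ```
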

      Originally posted by Dukenukemx View Post
      If you want to use Mac OSX then you must use ARM. You do have the choice to go with any other manufacturer, but if you have an Apple Watch, an iPhone, and other Apple devices, then you will feel like you broke something by not buying Apple. Just like you can buy AMD if you don't want to use CUDA.
      no, you can install MacOSX in an emulator and use it on x86-64; you can emulate an ARM CPU.
      but I do not know a single person who wants to use MacOSX...

      "Apple watch with iphone"

      I tell you something: I avoid contact with people who use an Apple Watch or iPhone, because from my point of view they are stupid people.
      and if Apple does not change its attitude, it will stay that way.

      Originally posted by Dukenukemx View Post
      You didn't get the joke. The point is that Apple doesn't give you any real choice except color.
      ok.

      Originally posted by Dukenukemx View Post
      Open source means we can learn from it and bring code over to Linux.
      I really don't think so... sometimes stuff becomes open source and you discover you learn nothing from it, because the people who wrote the code are not talented.

      Originally posted by Dukenukemx View Post
      It will never go beyond a web browser, if even.
      right, you fool... it's really like the VRAM/texture topic...

      it is already "beyond a web browser", but you, fool, want to believe it is only some unimportant web-browser technology...

      but man, that's YOUR problem, not mine.
      Phantom circuit Sequence Reducer Dyslexia

