AMD Announces Ryzen 7000 Series "Zen 4" Desktop CPUs - Linux Benchmarks To Come


  • Originally posted by coder View Post
    Then how is Metal not an "OS API"?
    Ahh, now I see the confusion: by "OS" I meant open source.



    • I don't get why you are talking about games so much in the context of Apple Silicon. Nobody sane is buying Macs for AA/AAA gaming anyway. The real question is when we can expect Windows + ARM to be at least at rough parity with Windows + x86 for gaming and for typical x86 software like CAD and simulation tools, which are not even available on any other platform right now. When is it going to be "the thing"? Well, not in 2023, that's for sure LOL



      • That was in response to https://www.phoronix.com/forums/foru...29#post1343729 where he claimed all gaming benchmarks are unfair and fraudulent.



        • Originally posted by Dukenukemx View Post
          Intel has attempted to replace x86 many times in the past, but they never stopped developing it cold turkey. StrongARM, though, wasn't developed by Intel but was acquired through a lawsuit settlement. Larrabee was x86, and not entirely a GPU but also a CPU. It was an interesting project that would have been revolutionary had it not been sorta canceled. Xeon Phi was the successor to Larrabee, and also still x86.
          Compared to my Vega 64, which runs Blender via ROCm HIP along with other GPU-compute stuff, this Larrabee stuff is complete garbage.
          Not only does my Vega 64 outperform any Larrabee that ever existed, its compatibility with the de facto GPU-compute monopoly, CUDA, is much better. And CUDA beats x86 a thousand times over...
          Intel had the illusion that x86 still matters, but in modern times you no longer want to copy x86; you just go for CUDA compatibility via ROCm HIP.
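
          To make concrete what "CUDA compatibility via ROCm HIP" means in practice, here is a minimal sketch (my illustration, not from any post in this thread): HIP deliberately mirrors the CUDA runtime API, so CUDA-style code compiles with hipcc for an AMD card such as a Vega 64, and the same source can also target NVIDIA GPUs.

```cpp
// Minimal HIP sketch: the API intentionally mirrors CUDA
// (hipMalloc ~ cudaMalloc, hipMemcpy ~ cudaMemcpy, ...), so CUDA
// code can often be ported almost mechanically (e.g. with hipify).
// Build with ROCm: hipcc saxpy.cpp -o saxpy
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    // Same thread-indexing built-ins as CUDA.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    float *dx = nullptr, *dy = nullptr;
    hipMalloc((void**)&dx, n * sizeof(float));
    hipMalloc((void**)&dy, n * sizeof(float));
    hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // CUDA-style triple-chevron kernel launch is supported by hipcc.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);  // expected: 4.0

    hipFree(dx);
    hipFree(dy);
    return 0;
}
```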

          Originally posted by Dukenukemx View Post
          They indirectly competed, not directly. Apple can compete against Dell and Asus but not against Intel and AMD.
          That's debatable, but before 2006 the Athlon 64 was king, as was the Athlon XP.
          Yes, but both AMD and Intel would out-engineer everyone, despite the inherent deficiencies of x86. Would anyone use a POWER-based machine over AMD's or Intel's? Of course not, because clearly AMD and Intel are superior.
          As soon as you buy me a Raptor POWER9 system, I will put both of my Threadripper systems in the garbage.
          You are a person who still believes x86 has some relevance...
          But for me x86 has zero relevance anymore.

          Just imagine this: Apple licenses AMD RDNA3+, ships a Vulkan driver for Linux on Apple hardware, and goes all in on CUDA compatibility via ROCm HIP... Intel would be completely doomed.

          CUDA already beats Intel x86 in the supercomputer space.

          Originally posted by Dukenukemx View Post
          The reality is that ARM makes it easy for anyone to make a SoC. ANYONE CAN MAKE AN ARM SOC. If you look at how many companies made them, it's not hard to see why Apple went on their own. Also, cell phone performance doesn't matter. Nobody knows how fast their cell phone is, including myself, and I'm a tech enthusiast.
          You wouldn't suck the dick of Steve Jobs if you knew his history. This was a man who died of a cancer that, instead of getting proper treatment that might have put it into remission, he treated by eating fruit. He just ate fruit. I was using smartphones long before the iPhone was even a concept. Symbian and Windows Mobile were what I used until Android was released. The only thing the iPhone did was make it better, with a UI that didn't depend on a stylus and a capacitive touchscreen instead of a resistive one.
          LOL, when SoftBank, the company that owns them, did.
          https://arstechnica.com/gadgets/2020...ed-apple-isnt/
          You guys live under a rock or something? Nvidia nearly bought them out, but the UK government stopped it over conflict-of-interest concerns and because Nvidia is a foreign company. So far I've proven you wrong, just like qarium, who thinks WebGPU is a thing.
          Really, man, what a joke you are on this point. Of course WebGPU is a thing; there is just no use in it for you as an end user right now... but only because it is a future product that all the big tech companies are working on.
          Your ability to see into the future is zero...

          Originally posted by Dukenukemx View Post
          They sort of do, but if Apple had 50% market share they would be a serious concern regarding anti-consumer behavior. Running Windows and Linux in a VM isn't a healthy market. Apple makes their own chips for their own computers for their own software. Nobody else in the industry does that. They are an oligopoly.
          OMG, Dukenukemx learned a new word: oligopoly... He could not prove the monopoly status, so now he tries oligopoly next.

          Originally posted by Dukenukemx View Post
          Apple wants to grow without losing their anti-consumer practices. A lot of people, including myself, like choice when buying or building a PC.
          I really don't get your point; there is zero force making you buy Apple hardware...
          You are a gamer anyway, and for now, because of all the legacy x86 crap, Apple hardware makes no sense for you.
          Apple does not even want to put 32-bit execution support into their CPUs... and nearly all games use 32-bit engines...


          Originally posted by Dukenukemx View Post
          It goes back and forth. Intel has always had integrated graphics and a chipset, neither of which AMD had. They did have a chipset, but it wasn't good. AMD did try to buy Nvidia, but that obviously didn't work, so they settled on ATI. Once AMD had Ryzen APUs, Intel knew they had to make better graphics. Nvidia also influenced Intel to make graphics, since a lot of Intel's server market was lost to Nvidia's GPUs.
          Did you know that nearly all AMD chipsets are created by ASMedia Technology Inc.?
          The X570 chipset was made by AMD itself, and its power draw was much higher compared to the ASMedia chipsets...

          Originally posted by Dukenukemx View Post
          Not really, no. Intel did try to get into the mobile market with x86; I've owned an Asus Zenfone that used an Intel x86 CPU, which worked fine. But this was Intel trying to expand market share, not keep its current market share. Losing Apple was a huge loss for Intel, but realistically Intel is having trouble supplying enough chips. This might have been one of the many reasons Apple dumped Intel, as Intel at the time refused to use anyone but their own facilities to make their chips. It seems Intel doesn't have a problem now using TSMC, as they paid for a shit ton of manufacturing time for 3nm.
          You know Intel placed those orders before the cryptocurrency market crashed...
          After the market crashed, Apple, Nvidia and AMD all tried to lower their orders from TSMC...

          You do not believe it now, but in the long run you will see it... Intel made a big mistake: they plainly and simply ordered TSMC 6nm, 5nm, 4nm and 3nm capacity, and believe me: too much of it.

          This will bring a lot of pain to Intel... and their 6nm Arc GPUs have shown they are not competitive with AMD's 7nm GPUs, nor with AMD's 6nm GPUs...

          I know you are an end user; you cannot imagine how much pain ordering too much capacity on cutting-edge nodes causes Intel...

          To give you a hint: the development costs of 3nm/4nm/5nm chips are so ultra-high that they will literally break Intel's neck.

          Search Google for articles on how much this costs and you will be surprised: something like 500 million euros to develop a 5nm chip,

          and 1 billion euros to develop a 3nm chip... If Intel cannot make the corresponding sales in the market, this will cause them real pain.

          Originally posted by Dukenukemx View Post
          Come back to me next year when AMD releases their Zen4+RDNA3 laptop chips, and let me know if I was right. You clearly think ARM is going to the moon, while I don't. Let's see who's right.
          The most interesting thing about the AMD design is that they did not do a full 5nm design... because 5nm is so expensive that designing such chips causes a lot of pain, believe me. There is a reason why the I/O die, which carries the iGPU, is 6nm.

          "You clearly think ARM is going to the moon,"

          ARM chips are much less complex to implement at 3nm/4nm/5nm... high-end x86 chips are a complete beast of complexity.

          "Lets see who's right."

          The joke is that what we told you about Intel's cost of designing such chips will never be public knowledge for you...

          12nm: around 50 million euros to design such a chip
          8nm: around 100 million euros
          7nm: around 200 million euros
          5nm: around 500 million euros
          3nm: around 1 billion euros

          This means that if Intel really did order too much, because they placed the orders before the crypto-mining boom collapsed, then the design costs alone will cause Intel great pain, and they may not be able to sell these chips in the market, plainly and simply because they produced too much.
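
          To show the arithmetic behind this argument, here is a small sketch. The design-cost figures are the unverified numbers quoted above, and the per-chip margin is a made-up assumption; the point is only how many units must be sold just to amortize the design cost of each node.

```cpp
// Sketch of the amortization argument. The design-cost figures are
// the (unverified) forum-quoted numbers above; the per-chip margin
// is a hypothetical value chosen only for illustration.
#include <cstdio>

int main() {
    struct Node { const char* name; double design_cost_eur; };
    const Node nodes[] = {
        {"12nm", 50e6}, {"8nm", 100e6}, {"7nm", 200e6},
        {"5nm", 500e6}, {"3nm", 1e9},
    };
    const double margin_per_chip_eur = 30.0;  // assumed gross margin per chip

    for (const Node& node : nodes) {
        double units = node.design_cost_eur / margin_per_chip_eur;
        printf("%5s: %6.0f M EUR design cost -> ~%5.1f M units to break even\n",
               node.name, node.design_cost_eur / 1e6, units / 1e6);
    }
    return 0;
}
```

          At the claimed 1 billion euro figure for 3nm, that assumed margin implies over 33 million units sold before the design even pays for itself, which is the "pain" being described.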





          • Originally posted by Dukenukemx View Post
            Come back to me next year when AMD releases their Zen4+RDNA3 laptop chips, and let me know if I was right. You clearly think ARM is going to the moon, while I don't. Let's see who's right.
            " As Chip Design Costs Skyrocket, 3nm Process Node Is in Jeopardy"
            The cost of building new chips keeps rising every node -- so much so that by 3nm, there might be precious few companies that can afford new chips at all.

            I think you do not understand how complex modern-node CPU designs are and how rapidly their costs are skyrocketing.

            Intel will get a big problem... their costs for 3nm/4nm/5nm TSMC chip designs are so high that this alone will cause Intel pain.



            • Originally posted by drakonas777 View Post
              You can rationalize M1/2 as long as you want - that won't change the empirical evidence that in some MT-heavy workloads (using cross-platform native apps) M1/2 is either comparable to x86 or has even slightly worse perf/watt, as shown in that Hardware Unboxed video and in another similar video by them some time ago. A metric ton of copium won't change this LOL
              Also, the fact that ARM has sued Qualcomm regarding NUVIA won't help much with the whole "ARM takeover / ARM + Windows" agenda for high-performance SKUs; you will get those mobile phone SoCs from MediaTek of course LOL
              Honestly, I think most Apple M1/M2 customers are not interested in this at all: "in some MT heavy workloads (using cross platform native apps) M1/2 is either comparable to x86 or has even slightly worse perf/watt".

              They use it day after day to watch YouTube... day after day to edit videos accelerated by the specialized hardware, and so on and so on. And even people like Linus Torvalds, who really do run MT-heavy workloads, don't care...




              • Originally posted by drakonas777 View Post
                How do we know the exact amount of energy x86 spends on decoding? Some of you declare x86 legacy garbage (which is a defensible position regarding ISA design), but can you be more specific? Can you give a number instead of some religious-like belief that ARM is so much better? Because right now it's basically an empty discussion, comparing different products from different ecosystems with different SoC/system designs, software and optimizations.

                At least some whitepaper with theoretical calculations/models made by actual CPU architects would be nice. Because right now it feels like some of you believe ARM is multiple times more efficient due to the better ISA, when in reality it may very well be that ARM is 5-15% more efficient (given all constraints/variables are the same or very similar, with just the difference in ISA/frontend/decoding) or something like that LOL
                The cost of building new chips keeps rising every node -- so much so that by 3nm, there might be precious few companies that can afford new chips at all.


                If you read this, the real design win is maybe the reduction of complexity: the simpler a design is, the lower the cost of these high-end 2nm/3nm/4nm/5nm designs...

                To my knowledge, going from x86 to ARM reduces the transistor count by about 5%... which reduces complexity too.






                • Originally posted by coder View Post
                  Then how is Metal not an "OS API"? I get why WebGPU isn't, because that's browser-based. However, Metal is a native GPU API on MacOS and iOS, whereas Vulkan only exists as an emulation layer running atop Metal (see: MoltenVK).
                  Not a question I can answer, but it seems like they probably should.
                  WebGPU is designed to be open source, and it is also designed to run outside of the browser, similar to WASM (WebAssembly).
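
                  As a minimal native sketch of that claim (my illustration; the exact header revision is an assumption): the same API that browsers expose to JavaScript as navigator.gpu is also published as a C header, webgpu.h, implemented outside the browser by Google's Dawn and by wgpu-native, so a plain desktop program can request a GPU adapter through WebGPU.

```cpp
// Sketch: requesting a WebGPU adapter from a native program via
// webgpu.h (implemented by Dawn or wgpu-native). The callback style
// below follows older webgpu.h revisions; newer headers switched to
// *CallbackInfo structs, so treat this as illustrative, not pinned.
#include <webgpu/webgpu.h>
#include <cstdio>

static void onAdapter(WGPURequestAdapterStatus status, WGPUAdapter adapter,
                      char const* message, void* userdata) {
    if (status == WGPURequestAdapterStatus_Success)
        *static_cast<WGPUAdapter*>(userdata) = adapter;
    else
        fprintf(stderr, "no adapter: %s\n", message ? message : "unknown");
}

int main() {
    WGPUInstanceDescriptor desc = {};  // default instance options
    WGPUInstance instance = wgpuCreateInstance(&desc);

    WGPUAdapter adapter = nullptr;
    // nullptr options: let the implementation pick any available adapter.
    wgpuInstanceRequestAdapter(instance, nullptr, onAdapter, &adapter);
    // wgpu-native fires the callback synchronously; Dawn may require
    // processing instance events before the callback runs.

    printf("adapter acquired: %s\n", adapter ? "yes" : "no");
    return 0;
}
```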



                  • Originally posted by Anux View Post
                    OpenGL and Vulkan. Maybe WebGPU, but I have never seen any remotely taxing open-source 3D game on it.
                    Do iOS games run on M1/M2 Macs? Are there any open source iOS games?
                    The most important point is: WebGPU is open source, and it is designed to run outside of the browser too, similar to WASM...
                    And Apple supports it 100%.



                    • Originally posted by drakonas777 View Post
                      I don't get why you are talking about games so much in the context of Apple Silicon. Nobody sane is buying Macs for AA/AAA gaming anyway. The real question is when we can expect Windows + ARM to be at least at rough parity with Windows + x86 for gaming and for typical x86 software like CAD and simulation tools, which are not even available on any other platform right now. When is it going to be "the thing"? Well, not in 2023, that's for sure LOL
                      Right, absolutely right. I also do not understand why he makes games on Apple Silicon a priority, because due to his legacy x86 apps he will buy x86 CPUs anyway. And no one here claims otherwise... we say clearly: if most of your apps are legacy x86 code, then buy your AMD hardware and be fine with it.

                      Apple will need many years to become competitive in this field. And I mean like 5 years... at minimum.

