
Apple M2 vs. AMD Rembrandt vs. Intel Alder Lake Linux Benchmarks


  • Originally posted by mdedetrich View Post

    In my case they turned on when running JDK 8 under Rosetta (I couldn't get a native ARM version of JDK 8 working on the Mac).
    Maybe try this: https://docs.aws.amazon.com/corretto...oads-list.html

    Comment


    • Originally posted by ldesnogu View Post
      Here is an interesting document describing this difference (and others): https://www.riscosopen.org/wiki/docu...ility%20primer
      Give me a little time to read up on that; I might actually have been wrong.

      Comment


      • Originally posted by Raka555 View Post
        Many thanks, I just found https://github.com/corretto/corretto...tag/8.342.07.3 and yes, it's working without problems!

        Comment


        • Glad to see this thread become so wholesome after 16 pages of flame war.

          Comment


          • Originally posted by Dukenukemx View Post
            When a device has dust, it essentially has no fan. CPUs have been able to lower their speeds to keep themselves from burning up and destroying themselves for over a decade. The difference here is that Apple willingly doesn't include a fan, which causes throttling. If dust is bad because heat is bad, then no fan is stupid. Also, these aren't mobile phones or tablets.
            Keep in mind only the MacBook Air models are fanless.
            You can think this way, no problem, but if Apple ever gets the Linux support ready and I need a notebook, I will buy a MacBook Air without a fan...
            I will never buy a smartphone, touchscreen tablet, or notebook with a fan.

            I have a workstation/PC with a fan, but it is easy to clean, and I have no problem with a fan in a PC.
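
            As an aside, the throttling behavior discussed above can be watched directly on Linux through the sysfs thermal interface. A minimal sketch, assuming the standard /sys/class/thermal layout (zone names and counts vary by machine):

```python
import os

def read_thermal_zones(base="/sys/class/thermal"):
    """Return {zone_type: temp_in_celsius} from the Linux sysfs
    thermal interface; millidegree values are converted to degrees."""
    zones = {}
    if not os.path.isdir(base):
        return zones  # no sysfs thermal support on this system
    for entry in sorted(os.listdir(base)):
        if not entry.startswith("thermal_zone"):
            continue  # skip cooling_device*, etc.
        zdir = os.path.join(base, entry)
        try:
            with open(os.path.join(zdir, "type")) as f:
                ztype = f.read().strip()
            with open(os.path.join(zdir, "temp")) as f:
                zones[ztype] = int(f.read().strip()) / 1000.0
        except OSError:
            continue  # zone disappeared or is unreadable
    return zones
```

            Sustained readings near a zone's trip points (exposed as trip_point_*_temp under the same directory) are roughly where the CPU starts clocking itself down.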

            Originally posted by Dukenukemx View Post
            The rest, like the M1 Pro and Max, always come with a fan. Also, lots of laptop manufacturers choose to let the fans run slow despite temperature, to appeal to the consumer. Running at 95C+ is not ideal, and the fan should run faster. I have an old Dell XPS M140. The GPU died because the fans are set up to run slow. This is also when Nvidia put shit solder in their chips and you have to reflow the GPU to fix the issue. But the fans ran slow regardless of temp, and there is software called i8kutils, which even exists on Linux, to control these fans because Dell is stupid.
            Well, I am one of the few people who does not buy these kinds of shit products...
            I have 4 years of professional training in electronics and electrical work, enough to know these products are shit.
            I have 8 years of experience in my own company in this field, which also tells me these products are shit.

            Give me Linux support and give me passively cooled mobile devices (notebook, tablet, smartphone),
            or else I will not buy.

            Originally posted by Dukenukemx View Post
            This is why I got big into cooling, because too often computers die from heat. Heat is the #1 killer of computers. It's not always the CPU or GPU, either. You don't want anything to get hot enough to burn your finger. Running a CPU at 95C+ is not good, and it doesn't matter if Apple says it's OK. Every manufacturer says it's OK, and it's always wrong. The only difference here is that Apple purposely lets them get hot, because Apple cares more about aesthetics than functionality.
            Well, you can see it how you want; I do not fear a device breaking, because then I just buy a new one.
            The only thing I really fear in this space is a trojan horse in the hardware that cannot be removed by a fresh install of the operating system.

            Originally posted by Dukenukemx View Post
            The fact that the M2 MacBook Air slows down means it needed a fan. This was even an issue on the M1 Airs, and people found creative ways to make it better.
            Of course it does not need a fan... because even with these slowdowns it is still fast enough...

            Originally posted by Dukenukemx View Post
            You live in some sort of fantasy if you think Apple will ever work on Linux.
            To my knowledge they already do it in secret, with people under NDA (non-disclosure agreement).
            Because of PR (to protect the IP called Mac OSX), they will never openly support Linux.

            Originally posted by Dukenukemx View Post
            They have Mac OSX, and if you don't like Mac OSX then they'll fix it to make you like it. That is how Apple does things. Apple has never contributed to Linux in any way. Do you see Apple on this list? I see Intel near the top, but no Apple. The reason Linux is being ported is because people want it and it pisses off Apple, just like it pissed off Sony to see Linux on the PS4, and we all know how much Sony likes Linux on their consoles. Historically Apple created its software from open-source projects, but Apple's developers rarely contribute much code back. Microsoft contributes more code to Linux than Apple ever has. Apple is a shit company.
            I am sure you will be surprised in the future.
            Phantom circuit Sequence Reducer Dyslexia

            Comment


            • Originally posted by ldesnogu View Post
              If that really pissed Apple off, don't you think they would have done what Sony did with the PS3? Just lock the bootloader so the machine is tied to signed OS images. That doesn't mean they'll help port Linux, but I don't think it irritates them.
              The reason nothing has been done yet is that Linux is not a threat to Apple... yet. Apple could open-source Mac OSX along with their drivers without any real issues, but they don't, for a reason.
              Regarding the MBA and the lack of a fan, a huge proportion of MBA users will never notice. They just browse, listen to music, write emails, and so on. As a developer, a fanless Mac is not an option, so I got myself an M1 MBP; the fans almost never turn on, even when I compile large codebases. I nonetheless need those fans, as I sometimes run heavy SIMD multi-threaded code that pushes the CPUs enough that active cooling is needed.

              Don't forget, most people here are not average users, and what we want is not what the vast majority of users want.
              Most people would be fine using a laptop with a Core 2 Duo. Realistically it'll be used to watch Netflix and look for porn. Some will actually use it for work, and some will even try to play games on it. That doesn't mean not including a fan is OK. If your MacBook Pro M1 rarely turns on the fan, then why would including a fan be an issue? The Air and the Pro both have nearly the same thickness and weight.

              Comment


              • Originally posted by Dukenukemx View Post
                The reason nothing is done yet is because Linux is not a threat to Apple... yet. Apple could open source their Mac OSX along with their drivers without any real issues, but they don't for a reason.
                Most parts of Mac OSX are already open source... for example LLVM... and the kernel is a BSD-style kernel.

                But the joke is, even if Apple turned Mac OSX into an open-source dreamland, the open-source people would stay on Linux.

                "That doesn't mean not including a fan is OK."

                The solution for you is very simple: you buy a device with a fan... and everyone else like me buys the version without a fan. Problem solved.

                Why do you think you can force people to your will? You claim Apple does this, but you are doing the same.

                Comment


                • Originally posted by qarium View Post

                  Well, yes, it's awful... but if you are a smoker, you'd better buy a passively cooled system.
                  Also, that warranty problem with Apple was actively cooled hardware, not passively cooled.
                  For smokers it is still best to just buy a passively cooled device.



                  Well, I have run my smartphones for years and never had a problem with that.
                  But of course, technically speaking, you would need a water-cooled smartphone to avoid thermal problems entirely.



                  Well, yes, I understand. I try to avoid such products; I buy a cheap €100 smartphone and use it for many years.
                  My PC is made of easily replaceable hardware parts.
                  Notebooks that cannot be fixed, I just avoid.
                  I discovered in my life that the best repair you can have is to avoid such products.
                  I have professional training, 1 year in electronics and 3 years in electrical work, and I tell you something: I could not repair any of these products, like your Louis Rossmann example...
                  That does not mean I did not learn anything in my professional training, but if you can avoid it, you should avoid it. And I know you cannot avoid it, because you earn your money repairing this stuff.



                  My last x86 HP notebook had a trojan horse inside the UEFI BIOS, and even a complete reinstall of Fedora did not remove it. So you talk about "bad solder joints"... but that laptop ran for years, and in the end I gave it away to poor people, just to get rid of the trojan horse inside the device.

                  Do you remember the GeForce 9500 notebook disaster? I had a laptop with an Nvidia GPU, and it died because of bad solder on the GPU...

                  But to be honest, I do not fear bad solder joints; if that happens, I just buy a new one.
                  What I really fear is a trojan horse inside the UEFI BIOS...



                  I have 280 games in my Steam account... most of the time I do not have time to play games.
                  If you want an example of a game that uses a lot of CPU power, any RTS with many units does, like Homeworld 2.
                  Or if you want a multicore example, Ashes of the Singularity...

                  But I like the technology...



                  It does not matter if it is 1.5 GB or 2 GB or 2.5 GB; you always end up with the fact that the PS5 has more VRAM than the 6 GB VRAM card.

                  "If you were playing Elden Ring then 4GB is used for the game engine, resulting in 8GB for textures."

                  And this proves my point exactly... your 6 GB VRAM GTX 1060 cannot hold the textures, and because of this the result will always be different.

                  Now you can say a 2 GB VRAM difference does not make a big difference, and yes, many people can live with it.
                  But to claim the 1060 can produce the same visual result in the game as a PS5 is plain and simple wrong.
                  We could say this for a Vega 64 with 8 GB VRAM, but even that is not true, because of ray tracing.



                  Yes, I know the PS5 is better at desktop work like browsing the internet, because more RAM can be used... but the Xbox looks better in games.

                  But I have read multiple times that the PS5 is easier to develop for compared to the Xbox, which looks more like a standard PC with RAM for the CPU and VRAM for the GPU.

                  I think the PS5 wins in the end because of the texture-streaming co-processor... which loads textures directly from the SSD, bypassing the CPU.



                  I think in the near future SoCs will win big, even against RTX 3090-class hardware.

                  But the SoCs of today are not there yet... remember the AMD 5800X3D:
                  as soon as they do 3D-stacked memory/cache and more 3D transistor stacking at 2 nm
                  and so on, in the end classic hardware like a 12900K + RTX 3090 will lose.

                  The Apple M2 SoC will go from 20 billion transistors to 60 billion transistors in the next 2 years...

                  Apple will also start to 3D-stack cache directly on the SoC.



                  The 6600 XT is only about 7% faster than the Vega 64... and for ray tracing the 6600 XT is too slow.

                  Don't waste your money like this... buy an AMD Radeon 7600 XT instead.



                  I will tell you why the CPU is NEVER the bottleneck and the GPU is always the bottleneck:
                  I have a 4K 65" display. In all the games I play, the Vega 64 is what limits the FPS, and the CPU never touches 100%.
                  That is why your 7% faster single-core performance does not matter at all for 4K gaming.
                  It is always the bottleneck that is the problem, and at 4K it is always the Vega 64 that is too slow.

                  I think with your Vega 56 you are not a 4K gamer... maybe you game at 2K.



                  I have 3200 MHz RAM, but I am sure it runs slower if I check in the BIOS. It does not matter at all:
                  at 4K the GPU is the bottleneck, not the RAM and not the CPU.

                  "Much worse if you have 4 sticks of ram."

                  Man, you are funny... I have 8 sticks of RAM, and they are only single-sided modules; the double-sided version would mean 256 GB of RAM for my system...

                  "The 1700 I have is overlocked to 3.8Ghz"

                  My system is water-cooled, has a 2000 W PSU and an expensive OC mainboard; I could easily overclock it,
                  but the joke is: I have no reason to, because the system is already faster than I need.



                  Again, it depends on the game... with Ashes of the Singularity you can bring down a 32-core Threadripper if you want.
                  The reason is simple: it is a Mantle/Vulkan game, and you can put as many units on the map as you like.
                  And then, at 5,000 or 20,000 units, the CPU will go down. And of course it is perfectly multithreaded...



                  ECC is slower, yes, but again, at 4K gaming the CPU and RAM are not the bottleneck.

                  Also, if you want to use hardware for a long time, non-ECC RAM may fail sooner, while ECC RAM will keep running for a long time even if it starts to produce errors over time.

                  "Good for servers, not so much for gaming."

                  You should rethink this deeply: if you OC normal RAM that does not have ECC, you will not get error messages when your RAM starts to fail because of too much OC.
                  Only ECC RAM can be overclocked safely, because as soon as you clock it too high there will be error messages, and even then, most of the time the RAM will keep working without problems because of ECC.
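
                  The point about ECC reporting errors can actually be checked on Linux: the kernel's EDAC driver exposes per-memory-controller corrected (ce_count) and uncorrected (ue_count) counters in sysfs. A minimal sketch, assuming the standard EDAC layout; it returns None when no EDAC driver is loaded:

```python
import os

def edac_error_counts(root="/sys/devices/system/edac/mc"):
    """Sum corrected (ce_count) and uncorrected (ue_count) ECC error
    counters across all memory controllers exposed by Linux EDAC."""
    if not os.path.isdir(root):
        return None  # no EDAC driver loaded (or no ECC RAM)
    totals = {"corrected": 0, "uncorrected": 0}
    for mc in os.listdir(root):
        for fname, key in (("ce_count", "corrected"),
                           ("ue_count", "uncorrected")):
            path = os.path.join(root, mc, fname)
            if os.path.isfile(path):
                with open(path) as f:
                    totals[key] += int(f.read().strip())
    return totals
```

                  A steadily growing corrected count after raising the memory clock is exactly the early warning that non-ECC RAM never gives you.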

                  "Also I doubt you have 2 sticks of 64GB ECC ram"

                  It is 8 DIMMs; every DIMM has 16 GB...

                  "What clock speed do you even run the ram at?"

                  I need to go into the BIOS to check this, but the probability is high that the RAM does not operate at 3200 MHz.



                  Well, of course my system and your system are not on the same level, money-wise.
                  My systems are already 5 years old, and I am sure I will use them for many more years,
                  so the cost of the systems is relative over this long time.
                  You piqued my curiosity. Cinebench doesn't work on Linux, but Geekbench 5 does. Not a fan of it, but it is quick and easy to get results. This is my 2700X system. Take a snapshot of your system and upload it here.
                  https://browser.geekbench.com/v5/cpu/16594132

                  Comment


                  • Originally posted by Dukenukemx View Post
                    You piqued my curiosity. Cinebench doesn't work on Linux, but Geekbench 5 does. Not a fan of it, but it is quick and easy to get results. This is my 2700X system. Take a snapshot of your system and upload it here.
                    https://browser.geekbench.com/v5/cpu/16594132
                    I checked in the BIOS: by default, without manual settings and without OC, my RAM runs at 2666 MHz (if you use 8 DIMMs at 16 GB per DIMM, it always runs slower compared to only 4 or 2 DIMMs).

                    My Geekbench result is here: https://browser.geekbench.com/v5/cpu/16594551

                    (edit) As you can see, your system is 6% faster in single-core and my system is 47% faster in multi-core... I think this makes my system the winner, because it is without OC... I can beat those 6% easily; it only runs at 3.5 GHz and easily runs over 4.1 GHz.

                    (edit2) Now with my CPU OC'd to 3.9 GHz: https://browser.geekbench.com/v5/cpu/16594734 the multi-core score is 53% faster... single-core is slower because the manual OC clock speed is lower than the automatic turbo boost.

                    (edit3) Now with my CPU OC'd to 4 GHz: https://browser.geekbench.com/v5/cpu/16594923
                    it is 55% faster in multi-core and 8.2% slower in single-core. And you already know I prefer multi-core performance, so I am fine with that.
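
                    For reference, the "X% faster" figures traded back and forth here are just ratios of Geekbench scores. A tiny helper (the scores below are made-up illustration values, not the linked results):

```python
def percent_faster(score_a, score_b):
    """How much faster score_a is than score_b, in percent.
    Negative means score_a is slower."""
    return (score_a / score_b - 1.0) * 100.0

# Hypothetical multi-core scores: 14700 vs. 10000 -> 47% faster
print(round(percent_faster(14700, 10000)))  # 47
```

                    A negative result corresponds to figures like the 8.2% single-core deficit quoted above.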
                    Last edited by qarium; 12 August 2022, 06:49 PM.

                    Comment


                    • Originally posted by qarium View Post

                      I checked in the BIOS: by default, without manual settings and without OC, my RAM runs at 2666 MHz (if you use 8 DIMMs at 16 GB per DIMM, it always runs slower compared to only 4 or 2 DIMMs).
                      Ryzen tends to run slower with more DIMMs because the memory clock goes down.
                      My Geekbench result is here: https://browser.geekbench.com/v5/cpu/16594551

                      (edit) As you can see, your system is 6% faster in single-core and my system is 47% faster in multi-core... I think this makes my system the winner, because it is without OC... I can beat those 6% easily; it only runs at 3.5 GHz and easily runs over 4.1 GHz.

                      (edit2) Now with my CPU OC'd to 3.9 GHz: https://browser.geekbench.com/v5/cpu/16594734 the multi-core score is 53% faster... single-core is slower because the manual OC clock speed is lower than the automatic turbo boost.

                      (edit3) Now with my CPU OC'd to 4 GHz: https://browser.geekbench.com/v5/cpu/16594923
                      it is 55% faster in multi-core and 8.2% slower in single-core. And you already know I prefer multi-core performance, so I am fine with that.
                      That doesn't make sense: the more you overclocked, the worse your single-threaded performance got, when it should also get better. I'm actually kind of surprised, since other 1920X systems have gotten much worse scores. But yeah, something's wrong if overclocking makes single-threaded performance worse. Either your temperature is too high and the chip lowers its clock, or you have a lot of memory errors that could be hurting performance. It's also possible you don't have enough voltage.

                      Comment
