Ampere Altra Max M128-30 Linux Performance Preview


  • #21
    @drakonas777
    And so what if we want ARM because it seems cool? Don't judge others on how they like to spend their free time.
    If I want an ARM desktop because I want to see how well it can emulate x86 code, does that hurt you? I'm more interested in being able to try it out than in creating a local supercomputer.

    Given the number of Raspberry Pi computers (and alternatives) that are being sold... I'm sure that there's a market for a $200/300 octa-core ARM64 with 16GB DDR4 that can handle a dedicated graphics card. I'm also sure that they are coming soon...

    I've got a second-hand i7 for running games, but I spend most of my free time tinkering with my low-performance Raspberry Pi 4 desktop, trying to get things to run on it that it isn't specifically made for... why? Because I like it, and that is as good a reason as any.



    • #22
      I'm not judging you. Sorry if you got such an impression due to my post. I'm just saying that "want cause cool" and "need cause ISA actually matters to my use case" are vastly different reasons to own an ARM-based computer. That's all.

      As for power consumption, I think a lot of people are quite misled on this matter. Somehow the common belief is that switching ISA from x86 to ARM will significantly and universally reduce power consumption. It's not going to happen just like that. Most of the ARM-based SoC power savings come from heavy use of fixed-function blocks and a tendency to use newer bleeding-edge lithography even compared to x86 platforms. ARM cores themselves, while, yes, being more power efficient, won't dramatically destroy x86 in efficiency, simply because x86 does not spend the absolute majority of its power in the frontend, while x86 power-saving techniques are also advancing. All this discourse is a bit dogmatic, dating back to the 2000s or so, and a lot of people hold on to those mentalities despite the fact that both x86 and ARM are fundamentally different now. I'd say most of the ARM efficiency is in die space, compute density perhaps, also scalability maybe, but not so much in absolute power consumption/performance. If you think that an 8-core ARM CPU (in general-purpose code) will destroy your 5800X using 5-15 W, think again. It may do it with 15-30% less power, but you are not going to get desktop-class performance in a mobile power envelope. Not without custom fixed function + a more advanced node + SW magic like Apple does. It's unrealistic wishful thinking to expect otherwise.
      Last edited by drakonas777; 29 September 2021, 06:27 AM.



      • #23
        Originally posted by brucethemoose View Post

        Maybe if a big platform like Steam or EGS pushes for it? And Unreal/Unity really back it. Otherwise I'm skeptical, given the history of the gaming industry.


        I think ARM is the future, but barring some emulation miracle, commercial PC gaming is one of the hardest markets for it to penetrate.
        I am not saying you do not know or understand these concepts. Just broadening the discussion and giving my verbose opinions in addition to yours.

        TL;DR: Games from major game engines are already running on ARM64. It's not remotely identical to the PC gaming versions, but you can play Call of Duty, Fortnite, Civilization VI...

        Playing devil's advocate: the gaming industry is a highly competitive market. Many big titles have come and gone, both dev studios and publishers. From my biased perspective, game devs are also abused more than their friends in devops/fintech/data-science environments. The gaming industry, just like many others, opts for low-risk investments, AKA the line of least resistance. There are (a few small groups of) highly specialized, talented people that push the boundaries. Once something is achieved, others follow suit.

        Linux Gaming

        In the early 2000s Linux had some AAA games from Epic, id, and smaller studios like BioWare. Most of the ports were done by Loki Entertainment, which went bankrupt in Jan 2002; many of the ports after that were done by icculus and friends. More here: https://en.wikipedia.org/wiki/Loki_Entertainment#Legacy

        Many studios that targeted Linux in the 2000s moved away, including some of my favourites: Quake 3, Doom 3, Unreal Tournament (1999 and 2k4), Neverwinter Nights. Others like Valve and Blizzard were experimenting behind the scenes:
        Jan 2011 https://www.phoronix.com/scan.php?pa...item&px=OTA0NQ
        April 2012 https://www.phoronix.com/scan.php?pa...ux_dampfnudeln

        Luckily Valve committed to cross-platform and possibly saved gamers from a lot of **** (Windows gamers included). Valve did however invest an insane amount of money in Linux over a long time. This was way before Steam Deck or even Steam Box was an idea.

        Some game studios, like FacePunch, were more public about why Linux did not work for them and others like Blizzard did not say much. https://rust.facepunch.com/news/linux-plans

        In conclusion, without efforts like Loki/SDL (and later Valve) Linux gaming would not be close to where it is today. Many others that follow will benefit from those early-day efforts. The same goes for drivers and open standards like Vulkan.

        ARM Gaming

        IIRC the last major push (marketing and funding) was with the Tegra K1 in 2014. That was Nvidia and Unreal. Since then the scene has improved steadily over time.

        Now in 2021 a few major game engines work well on ARM64. The games themselves are targeted at devices like phones and tablets. I suspect ARM gaming will suffer from many of the problems that Linux gaming suffers or suffered from. Fragmentation is probably the worst problem. Drivers and distros cause major problems for games. It is extremely difficult to support many different targets because each needs to be tested...

        Lack of conformance is pretty bad. Take the Raspberry Pi for example, heck let's be more specific... the Pi 4 Model B 8GB. For that single device you will need different compilation targets for Raspberry Pi OS (Raspbian), Debian, and Android. Last year (2020) the 64-bit Raspberry Pi OS (Raspbian) broke completely if you ran "apt-get dist-upgrade" during some parts of the year. You might not like Raspberry Pi OS because it does crazy **** like add Microsoft sources to your package manager without you knowing about it https://www.raspberrypi.org/forums/v...7026c#p1807702 . How well does https://github.com/raspberrypi/userland work in different distros? Do you support VC4 or V3D drivers? Are consumers using old-default firmware, new-default firmware, or open EFI-based firmware? https://rpi4-uefi.dev/v1-14-release-for-pi-4/
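        To illustrate the point: even answering "which GPU stack is this image using?" on a single Pi 4 takes some detective work. A rough sketch with common tools (vcgencmd comes from the raspberrypi/userland tools and only exists on Raspberry Pi OS-style images, so its absence is itself a data point):

```shell
# Which DRM driver is loaded? (vc4 = display/KMS, v3d = the Pi 4's 3D core)
lsmod | grep -E '^(vc4|v3d)' || echo "no vc4/v3d module loaded"

# Are KMS device nodes present at all?
ls /dev/dri/ 2>/dev/null || echo "no /dev/dri (no KMS driver active)"

# Firmware version via the userland tools (Raspberry Pi OS images only)
command -v vcgencmd >/dev/null && vcgencmd version || echo "no vcgencmd (not a userland-based image)"
```

        Every branch of that sketch can legitimately differ between two images for the same board, which is exactly the testing-matrix problem described above.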

        What about discrete GPU support? The Pi 4 compute module has PCIe support...



        Take that concept and factor in different models and other operating systems like Windows and macOS. This nightmare is already bonkers, and it just gets orders of magnitude worse when you have completely different SoCs that have varying levels of support for different operating systems. Some have better support for Android, others for iOS, etc... How do you inform the consumer whether your device is supported or not? How do you fix a problem in a device that worked previously?

        For now, in the grand scheme of things, ARM gaming seems to be limited to mobile platforms like Android and iOS.

        Conclusion

        ARM is very useful for servers, workstations, mobile, and embedded devices. For "PC gaming"... I am not the biggest fan of x86 and the specs that come with it (like UEFI). In a utopia, consumers would be able to choose between patent-free uarch implementations and everyone would conform to open specs. In reality I think we need something closer to the x86 model to make supporting devices a viable task. I don't know if ARM's business model is the theoretical middle ground between those two concepts. ARM in theory and ARM in practice are two very different things. For example, Nvidia could have bought ARM and decided to **** it up.

        I am not sure if ARM-based "PC gaming" is the future. IMO it's just too far off to say. PC enthusiasts pay crazy amounts of money just to have the best system or, in some cases, the "best brand". It's about pride, not about price/performance. The tech journalists have at least become less biased in this regard; most of the big YouTube channels give out good advice, warning consumers about rebrands and models that share a name but have different parts... giving people clear indications of when it's worth it to upgrade. Unfortunately, the majority of gamers don't watch technical YouTube channels, which is why I don't see any gamer trying out ARM of their own accord.

        If PlayStation and Xbox moved to ARM it could really help the situation. Meanwhile we see more x86 related gaming with devices like Steam Deck that are making headlines. That makes me much more sceptical.

        Personally I don't see myself moving away from x86 in the next 8 years. VFIO gaming is just so incredibly awesome I can't go without it. Given the limited support even in x86 based hardware I would be very shocked to see it working on par without problems on ARM implementations by ~2025 and still surprised to see it by ~2030. I do hope that there's more choice by that time even if it doesn't suit my specialized gaming needs.
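        On the VFIO point: one reason passthrough is so hardware-dependent even on x86 is IOMMU group layout. A rough sketch for inspecting it on any Linux box (standard sysfs paths; the directory is empty unless the IOMMU is enabled, e.g. via intel_iommu=on or amd_iommu=on on the kernel command line):

```shell
# Print each IOMMU group and the PCI devices it contains.
# Devices can only be handed to a VM a whole group at a time,
# so a GPU sharing a group with its root port is the common blocker.
for g in /sys/kernel/iommu_groups/*; do
    [ -e "$g" ] || continue          # glob didn't match: IOMMU is off
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        lspci -nns "${d##*/}"        # bus address plus vendor:device IDs
    done
done
```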



        • #24
          Originally posted by brucethemoose View Post
          That's quite competitive.

          The next Ampere designs will probably be ARMv9, right? That could be another huge boost for toolchains/apps that can take advantage of it, while Intel/AMD are stuck with a fragmented x86 ISA for the foreseeable future.
          Arm also suffers from the restrictions caused by backwards compatibility, just like x86. Binary distros have a baseline of the original Armv8-A, with the use of newer instructions conditional on run-time checks (e.g. via IFUNCs, or some hand-rolled equivalent), and some features like BTI bodged in by abusing the HINT space in Armv8-A (previously defined to be NOPs, now defined to have architecturally-visible side-effects) so binaries can use those features whilst still running on older processors.



          • #25
            Hi++

            I am wondering how Ampere performs with x265, 6 years after my https://mailman.videolan.org/piperma...er/009348.html



            • #26
              Originally posted by drakonas777 View Post
              I'm not judging you. Sorry if you got such an impression due to my post. I'm just saying that "want cause cool" and "need cause ISA actually matters to my use case" are vastly different reasons to own an ARM-based computer. That's all.

              As for power consumption, I think a lot of people are quite misled on this matter. Somehow the common belief is that switching ISA from x86 to ARM will significantly and universally reduce power consumption. It's not going to happen just like that. Most of the ARM-based SoC power savings come from heavy use of fixed-function blocks and a tendency to use newer bleeding-edge lithography even compared to x86 platforms. ARM cores themselves, while, yes, being more power efficient, won't dramatically destroy x86 in efficiency, simply because x86 does not spend the absolute majority of its power in the frontend, while x86 power-saving techniques are also advancing. All this discourse is a bit dogmatic, dating back to the 2000s or so, and a lot of people hold on to those mentalities despite the fact that both x86 and ARM are fundamentally different now. I'd say most of the ARM efficiency is in die space, compute density perhaps, also scalability maybe, but not so much in absolute power consumption/performance. If you think that an 8-core ARM CPU (in general-purpose code) will destroy your 5800X using 5-15 W, think again. It may do it with 15-30% less power, but you are not going to get desktop-class performance in a mobile power envelope. Not without custom fixed function + a more advanced node + SW magic like Apple does. It's unrealistic wishful thinking to expect otherwise.
              If you read the article, it shows that on average the Altra used two-thirds of the power of the EPYC 7763. That's a huge difference for a server, particularly since a lot of power goes to RAM, cooling, and the power supply. It's a game changer for datacenters, where it lowers cooling costs and increases density.

              And yes, it's a fact that M1 does destroy every x86 consumer CPU while using a tiny fraction of the power. That has nothing to do with fixed function blocks or software magic since it does it on pretty much every benchmark you throw at it. It's just the widest and deepest OoO out there.



              • #27
                drakonas777
                tildearrow

                x86 can be (relatively) low power, but you have to re-purpose a laptop. The big problem with x86 desktop boards is that pretty much none of them will even turn on with less than 40 W.



                • #28
                  Can the kernel boot log be shared?
                  Also the topology of cores, PCI, MC, and so on...
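                  For reference, a rough sketch of standard Linux commands that would collect that information (numactl may need installing; dmesg may require root on locked-down systems):

```shell
# Kernel boot log (first 40 lines as a sample)
dmesg 2>/dev/null | head -n 40

# Core / socket / NUMA topology per logical CPU
lscpu --extended

# NUMA node sizes and distances, a proxy for memory-controller layout
numactl --hardware 2>/dev/null || echo "numactl not installed"

# PCI topology rendered as a tree
lspci -tv
```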

