Linux 5.10 ARM64 Has A "8~20x" Performance Optimization Forgotten About For Two Years

Phoronix: Linux 5.10 ARM64 Has A "8~20x" Performance Optimization Forgotten About For Two Years

Last week brought the main set of ARM 64-bit architecture updates for Linux 5.10, while today a second batch of changes was sent in for this kernel. That first round had the Memory Tagging Extension (MTE) and Pointer Authentication support among other improvements, while this secondary pull has two notable performance optimizations...
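For anyone who hasn't run into the Memory Tagging Extension before, here is a rough toy sketch in portable C that only mimics the idea; it is not the real MTE API, which requires ARMv8.5 hardware plus the new kernel support (e.g. prctl() with PR_SET_TAGGED_ADDR_CTRL). MTE stores a 4-bit tag in pointer bits 56-59 and a matching tag for every 16-byte granule of memory; a load or store whose pointer tag does not match the memory's tag faults, flagging use-after-free and out-of-bounds bugs:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define TAG_SHIFT 56   /* MTE keeps the logical tag in pointer bits 56-59 */

    /* Colour a pointer with a 4-bit tag (64-bit platforms only;
     * this toy never dereferences the tagged pointer). */
    static void *tag_ptr(void *p, uint8_t tag)
    {
        uintptr_t u = (uintptr_t)p & ~((uintptr_t)0xF << TAG_SHIFT);
        return (void *)(u | ((uintptr_t)(tag & 0xF) << TAG_SHIFT));
    }

    static uint8_t ptr_tag(const void *p)
    {
        return (uint8_t)(((uintptr_t)p >> TAG_SHIFT) & 0xF);
    }

    int main(void)
    {
        uint8_t mem_tag = 0x3;            /* tag the hardware would keep per 16-byte granule */
        char *buf = malloc(16);
        char *good  = tag_ptr(buf, 0x3);  /* matching tag: access allowed */
        char *stale = tag_ptr(buf, 0x7);  /* mismatched tag: would fault  */

        printf("good access:  %s\n", ptr_tag(good)  == mem_tag ? "allowed" : "tag fault");
        printf("stale access: %s\n", ptr_tag(stale) == mem_tag ? "allowed" : "tag fault");
        free(buf);
        return 0;
    }

On real MTE hardware the comparison happens in the load/store path and the tags live in dedicated tag storage; the toy above just makes the bookkeeping visible.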

• #2
  Some RPi 4 benchmarks would be nice.


• #3
  I wonder how much performance AI programmers could squeeze out of our hardware, since we humans tend to forget things or make errors all the time.


• #4
  Originally posted by ms178:
  I wonder how much performance AI programmers could squeeze out of our hardware, since we humans tend to forget things or make errors all the time.

  I wonder how much performance human programmers could squeeze out of our hardware if they weren't so damn lazy. It seems like once the Core 2 Quad was released, a lot of developers just gave up on trying to optimize anything. As a result, we ended up with a lot of bloatware, got jaded by the idea of a single web browser tab taking up over 2 GB of RAM, and got games that look like they were released 10 years ago but demand a $2,000 PC to play them.


• #5
  AI is only so good. It's still relatively stupid and will make its own mistakes. We just have to live with it.


• #6
  Originally posted by schmidtbag:
  I wonder how much performance human programmers could squeeze out of our hardware if they weren't so damn lazy. It seems like once the Core 2 Quad was released...

  (In good humour - and from someone who's only ever dabbled in Linux kernel coding.)
  Substitute "once the 80286 was released" or "once the 68020 was released" and you have what people have been saying my entire career ;-)

  Programming is hard, and solving one decently hard problem (like a root-and-branch performance optimisation) in 50+ million lines of *multiarchitecture* code is _really_ hard.
  You never have infinite time to spend, and you never have fewer than a dozen other competing issues to deal with.
  (As an aside, training an AI to balance those issues would also be "non-trivial", but it would be interesting to see how that goes.)

  Bryan Lunduke's "Linux Sucks" stuff is so funny to me because I know where he's coming from - Linux has been my daily driver for two decades and there are still no fewer than half a dozen things that piss me off every day - and yet sitting down to diagnose one of those problems and post a decent bug report takes hours... checking out the code and figuring out where the problem is can take days. Quite often it's one line of code that isn't "wrong", just sub-optimal in some case.
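
  To make that last point concrete, here's a contrived example (nothing to do with the actual 5.10 patch, purely illustrative) of a line that's not "wrong", just accidentally quadratic:

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    /* Correct but sub-optimal: strlen() re-walks the whole string on
     * every loop iteration, turning O(n) work into O(n^2). Nothing
     * here is "wrong" - it just doesn't scale. */
    static void upcase_slow(char *s)
    {
        for (size_t i = 0; i < strlen(s); i++)
            s[i] = (char)toupper((unsigned char)s[i]);
    }

    /* The one-line fix, once somebody notices: hoist the length. */
    static void upcase_fast(char *s)
    {
        size_t len = strlen(s);
        for (size_t i = 0; i < len; i++)
            s[i] = (char)toupper((unsigned char)s[i]);
    }

    int main(void)
    {
        char a[] = "not wrong, just slow";
        char b[] = "not wrong, just slow";
        upcase_slow(a);
        upcase_fast(b);
        printf("%s / %s\n", a, b);
        return 0;
    }

  Spotting that in a code review is easy; spotting it in 50+ million lines, two years after it was merged, is the hard part.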

  So: the Git repo is thataway, have at it ;-)


• #7
  Originally posted by schmidtbag:
  I wonder how much performance human programmers could squeeze out of our hardware if they weren't so damn lazy. ...

  The problem isn't the individual people, it's our industry.

  When your web application takes 80 MB of RAM in the browser when it should take 4 MB, what are the odds the person issuing your paycheck will ask you to cut the bloat, compared to the odds they'll ask you to add a new feature or change the color of the 'subscribe' button? Even in free software, the web application that takes 20 seconds to start and devours RAM on the server and in the browser but innovates rapidly is likely to see broader adoption than the lean, mean application focused on efficient resource use on client and server.

  Likewise for games. Maybe someone had all of the ideas in Minecraft for their own game before Notch, but they were working in C on something that would use 20% of the CPU resources and 5% of the RAM of Notch's Java code. But Notch got to market first and conquered the world, and not enough people said, "This is a resource hog for what it does, I'm not running this crap."

  And when the next iOS or Android needed too many resources to run on older phones, instead of giving Apple and Google the finger and looking for a more efficient mobile operating system, people shrugged and bought newer devices.

  Electron should have been stillborn because everyone refused to use it. Websites like YouTube and Reddit should have imploded from lack of use because they're so resource hungry. Android and iOS should be dead because users should have laughed at the idea that 256 MB of RAM and a single 800 MHz CPU core couldn't provide an excellent end-user experience on a smartphone. ...and here we are.

  Our whole industry is fucked.


• #8
  Originally posted by Happy Heyoka:
  Bryan Lunduke's "Linux Sucks" stuff is so funny to me because I know where he's coming from - Linux has been my daily driver for two decades and there are still no fewer than half a dozen things that piss me off every day

  The way I look at it, the broader free software community contributing to Linux desktops (and *BSD desktops) has an insignificant fraction of the money that Apple, Google, and Microsoft have.

  I'd rather deal with 100 bugs in software put together by thousands of loosely connected and mostly volunteer contributors than 1 bug from a company that has literally more than 100 billion dollars in gross revenue per year. They have literally 10,000 times the resources to throw at QA, and they're too greedy to give it the investment it deserves.

  It still irritates me when things break on my home computer, but when there are bugs on my work laptop running Windows - and there are lots of bugs - I fly into a rage.


• #9
  Originally posted by schmidtbag:
  I wonder how much performance human programmers could squeeze out of our hardware if they weren't so damn lazy. ...

  LMAO, literally the exact same thing was said back in the mid-1990s after the Pentium processor. And how much CPU power is wasted today running Java enterprise apps in frameworks like Oracle's WebLogic and SOA Suite, when a native binary would be literally 1000x faster and consume a tiny fraction of the RAM?


• #10
  Originally posted by schmidtbag:
  I wonder how much performance human programmers could squeeze out of our hardware if they weren't so damn lazy. ...

  Yeah, nowadays it's a pure waste of computational power...
  And the most-used software for writing software is Visual Studio Code: a thing based on Electron, which is based on Node.js, which in turn is based on Chrome's JavaScript engine...
