
Tiger Lake + Renoir On Ubuntu Linux For Battery vs. AC Performance


  • #21
    Originally posted by commarmi
    Hello,

    maybe someone commented it, but: https://arstechnica.com/gadgets/2020...lay-heres-why/

    Br,
    Exactly what I wanted to say! An extract from that benchmark:
    During this Cinebench R23 run, the laptop [MSI Prestige Evo 14 with Core i7-1185G7] spent its first 10 to 15 seconds running at the full PL2 power limit of 51W, with temperatures up to a blistering 98°C. After that initial extremely high-performance power and heat generating burst, the CPU dropped down to sustain an average power consumption of 34W. By contrast, an 8 core/16 thread Ryzen 7 Pro 4750U—at cTDP upward of 25W—consumed an average of 27.9W, with a high of 29.9W.
    So, what's new here? Exactly nothing, Intel is still playing dirty as usual.
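    For perspective, here is a back-of-envelope sketch in Python of what those quoted figures mean for time-weighted average power. The 15-second burst and the wattages come from the Ars quote above; the 10-minute (600 s) total run length is an assumption purely for illustration:

```python
# Back-of-envelope: time-weighted average power over a Cinebench-style run.
# Wattages and burst length are from the quoted Ars benchmark; the 600 s
# total run time is an assumed figure for illustration only.

def avg_power(phases):
    """phases: list of (duration_s, watts); returns the time-weighted mean in W."""
    total_time = sum(t for t, _ in phases)
    total_joules = sum(t * w for t, w in phases)
    return total_joules / total_time

# Core i7-1185G7: ~15 s burst at the 51 W PL2 limit, then ~34 W sustained.
tiger_lake = avg_power([(15, 51.0), (585, 34.0)])

# Ryzen 7 Pro 4750U: ~27.9 W average over the same assumed 600 s.
renoir = avg_power([(600, 27.9)])

print(f"Tiger Lake ~{tiger_lake:.1f} W avg vs. Renoir ~{renoir:.1f} W avg")
```

    Even with the 51 W spike, the burst is short enough that the sustained draw dominates the average; the gap to Renoir comes from the ~34 W sustained figure, not the initial burst.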



    • #22
      Michael found pretty much the same thing Charlie Demerjian found over at his site semiaccurate.com

      Charlie was more explicit, calling what Intel was trying to promote "bullshit". Have a read of a snippet; the link to the article is below.

      " Today Intel is trying to show that their ‘Evo’ tuning results in a better laptop than AMD’s CPUs. SemiAccurate went over the details and checked up on the claims, and the result is mixed. "

      " With that out of the way, the official reason for the Evo brand isn’t just blatant anti-competitive behavior is it makes a more usable laptop. They do this by tuning the devices to narrow the gap between performance on AC power and performance on battery power, a laudable goal. Intel claims a 5% drop in performance when moving to battery power and that AMD has a 38% performance drop. Seems like a serious problem for AMD, right? Sort of but that is a small enough slice of the bigger picture that it is mostly ignorable. Let’s take a look at why. "

      " Intel goes on to show how AMD’s performance radically drops on DC power vs AC, and Intel wins on each test as well. The problem? The tests are MobileMark 2018, PCMark 10, WebXPRT v3, SysMark 25, a PPT to PDF conversion/save, Excel to Word import, Word to PDF, and Outlook mail merge. Again what is the problem? The canned tests are about as close to bullshit as you can get. Anything that uses them should be seen as badly skewed data and any conclusions they draw should be dismissed out of hand, it is that bad."

      " As for the others, well, being a Linux user I am on Libre Office rather than MS Office so I can’t say the numbers presented are out of line. That said what kind of Word file takes 30-60 seconds to save as PDF on a modern machine with a PCIe3/4 SSD? I could go on but you get the idea. From SemiAccurate’s point of view, we don’t see many people doing large mail merges and huge batch conversions on battery power repeatedly as part of an average user’s casual work day. "

      The whole scathing article about Intel's attempt to resurrect the "Ultrabook" era with Project Athena and these bullshit benchmarks is below. It tracks nicely with Michael's findings.



      • #23
        Of course...every Intel and AMD CPU and SoC look like pieces of power hungry shit at their most power sipping compared to Apple's M1.



        • #24
          Between Phoronix and the linked Arstechnica and SemiAccurate articles, I learned something today. Good work all around. Except for Intel.

          I do wonder if AMD will massage their power/performance implementation a bit, though. They seem to be running conservatively in some cases where it seems like users might prefer more performance. Arstechnica mentioned boot, for instance. That seems like a good time to run at full speed, and I would wonder what the reasoning is if they don't.



          • #25
            Originally posted by Jumbotron
            Of course...every Intel and AMD CPU and SoC look like pieces of power hungry shit at their most power sipping compared to Apple's M1.
            Welcome to the Phoronix Forums, the largest community of those devoted to enriching the Linux hardware experience, Linux graphics drivers, and more.

            Good luck running Linux on Apple ARM hardware.



            • #26
              Originally posted by Jumbotron
              Tell you what son. Why don't you make your next shit post a public condemnation of the owner of this site for originally posting a detailed benchmark article on Apple's M1. Oh, yeah, condemn his SECOND article detailing how he added the M1 to his benchmarking suite.

              Piss off junior. My statement stands as factual. x86 CPUs will NEVER achieve power per watt performance of ANY high end ARM SoC much less the M1.
              starshipeleven will be back from vacation soon. You are giving him a lot of raw material to work with.

              I can't find any links to the GFLOPS/watt rating for the M1. Do you know if anyone independent has run a LINPACK or other FP tests on the M1 to see what its actual PPW rating is?



              • #27
                So "race to sleep" is a common trend now: fire the CPU cores up to a high speed, get the work done quickly, and then park them again. Conversely, if "getting the work done quickly" means slamming the cores well out of their favorable V/F range and up against the maximum power limit, you're probably going to absolutely demolish battery life.

                I'd like it if the Phoronix Test Suite could calculate the total energy (in milliwatt-hours) consumed by each machine to complete the test. Maybe turn the screens off or set them to the same brightness to try and null out extraneous power draws. That would tell the real story.

                If Intel can complete the test with their super boosting technology while consuming less overall power than the AMD system, then they are in the right. If their system is doing this while consuming significantly more total energy, well, then they have nothing.
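                A minimal sketch of what such an energy-to-completion metric could look like, assuming one power reading per second from a wall meter; this is hypothetical code, not anything the Phoronix Test Suite actually ships:

```python
# Sketch: total energy to complete a test, integrated from periodic power
# samples (rectangle rule). Hypothetical; not actual Phoronix Test Suite code.

def energy_mwh(watt_samples, interval_s):
    """Integrate power samples (W) taken every interval_s seconds into mWh."""
    joules = sum(p * interval_s for p in watt_samples)
    return joules * 1000.0 / 3600.0  # 1 mWh = 3.6 J

# Illustrative run: burst to 50 W for 10 s, then 30 W for 90 s to finish.
samples = [50.0] * 10 + [30.0] * 90
print(f"{energy_mwh(samples, 1.0):.0f} mWh to complete")
```

                Comparing that number across machines (with screens normalized, as suggested above) would show whether the fast-bursting system actually uses less total energy per run.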



                • #28
                  Originally posted by Jumbotron
                  Piss off junior. My statement stands as factual. x86 CPUs will NEVER achieve power per watt performance of ANY high end ARM SoC much less the M1.
                  Unless Apple decides to let us run open source on it, the M1's power per watt performance improvement over current day x86 is really just a threat to our freedoms. I hope it fails.

                  Long live the Raspberry Pi!



                  • #29
                    Originally posted by Slartifartblast

                    Welcome to the Phoronix Forums, the largest community of those devoted to enriching the Linux hardware experience, Linux graphics drivers, and more.

                    Good luck running Linux on Apple ARM hardware.
                    FWIW, if Apple can do it, someone else will probably try it too, maybe with great success. Maybe even AMD. Intel is famous for not inventing anything: from the i7 2700K (overclocked) to the 10900K, only ~1% perf improvement per generation. AMD typically provides around 30 to 100% speedup per generation; for instance, people switch from an X USD 4-core to an X USD 8-core, and perf roughly doubles. Now the 3rd-gen Zens are again twice as fast as the first gen. Then there's Apple, with lots of chip-level innovation. Of course they also have a marketing department. But Intel only has the marketing department (+ truckloads of 14nm++ wafers). Hard to compete.



                    • #30
                      I don't really see why laptops running with reduced performance on battery is news - but this is Intel moving the goalposts yet again, because one of their last bastions (raw single-thread performance, e.g. many games) has been comprehensively bombarded by AMD with Zen 3.

                      My old workstation laptop - an Intel i7 4800-something with Iris Pro graphics - would see reduced performance in both Windows and Linux when running on battery, and that doesn't have a lot of the super modern internal monitoring/clocking/boosting algorithms that modern CPUs do without the OS even getting involved.

                      But this is basically just a marketing exercise. What was the saying about dogs? "The smallest dog barks the loudest"... well, when AMD, Intel and nVidia have had amazing products, they've been fairly quiet pre-launch. It's when the product is weaker that they talk it up. Happens in every field, I think. Not too surprising, but a little disappointing. Makes me worried that the new Intel chips next year won't be quite as amazing as hoped, though.
