Intel Launches 11th Gen Core H-Series "Tiger Lake H"


  • #11
    Doing some quick math here: Intel's advertised TDP of 45 W, plus their insane turbo boost, works out to roughly 135 W of actual power draw under load. So basically these parts will be running at 90°C or higher, leaving burn marks on your desk and causing hearing loss from the tiny fans spinning at a constant 7,000 RPM. Yay?
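
    To put rough numbers on that back-of-the-envelope estimate, here is a minimal Python sketch of average package power under a simple two-level power-limit model. The PL2 and tau values are assumptions for illustration, not Intel's published figures for these SKUs.

    # Toy model: the chip holds the short-term limit (PL2) for roughly TAU
    # seconds, then falls back to the sustained limit (PL1, the advertised TDP).
    # PL2 and TAU below are assumptions, not official Tiger Lake-H specs.
    PL1 = 45.0    # sustained power limit in watts (the advertised 45 W TDP)
    PL2 = 135.0   # assumed short-term turbo power limit in watts
    TAU = 56.0    # assumed turbo window in seconds

    def average_power(duration_s: float) -> float:
        """Average package power over a load of the given length."""
        if duration_s <= TAU:
            return PL2
        return (PL2 * TAU + PL1 * (duration_s - TAU)) / duration_s

    for t in (30, 60, 300, 1800):
        print(f"{t:4d} s load -> ~{average_power(t):5.1f} W average")

    In this sketch, short bursts really do look like a ~135 W chip; only over long, sustained loads does the average creep back toward the 45 W figure.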



    • #12
      Originally posted by torsionbar28 View Post
      Doing some quick math here: Intel's advertised TDP of 45 W, plus their insane turbo boost, works out to roughly 135 W of actual power draw under load. So basically these parts will be running at 90°C or higher, leaving burn marks on your desk and causing hearing loss from the tiny fans spinning at a constant 7,000 RPM. Yay?
      laughs in FX 9590

      I love how we need a spreadsheet for all the turbo boost frequencies



      • #13
        Originally posted by brunosalezze View Post
        "Intel's top-end Core i9 11980HK model is reported by the company to outperform AMD's Ryzen 9 5900HX by 11% to 26% in various (Windows) games."

        With the iGPU? Sure. With a dGPU? No... TGL is too far behind AMD in power efficiency. In single-core workloads the mobile Ryzen 5000 chips hit 4.5 GHz at a peak of 15 W, while TGL requires 30 W for 4.8 GHz, and they score mostly the same. There is a huge efficiency gap between these process nodes, and AMD is still on 7 nm, with 6 nm and 5 nm available; 6 nm is a drop-in upgrade path.
        Well, according to AnandTech, which dug into the test configurations Intel used, it looks like Intel may have given itself a significant TDP advantage in its internal benchmarks.

        "Intel’s benchmarks against AMD are a bit trickier, given it is hard to compare laptop-vs-laptop given that different designs will have different optimization points. However in an iso-comparison with an RTX 3080 (giving AMD the benefit of the doubt on TDP and a bigger chassis for cooling), and with all settings and cooling features and fans set to maximum, Intel is claiming 20% better average gaming performance with a Core i9-11980HK (65W?) against a Ryzen 9 5900HX (45W?), or iso-performance comparing a Core i5-11400H (45 W?) and a Ryzen 9 5900HS (35 W?). There are a lot of question marks there because Intel’s appendix of test setups mentions only ‘maximum performance setting’ for each system, and that in-itself is variable based on the OEM."



        • #14
          Interesting to see so many on-topic posts. Normally, topics like these derail into Intel vs. AMD flame wars.



          • #15
            Originally posted by Vlad42 View Post

            Well, according to AnandTech, which dug into the test configurations Intel used, it looks like Intel may have given itself a significant TDP advantage in its internal benchmarks.

            "Intel’s benchmarks against AMD are a bit trickier, given it is hard to compare laptop-vs-laptop given that different designs will have different optimization points. However in an iso-comparison with an RTX 3080 (giving AMD the benefit of the doubt on TDP and a bigger chassis for cooling), and with all settings and cooling features and fans set to maximum, Intel is claiming 20% better average gaming performance with a Core i9-11980HK (65W?) against a Ryzen 9 5900HX (45W?), or iso-performance comparing a Core i5-11400H (45 W?) and a Ryzen 9 5900HS (35 W?). There are a lot of question marks there because Intel’s appendix of test setups mentions only ‘maximum performance setting’ for each system, and that in-itself is variable based on the OEM."
            To be honest it's not a big deal, since Intel is basically comparing the fastest mobile CPUs from both companies, and when your GPU alone eats around 120 W, "mobileness" goes out the window anyway. It's not fair, of course, but it is what it is.

            Originally posted by Vistaus View Post
            Interesting to see so many on-topic posts. Normally, topics like these derail into Intel vs. AMD flame wars.
            Volta is sleeping. It's temporary.



            • #16
              Originally posted by birdie View Post

              To be honest it's not a big deal, since Intel is basically comparing the fastest mobile CPUs from both companies, and when your GPU alone eats around 120 W, "mobileness" goes out the window anyway. It's not fair, of course, but it is what it is.



              Volta is sleeping. It's temporary.
              But they're not. AMD's highest-end SKUs are the 5980HX and 5980HS at 45 W and 35 W respectively. In addition, the 5980HX is an overclockable model that can be set to a higher-than-45 W TDP.

              While TDP is not the be-all and end-all, it can influence a chip's turbo algorithm (i.e., a chip may turbo more aggressively with a higher TDP because it assumes it has more cooling capacity to deal with the extra heat in a laptop chassis). If TDP did not matter, why would Intel set it to the maximum for its own chip and not do the same for the competition's? Most likely because it gives Intel advantageous benchmark results. If it did not, Intel would benchmark in a way that could not be called into question: either set all performance settings to maximum for both chips (including TDP), or hold TDP, the rated cooling requirement, constant; these are laptop chips, after all. Anything else is an unfair comparison. If Intel wants to assert that its 65 W TDP is equivalent to AMD's 45 W TDP, it needs to provide evidence to that effect.

              As for GPU power draw, that does not matter in the slightest when discussing which CPU performs better. A GPU like that is used solely to minimize the possibility of a GPU bottleneck in gaming benchmarks, thus showing off the difference between the CPUs, assuming there are no other differences in the system configuration. However, most laptops with H-series CPUs will either have a dGPU with much lower power draw and TDP, or no dGPU at all.
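
              On the point above about the power limit alone shifting turbo behaviour, here is a crude cube-law sketch of how a higher programmed PL1 translates into higher sustainable all-core clocks. This is a toy model with made-up reference numbers, not Intel's actual turbo algorithm.

              # Toy model: assume package power scales roughly with f^3
              # (P ~ f * V^2, with voltage rising roughly linearly with frequency
              # near the top of the curve). The reference point is made up.
              REF_FREQ_GHZ = 3.6   # assumed all-core clock sustainable at REF_POWER_W
              REF_POWER_W = 45.0   # assumed sustained power limit for that clock

              def sustained_clock(power_limit_w: float) -> float:
                  """Estimate the all-core clock sustainable at a given PL1."""
                  return REF_FREQ_GHZ * (power_limit_w / REF_POWER_W) ** (1.0 / 3.0)

              for pl1 in (35.0, 45.0, 65.0):
                  print(f"PL1 = {pl1:4.0f} W -> ~{sustained_clock(pl1):.2f} GHz all-core")

              Even this crude model shows why a 65 W configuration against a 45 W one is not a like-for-like CPU comparison.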



              • #17
                Originally posted by Vlad42 View Post

                But they're not. AMD's highest-end SKUs are the 5980HX and 5980HS at 45 W and 35 W respectively. In addition, the 5980HX is an overclockable model that can be set to a higher-than-45 W TDP.

                While TDP is not the be-all and end-all, it can influence a chip's turbo algorithm (i.e., a chip may turbo more aggressively with a higher TDP because it assumes it has more cooling capacity to deal with the extra heat in a laptop chassis). If TDP did not matter, why would Intel set it to the maximum for its own chip and not do the same for the competition's? Most likely because it gives Intel advantageous benchmark results. If it did not, Intel would benchmark in a way that could not be called into question: either set all performance settings to maximum for both chips (including TDP), or hold TDP, the rated cooling requirement, constant; these are laptop chips, after all. Anything else is an unfair comparison. If Intel wants to assert that its 65 W TDP is equivalent to AMD's 45 W TDP, it needs to provide evidence to that effect.

                As for GPU power draw, that does not matter in the slightest when discussing which CPU performs better. A GPU like that is used solely to minimize the possibility of a GPU bottleneck in gaming benchmarks, thus showing off the difference between the CPUs, assuming there are no other differences in the system configuration. However, most laptops with H-series CPUs will either have a dGPU with much lower power draw and TDP, or no dGPU at all.
                I cannot argue with you any longer because you ignored the most important part of my message, which is that the CPU's power consumption is dwarfed by the dGPU's power consumption, so even if mobile Intel CPUs consume a whole 20 W more than AMD CPUs, that doesn't fundamentally change anything. You're still looking at a laptop whose gaming battery life is under two hours. And cooling performance for a 45 W/65 W CPU + 120 W GPU combo shouldn't be too different, because such laptops have long used a single cooling loop for everything. I'm not trying to vindicate Intel or say their PR messaging is accurate or even acceptable, but for the users of such gaming systems performance comes first and everything else is secondary.

                To be honest, I believe gaming laptops are a stupid fad and people who buy them are idiots. They overpay and get an inferior product with shortened longevity and very limited upgrade options. And gaming on such a laptop is just horrible: you cannot sit properly, you have a smaller keyboard, the whole system is noisy as hell, and the screen is small.

                Of course, if you have a ton of money, need to be on the move all the time, and want to game, such laptops are an option. Otherwise a gaming PC or even a console is a much, much better option.
                Last edited by birdie; 11 May 2021, 01:58 PM.



                • #18
                  They should compare against the best of AMD, not a mid-range part. These vendor reports are always garbage. OEMs will need to cap the TDP on these Intel chips if they put them in small laptops. It's always the same story: the first minute is amazing, and after ten you're down 20 to 40% in multi-core performance.



                  • #19
                    Originally posted by birdie View Post

                    I cannot argue with you any longer because you ignored the most important part of my message, which is that the CPU's power consumption is dwarfed by the dGPU's power consumption, so even if mobile Intel CPUs consume a whole 20 W more than AMD CPUs, that doesn't fundamentally change anything. You're still looking at a laptop whose gaming battery life is under two hours. And cooling performance for a 45 W/65 W CPU + 120 W GPU combo shouldn't be too different, because such laptops have long used a single cooling loop for everything. I'm not trying to vindicate Intel or say their PR messaging is accurate or even acceptable, but for the users of such gaming systems performance comes first and everything else is secondary.

                    To be honest, I believe gaming laptops are a stupid fad and people who buy them are idiots. They overpay and get an inferior product with shortened longevity and very limited upgrade options. And gaming on such a laptop is just horrible: you cannot sit properly, you have a smaller keyboard, the whole system is noisy as hell, and the screen is small.

                    Of course, if you have a ton of money, need to be on the move all the time, and want to game, such laptops are an option. Otherwise a gaming PC or even a console is a much, much better option.
                    The GPU's power draw does not matter at all when benchmarking the CPU! You always want to remove the GPU bottleneck when showcasing CPU performance in games, because you want to see how the CPU affects the performance. This is benchmarking 101. Intel was only making a statement about maximum possible performance, not battery life. It is possible, for example, that we could find these processors paired with 45 W dGPUs and 12+ hours of non-gaming battery life. Also, would it be valid to make a general claim that Rocket Lake is better than Zen 3 because you compared a system with a 5800U to one with an 11900K, when both systems have a 2000 W GPU? The logic of your argument says it is, because the GPU power dwarfs everything else.

                    You are probably right that cooling does not matter much, but we would need to examine the models used to know for certain. Apple famously did not turn on active cooling on its Intel-based laptops until the chips were already thermally throttling, and many other OEMs are just really cheap.

                    That said, the CPU will operate differently depending on the TDP it is set to. For example, a CPU could have an effective 3.6 GHz all-core turbo when set to a 45 W TDP and a 4.2 GHz all-core turbo when set to 65 W, due to changes in the turbo algorithm from the TDP alone (and this has nothing to do with operating temperature). Last I checked, on Intel systems the programmed TDP is one of the key factors that determines the tau value on laptop SKUs (desktop SKUs typically have weird motherboard overrides in place). Intel could fairly easily have acquired the AMD chips from a distributor and used an open-air bench platform for both systems with the CPU cTDPs set to the same value; they have done this before.
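
                    For what it's worth, here is a small sketch of the kind of running-average budget that tau governs. It assumes the commonly described behaviour (PL1 enforced against an exponentially weighted moving average of package power, PL2 as the short-term cap) and uses illustrative numbers, not anything from a datasheet.

                    import math

                    # Illustrative values only: sustained limit, short-term limit,
                    # averaging time constant, simulation step (watts, watts, seconds, seconds).
                    PL1, PL2, TAU, DT = 45.0, 107.0, 28.0, 1.0

                    avg = 10.0   # running average starts low, e.g. coming out of idle
                    for t in range(60):
                        # Turbo at PL2 while the average budget allows it, else fall back to PL1.
                        power = PL2 if avg <= PL1 else PL1
                        alpha = 1.0 - math.exp(-DT / TAU)   # EWMA weight for one DT step
                        avg += alpha * (power - avg)
                        if t % 10 == 0:
                            print(f"t={t:2d}s  draw={power:5.1f} W  avg={avg:5.1f} W")

                    In this toy model, raising tau (or PL1) simply stretches how long the chip sits at PL2 before settling back to the sustained limit, which is the sort of behavioural difference described above.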



                    • #20
                      Originally posted by andre30correia View Post
                      They should compare against the best of AMD, not a mid-range part. These vendor reports are always garbage. OEMs will need to cap the TDP on these Intel chips if they put them in small laptops. It's always the same story: the first minute is amazing, and after ten you're down 20 to 40% in multi-core performance.
                      Exactly! You need to make a like-for-like comparison or it is completely meaningless.
