Intel 11th Gen Core "Tiger Lake" Launches


  • #11
    I never thought that Ian was pro-AMD; he just isn't biased.

    Comment


    • #12
      Originally posted by phoenk:

      More and more programmers are also moving their workloads to the cloud. I have a four core laptop for work, but I rarely use all of the performance it has to offer since anything reasonably demanding will get run somewhere else anyways. If more cores means more power draw, I'd rather stay where I am in terms of performance so I can maintain long battery life.
      How many cores does the ARM chip have in your phone?

      Parkinson's Law: software expands to fill processor capacity.
      Last edited by Slartifartblast; 02 September 2020, 03:29 PM.

      Comment


      • #13
        ...and the possibility of higher-speed memory compared to Renoir.
        Michael, Renoir mobile already supports DDR4-3200 and LPDDR4X-4266.

        Comment


        • #14
          Originally posted by chuckula:

          Well considering the most expensive AMD GPU that I can buy on Newegg right now is $3K and won't do AV1 decoding, and actually has LESS than 4 CPU cores, I'm pretty sure a NUC that will cost a tiny fraction of that is a much better idea.
          Do you realise this is a six-year-old workstation GPU left over from Dell that has nothing to do with today's AMD stack?

          Comment


          • #15
            Originally posted by grigi:

            If it's a Sunny Cove core, just like Ice Lake, how does a 9% clock increase result in a 20% improvement?
            Tiger Lake uses Willow Cove; it's literally in the article (and on every single tech news website that's covered Tiger Lake until now).

            Comment


            • #16
              Originally posted by phoenk:

              More and more programmers are also moving their workloads to the cloud. I have a four core laptop for work, but I rarely use all of the performance it has to offer since anything reasonably demanding will get run somewhere else anyways. If more cores means more power draw, I'd rather stay where I am in terms of performance so I can maintain long battery life.
              Not all programmers are using "the cloud" to do stuff. For Android development at least, we compile locally, so more cores are great (or rather higher compile throughput, however it's achieved).

              And if you want to work on the Linux kernel, Chromium, AOSP, Firefox, LLVM, Mesa, etc., local build speed matters a lot.
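              The intuition above can be sketched with Amdahl's law: a build's speedup from extra cores is capped by its serial portion (configure steps, final link, etc.). A minimal sketch; the 95% parallel fraction is an illustrative assumption, not a measured value for any of those projects:

              ```python
              # Hypothetical sketch: Amdahl's law applied to a parallel build.
              # parallel_fraction = 0.95 is an assumed value for illustration only.

              def build_speedup(cores: int, parallel_fraction: float = 0.95) -> float:
                  """Estimated speedup of a build whose work is `parallel_fraction` parallelizable."""
                  serial = 1.0 - parallel_fraction
                  return 1.0 / (serial + parallel_fraction / cores)

              for n in (2, 4, 8, 16):
                  print(f"{n:2d} cores -> {build_speedup(n):.2f}x faster")
              ```

              Even under this optimistic assumption, doubling cores never doubles build speed, but going from 4 to 8 or 16 cores still pays off noticeably, which is why local kernel/Chromium builds love high core counts.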

              Comment


              • #17
                Originally posted by chuckula:
                Well considering the most expensive AMD GPU that I can buy on Newegg right now is $3K and won't do AV1 decoding, and actually has LESS than 4 CPU cores, I'm pretty sure a NUC that will cost a tiny fraction of that is a much better idea.
                Who in the hell still shops at Newegg? Ever since the Chinese bought them out a few years ago, that place is trash. The prices suck, and most of the items listed aren't even sold by Newegg but by third-party marketplace sellers you've never heard of, like the video card you linked to. No thanks. My go-to nowadays is Bhphotovideo; they are better by every measure. They are what Newegg used to be.
                Last edited by torsionbar28; 02 September 2020, 10:53 PM.

                Comment


                • #18
                  Originally posted by sandy8925:

                  Tigerlake uses Willow Cove, it's literally in the article (and on every single tech news website that's talked about Tigerlake until now).
                  Willow Cove is essentially the same thing as Sunny Cove, though. There is a change to some of the caching around the chip, but that's expected to have very small effects, overall, with the chip basically being an Ice Lake that's just clocked way faster now that they've fixed their 10nm manufacturing process.

                  Comment


                  • #19
                    When the AMD and Intel CPU and GPU hardware is tested, could you also include power consumption? Excess power draw doesn't just drain laptop batteries; it also contributes to cooling-fan noise, overheating, and throttling away performance once the chip gets hot.
                    AMD's move to 7nm Zen 2 delivers generational improvements of between 15% and 20% for single-threaded tasks, and 25-30% in multithreaded scenarios. It is less power-hungry and often costs less. The AMD CPUs also usually need fewer motherboard changes between generations than the Intel parts, so upgrading the CPU alone is usually easier and cheaper. However, Intel offers the ability to use external GPUs, and to some users this is important.
                    Finally, it should be mentioned that The Linux Foundation prefers the AMD CPU for Linux kernel compile speed. phoronix.com/scan.php?page=article&item=amd-linux-3960x-3970x&num=9

                    Comment


                    • #20
                      Originally posted by phoenk:
                      More and more programmers are also moving their workloads to the cloud. I have a four core laptop for work, but I rarely use all of the performance it has to offer since anything reasonably demanding will get run somewhere else anyways. If more cores means more power draw, I'd rather stay where I am in terms of performance so I can maintain long battery life.
                      Actually, in most cases more cores are the more efficient option. Every CPU architecture has a "sweet spot" for frequency and wattage. If you clock too far outside that range, efficiency drops. So depending on the task, a pair of cores running at 2.3GHz is probably going to use less power than a single core at 4.6GHz. And if you're running 2 independent tasks, the pair of slower cores will complete them faster, since they don't have to keep swapping each task's instructions in and out of cache.

                      Fewer cores also means each core has more work to do at any given time, which pushes the frequency (and power draw) up past that sweet spot. Being able to spread the load across more cores can therefore, in some cases, improve efficiency.

                      Meanwhile, there's no reason you can't have a whole bunch of cores where only one or two clock super high for single-threaded workloads when you need them. With modern technology, there's just no reason to have fewer than 4 cores for desktop/laptop use.
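                      The "two slow cores beat one fast core" claim follows from how dynamic power scales. A rough textbook model (P ∝ C·V²·f, with voltage scaling roughly with frequency, so P ∝ f³) gives a back-of-the-envelope sketch; the cubic exponent and the frequencies are illustrative assumptions, not measurements of any real chip:

                      ```python
                      # Hypothetical sketch of the dynamic-power argument above.
                      # Model: P ~ f^3 (P ~ C*V^2*f, voltage scaling roughly with frequency).
                      # Exponent and frequencies are illustrative assumptions, not measured data.

                      def relative_power(freq_ghz: float, cores: int = 1) -> float:
                          """Dynamic power relative to one core at 1 GHz, under the cubic model."""
                          return cores * freq_ghz ** 3

                      one_fast = relative_power(4.6, cores=1)  # one core at 4.6 GHz
                      two_slow = relative_power(2.3, cores=2)  # two cores at 2.3 GHz

                      # Same aggregate clock (4.6 GHz total), but under this model the two
                      # slower cores draw a quarter of the power: 2 * 2.3^3 / 4.6^3 = 2/8.
                      print(two_slow / one_fast)  # 0.25
                      ```

                      Real chips deviate from the cubic model (leakage, uncore power, voltage floors), but the direction of the effect is the point: spreading parallel work across more, slower cores tends to be the more efficient configuration.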
                      Last edited by schmidtbag; 03 September 2020, 09:26 AM.

                      Comment
