I never thought that Ian was pro-AMD; he just isn't biased.
Intel 11th Gen Core "Tiger Lake" Launches
Originally posted by phoenk View Post
More and more programmers are also moving their workloads to the cloud. I have a four core laptop for work, but I rarely use all of the performance it has to offer since anything reasonably demanding will get run somewhere else anyways. If more cores means more power draw, I'd rather stay where I am in terms of performance so I can maintain long battery life.
Parkinson's Law: software expands to fill processor capacity.
Originally posted by chuckula View Post
Well considering the most expensive AMD GPU that I can buy on Newegg right now is $3K and won't do AV1 decoding, and actually has LESS than 4 CPU cores, I'm pretty sure a NUC that will cost a tiny fraction of that is a much better idea.
Originally posted by grigi View Post
If it's a Sunny Cove core, just like Ice Lake, how does a 9% clock increase result in a 20% improvement?
Originally posted by phoenk View Post
More and more programmers are also moving their workloads to the cloud. I have a four core laptop for work, but I rarely use all of the performance it has to offer since anything reasonably demanding will get run somewhere else anyways. If more cores means more power draw, I'd rather stay where I am in terms of performance so I can maintain long battery life.
And if you want to work on Linux kernel, Chromium, AOSP, Firefox, LLVM, Mesa etc. local build speed matters a lot.
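For builds of that size, the usual way to put every local core to work is to scale the job count to the machine. A minimal sketch, assuming GNU coreutils' `nproc` and a Make-based source tree (the `make` line is left commented since it needs a real checkout):

```shell
# Sketch only: scale parallel compile jobs to the local core count.
# Assumes GNU coreutils (nproc) and a Make-based source tree.
JOBS=$(nproc)
echo "building with ${JOBS} parallel jobs"
# make -j"${JOBS}"    # e.g. inside a Linux kernel or LLVM checkout
```

On a 4-core laptop this caps out at `-j4`; on a 16-core desktop the same command issues 16 compile jobs, which is where local core count pays off.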
Originally posted by chuckula View Post
Well considering the most expensive AMD GPU that I can buy on Newegg right now is $3K and won't do AV1 decoding, and actually has LESS than 4 CPU cores, I'm pretty sure a NUC that will cost a tiny fraction of that is a much better idea.
Originally posted by sandy8925 View Post
Tiger Lake uses Willow Cove; it's literally in the article (and on every single tech news website that's covered Tiger Lake until now).
When the AMD and Intel CPU and GPU hardware is tested, could you also include power consumption? Excess power draw doesn't just drain laptop batteries; it also contributes to cooling fan noise, overheating, and throttled performance once the chip runs hot.
AMD's move to 7nm Zen 2 delivers generational improvements of 15-20% in single-threaded tasks and 25-30% in multithreaded scenarios. It is less power hungry and often cheaper. AMD CPUs also tend to need fewer motherboard changes between generations than Intel's, so upgrading the CPU alone is usually easier and cheaper. However, Intel offers the ability to use external GPUs, which is important to some users.
Finally, it should be mentioned that The Linux Foundation prefers AMD CPUs for Linux kernel compile speed. phoronix.com/scan.php?page=article&item=amd-linux-3960x-3970x&num=9
Originally posted by phoenk View Post
More and more programmers are also moving their workloads to the cloud. I have a four core laptop for work, but I rarely use all of the performance it has to offer since anything reasonably demanding will get run somewhere else anyways. If more cores means more power draw, I'd rather stay where I am in terms of performance so I can maintain long battery life.
Fewer cores also mean each core has more work to do at any given time, which then boosts the frequency. Being able to spread out the load can, in some cases, improve efficiency.
Meanwhile, there's no reason you can't have a whole bunch of cores where only one or two clock super high for single-threaded workloads when you need them. With modern technology, there's just no reason to have fewer than 4 cores for desktop/laptop use.
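A tiny sketch of that load-spreading idea in plain shell: independent jobs run in the background across cores, so total wall time approaches the slowest single job rather than the sum of all of them (the job body here is just a toy stand-in for real CPU-bound work):

```shell
# Sketch: run independent CPU-bound jobs in parallel with shell
# background jobs; 'wait' blocks until every one has finished.
for i in 1 2 3 4; do
    sh -c 'seq 1 100000 > /dev/null' &   # toy stand-in for real work
done
wait
echo "all 4 jobs done"
```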