AMD Ryzen 3 3300X vs. Intel Core i3 10100 In 350+ Benchmarks

  • #21
    Originally posted by cjcox View Post
    Probably just me (throw the stones gently), but given the performance of the Intel CPU with iGPU and it's lower power... makes you wonder if Intel gets its act together and does 10nm or less, might it spell trouble for AMD? Granted it seems to be a big "if" for Intel right now.
    By the time intel can produce 10nm en masse, TSMC/AMD will have moved to 5nm. Intel is seriously behind right now due to their lack of process node investment in the 2012-2016 timeframe. Also the performance hit of vulnerability mitigations. While AMD struggled with Bulldozer, Intel chose to prioritize profit; now they're paying the price for that choice.

    Comment


    • #22
      It would be interesting to see the actual power draw of the systems, instead of the sensor readings. Such a big difference while performance is equal actually makes the i3 the winner, I think. And it's not 2-3 watts, but a lot more. I love AMD, but their power usage is always a lot higher, which is a shame, because their performance is good. I'm happy with my Ryzen 3600, but I hope they will improve this in future generations. (Intel's latest 10nm+++++ is mostly bad in this regard, though, but not this i3.)
      Last edited by peterdk; 12 June 2020, 05:51 PM.

      Comment


      • #23
        Originally posted by torsionbar28 View Post
        By the time Intel can produce 10nm en masse, TSMC/AMD will have moved to 5nm. Intel is seriously behind right now due to their lack of process-node investment in the 2012-2016 timeframe, and also due to the performance hit of vulnerability mitigations. While AMD struggled with Bulldozer, Intel chose to prioritize profit; now they're paying the price for that choice.
        It was a long shot... I was sort of doing an "imagine if", remembering that Intel was able to pull off the miracle "you'll never be able to do this" sort of thing back in the Opteron days. But maybe not this time around.

        Comment


        • #24
          Originally posted by cjcox View Post
          Probably just me (throw the stones gently), but given the performance of the Intel CPU with iGPU and its lower power... it makes you wonder: if Intel gets its act together and does 10nm or less, might it spell trouble for AMD? Granted, it seems to be a big "if" for Intel right now.
          Only if they manage hefty IPC improvements, because their 10nm/7nm process, at least for a while, will clock like shit, so there's no "5 gigahurtz" clock speed to save their ass anymore.

          By the time they unfuck their fabs AMD will already be at a smaller node, with far better IPC.

          Comment


          • #25
            Originally posted by peterdk View Post
            It would be interesting to see the actual power draw of the systems, instead of the sensor readings. Such a big difference while performance is equal actually makes the i3 the winner, I think. And it's not 2-3 watts, but a lot more. I love AMD, but their power usage is always a lot higher, which is a shame, because their performance is good. I'm happy with my Ryzen 3600, but I hope they will improve this in future generations. (Intel's latest 10nm+++++ is mostly bad in this regard, though, but not this i3.)
            I have to say you are incorrect on a number of points.

            Michael stated in the article that the average power draw in the compute tests was 35 watts for the i3 vs. 48 watts for the Ryzen 3. That's more than the 2-3 watts you seem to be concerned with, but 13 watts doesn't make the Intel product a "winner". As blackshard and Schmidtbag pointed out in this comment thread, the chipset makes a difference. But you also have to remember that the Ryzen 3000 desktop chips are packing a 12nm I/O die in addition to their 7nm compute dies. That I/O die has a relatively fixed power draw, so it only dominates power usage for low-wattage parts like the Ryzen 3. As you go up the stack, the I/O die will be a smaller and smaller percentage of the power consumed.

            In more constrained environments, AMD already has superior consumption with the Ryzen 4000 APUs compared to 15-45 watt Intel mobile parts. This Ryzen 3 is in a weird no man's land for power, where it's using the desktop architecture that trades power for packaging flexibility. It gives up power efficiency at the low end to be able to use 7nm Zen 2 compute dies that might otherwise have been thrown away. But its target market is going to be cheap office PCs that never draw more than 120 watts from the wall anyway, or budget gaming enthusiasts who care way more about a few extra FPS than a measly 13 watts.
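            To put that 13 watt gap in perspective, here is a back-of-the-envelope sketch. The electricity rate is an assumed example value, and running 24/7 at average compute load is a worst case, so real costs would be lower:

```python
# Back-of-the-envelope: what a sustained 13 W difference costs per year.
# Figures from the article: 48 W (Ryzen 3) vs. 35 W (i3) average compute draw.
# The $0.15/kWh rate is an assumed example; substitute your local tariff.
delta_watts = 48 - 35
hours_per_year = 24 * 365
kwh_per_year = delta_watts * hours_per_year / 1000   # 113.88 kWh
cost_per_year = kwh_per_year * 0.15                  # ~ $17/year, worst case
print(f"{kwh_per_year:.1f} kWh/year, about ${cost_per_year:.2f}/year")
```

            Even under that unrealistic always-loaded assumption, the difference is on the order of a cheap lunch per year.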

            Comment


            • #26
              Are there going to be next-gen CPUs with an iGPU from AMD that draw less power at idle than Intel's CPUs, along with low temperatures? Something with 4 cores and 8 threads? Low power consumption for a server is important to me, and I don't care about performance except when it comes to transcoding live TV content. I suppose undervolting might be an important factor...

              I do want to have a couple of M.2 drives on the motherboard, though, so I would definitely go with ATX. Plus, I also have an Intel 4-port gigabit card for networking. I might plan on switching to an Intel card with 2 SFP+ ports.
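              For the transcoding use case, an iGPU can do the heavy lifting through VA-API on Linux. A rough sketch with ffmpeg (the device path, input file, and bitrate are example values; adjust for your setup):

```shell
# Hardware-accelerated transcode on an Intel/AMD iGPU via VA-API.
# /dev/dri/renderD128 is the usual render node; input.ts and 4M are examples.
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -hwaccel_output_format vaapi \
       -i input.ts -c:v h264_vaapi -b:v 4M -c:a copy output.mkv
```

              With decode and encode both on the iGPU, the CPU cores stay mostly idle, which matters more for your power budget than raw CPU performance.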
              Last edited by GraysonPeddie; 12 June 2020, 11:16 PM.

              Comment


              • #27
                Originally posted by coder View Post
                That entirely depends on what you want it for. If you're already planning to use a dGPU, then the i3's integrated GPU is essentially irrelevant. I'm actually planning to use one on a server board with a BMC, which puts me in effectively the same boat as the dGPU crowd.



                The one blemish on the 3300X is its poor power-efficiency. In this regard, it really deviates from its low/mid-range Ryzen 3k brethren.


                Good point, although at least he's not using a PCIe 4.0 GPU.
                The iGPU can be relevant for streaming gameplay, if the dGPU doesn't support the relevant codec.

                Comment


                • #28
                  Originally posted by M@GOid View Post

                  Trouble? Probably not, since they are not bound to the lack of competitiveness of GlobalFoundries anymore. Unless Intel magically surpasses TSMC's lithography process, AMD is probably in a safe place.

                  In my view, Intel's biggest problem in the near term is not competitive products, but profit margins. If you think about it, on the desktop side they never lacked high-core-count CPUs; they always had them, but were selling them under the Xeon brand, with a fat profit margin. So AMD twisted their arm, forcing them to turn the quad-core/eight-thread i7 into the i3. Core counts of 6/8/10 that were previously exclusive to the Xeon brand are now being sold at much lower profit margins. I have no doubt they could sell a 16-core to compete with the 3950X, but to them that is a race to the bottom of the profit margins, so they are dragging their feet as much as they can.

                  But they undoubtedly are in an uncomfortable place right now. Desktop sales are down, and soon laptops will be too. When Apple finally ditches them for their own ARM designs, Intel will have a very difficult time convincing people they are better than AMD, if even a long-time, prestigious client like Apple ditched them for something else. The fact that their CEO told customers to forget about benchmarks paints the picture that their back is against the wall right now.
                  I don't know if Intel is hurting that much; I really don't have the proof to back that up. From what I do understand, Intel has some incredibly deep pockets, it's just that their greed and laziness is right in your face currently. In 2017, when I built my Intel i7 7700 system, I was glad and felt it made for a powerful gaming/production station, and it still is. But I eventually got so annoyed by all the vulnerabilities, and by the fact that I wanted more than 8 threads but got screwed out of an 8700K. I had a Z270 board and was just pretty sure I would be able to upgrade to an 8700K, and so were many other people on that platform, which thoroughly pissed them off, including me. There are a lot of X570 converts who crossed over from the Z270 platform, stung horribly for reasons revealed over time.

                  The i7 8700K I wanted is easily replaced by the 3600 for only $175. The latest AMD 12-thread CPU vs. the i7 8700K's 12 threads is a pretty good comparison, and the 3600 does not disappoint; it was also cheaper upgrading to AMD instead of a newer Intel platform.

                  I now own a 3600 coupled to an X570 board and a 32GB CL16 3200MHz memory kit, and things could not be better. I was actually pleasantly surprised at how great a system it has been so far, and yes, I make use of those 12 threads; in fact, I am thinking 16 threads will most likely be the sweet spot for my uses. Hopefully I will be able to upgrade to an R7 4700X or greater for an alright price not too far down the road.

                  Anyone right now denying that a 3600 makes for one of the most powerful systems you can build around needs a reality check, because it totally crushes due to its affordability for those such as myself.

                  What I think is about to happen is that we are about to experience the greatest technological development stall ever recorded in recent human history. What we will experience is a software revolution in terms of thinking outside the box, and the first true renaissance and evolution of the Linux operating system, especially after finding out just what it can do while the world is in dire straits.

                  Steam Proton is an incredible example of what a revolutionary and evolutionary leap Linux has already made. (Proton, in my opinion, contrary to what other people think, is an already refined and effective technology.) It is a true miracle brought down into manifestation by magicians and craftsmen.
                  Last edited by creative; 13 June 2020, 03:26 PM.

                  Comment


                  • #29
                    Originally posted by duby229 View Post

                    Hold up... Can you name -any- Intel CPU that -doesn't- have an APU? I'm reasonably certain there aren't any. ALL Intel CPUs have an integrated GPU. And it's been practically identical for the last 5 generations, and was inadequate even in the first gen it was used with. Now that GPU is just dead weight. The vast majority of games don't list Intel GPUs as supported, and it's only useful for retro gaming at the lowest settings and resolutions. Intel only does it so they can claim market share, and in doing so they hurt PC gaming.
                    I strongly disagree. Intel GPUs have become much more performant lately. They're not very useful for any kind of serious gaming, but nowadays the GPU is used for a lot of things: browsers (either by the browser itself, or via WebGL for JavaScript apps), casual gaming, accelerated video encoding/decoding, GPGPU...
                    It's not dead weight at all, if you consider that most computers are preassembled Dell/HP/Acer/whatever machines that will never see a discrete GPU.

                    Comment


                    • #30
                      Originally posted by creative View Post
                      I don't know if Intel is hurting that much; I really don't have the proof to back that up. From what I understand, Intel has some incredibly deep pockets, it's just that their greed and laziness is right in your face currently.
                      Deep pockets are helpful, but they're not everything — it still takes time to turn money into a competitive product.

                      Comment
