AMD Ryzen 3 3100 + Ryzen 3 3300X Offering Great Budget Linux CPU Performance

  • #21
    Originally posted by andre30correia:

    Buy a new generation, and if I were you the Ryzen 5 3600 is the better choice because of more cores for the years ahead.
    I agree on the new generation, not on the model.

    You can see the difference between the 3300X and the 3600X is around 10%, maybe 15%. The 3600 might only be 5-7% faster than the 3300X. That doesn't justify a 50€ price difference in my eyes (it will probably be more like 135-140€ in Europe), and I'd rather spend that money on better RAM or a better AM4 motherboard instead. I could probably get a CPU + 16 GB DDR4-3200 RAM + B450M motherboard for around 250-270€ with the Ryzen 3300X. That is an unbeatable price for that quality, and around 3x the performance of my current setup (close to the A10-7870K in the test).
    Last edited by Mez'; 07 May 2020, 02:11 PM.
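To put the figures in the post above in perspective, here is a rough performance-per-euro check. The prices and the 5-7% uplift are the post's own assumptions, not benchmark data, so the exact numbers are illustrative only.

```python
# Rough performance-per-euro comparison using the figures from the post above.
# All inputs are the poster's assumptions (illustrative, not measured).
price_3300x = 135.0   # assumed 3300X street price in Europe (EUR)
price_3600 = 185.0    # assumed 3600 price: roughly 50 EUR more
uplift_3600 = 1.07    # assumed: 3600 about 5-7% faster than the 3300X

perf_per_euro_3300x = 1.0 / price_3300x        # 3300X performance normalized to 1.0
perf_per_euro_3600 = uplift_3600 / price_3600

advantage = perf_per_euro_3300x / perf_per_euro_3600
print(round(advantage, 2))  # ~1.28: the 3300X delivers ~28% more performance per euro
```

Under those assumptions, the 3600's small speed advantage is outweighed by its higher price, which is the point the post is making.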



    • #22
      Originally posted by birdie:
      Intel laugh thread again? The 14nm Core i5 9400 has an idle power consumption of around 4W, while its super-duper 7nm AMD counterparts eat around 18W.

      And before you say "...but AMD is FASTER", remember that for 95% of people out there, 95% of the time, CPUs sit idle and do nothing.

      Oh, and to run these Ryzen CPUs you need a dedicated GPU, so... it looks like the Core i3 10100 is a clear winner here.

      BS



      • #23
        Originally posted by tildearrow:

        ...In the meanwhile, the AMD high-end card consumes 4W when idle, while its NVIDIA counterpart consumes more than 15W.
        Say again? How many such cards are in use worldwide? And how many of them actually sit idle? And the source for that, please? Because GTX 16/RTX cards are extremely power efficient for their 12nm node.

        Speaking of idle power consumption, in case people want to see the raw data:

        Paints quite a grim picture, doesn't it? Not everything is rosy in AMD fanboy land.



        • #24
          Originally posted by birdie:
          Intel laugh thread again? The 14nm Core i5 9400 has an idle power consumption of around 4W, while its super-duper 7nm AMD counterparts eat around 18W.
          And before you say "...but AMD is FASTER", remember that for 95% of people out there, 95% of the time, CPUs sit idle and do nothing.
          Oh, and to run these Ryzen CPUs you need a dedicated GPU, so... it looks like the Core i3 10100 is a clear winner here.
          Following your link, you also forgot that the Intel Core i7 7700K is better than the Ryzen 3300X... if you play a lot of Dwarf Fortress, that is. Hahaha



          • #25
            Originally posted by Veto:
            Following your link, you also forgot that the Intel Core i7 7700K is better than the Ryzen 3300X... if you play a lot of Dwarf Fortress, that is. Hahaha
            I know there's a crazy AMD asslicking cult nowadays, but I've never mentioned the 7700K even once.

            What I mentioned is the Core i3 10100, which is a better overall CPU: it doesn't require a discrete GPU, has better idle power consumption, and sports similar performance:

            https://ark.intel.com/content/www/us...-4-30-ghz.html



            • #26
              Oh, and the Core i7 7700K (i.e. essentially the Core i3 10100, which is going to be released in less than two weeks) is consistently faster in games than the new Ryzen 3X00 parts:

              And, I beg your pardon, in Windows games of course. Linux is barren earth at the moment.



              • #27
                Hint: if you stop feeding the troll, he'll go away. Although I'm not sure whether he's a troll, a shill, or a brainless Intel fanboy.



                • #28
                  Originally posted by birdie:

                  Say again? How many such cards are in use worldwide? And how many of them actually sit idle? And the source for that, please? Because GTX 16/RTX cards are extremely power efficient for their 12nm node.

                  Speaking of idle power consumption, in case people want to see the raw data:

                  Paints quite a grim picture, doesn't it? Not everything is rosy in AMD fanboy land.
                  Wow, you're fast with calling people fanboys. I just provided a source; that always helps. Now to your sources, I quote
                  Guru3D: "Note: This round we used an X570 ROG Crosshair VIII Formula, opposed to earlier process tests on other motherboards, this mobo is responsible for slightly higher (>10W) wattage, likely all the RGB LEDs are responsible for that."
                  Also, the specs of the other systems are not shown (at least from a quick scan), so yeah...

                  The TechPowerUp data is OK; they list everything. You can see that X470 is used for Zen 1 and X570 for Zen 2. It was already known that the latter chipset consumes more power; I hope that gets better.
                  Still, on Hexus the total power difference is 2 W, on TechPowerUp it is 10 W, and the Guru3D number is useless.
                  So much for the original claim of a 14 W difference from the CPU alone... yeah...

                  PS: "slightly higher (>10W) wattage"
                  Last edited by thxcv; 07 May 2020, 02:46 PM.
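The arithmetic in the post above can be sketched out. All wattages are as quoted in this thread (Hexus, TechPowerUp, Guru3D); attributing the full overhead to the X570 board is the post's assumption, not a measurement.

```python
# Sketch: reconciling the claimed 14 W CPU-only idle delta with the
# whole-system deltas the reviews actually measured (figures as quoted).
claimed_cpu_delta = 18 - 4                        # W: the "18 W AMD vs 4 W Intel" claim
system_delta = {"Hexus": 2, "TechPowerUp": 10}    # W: measured whole-system idle deltas
board_overhead = 10                               # W: Guru3D's ">10 W" X570 board note

for site, delta in system_delta.items():
    # If ~10 W belongs to the X570 board, little is left to blame on the CPU.
    residual = max(delta - board_overhead, 0)
    print(f"{site}: at most {residual} W attributable to the CPU")
```

Under those quoted numbers, the whole-system deltas leave essentially nothing of the claimed 14 W to pin on the CPU itself.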



                  • #29
                    Originally posted by birdie:
                    The 14nm Core i5 9400 has an idle power consumption of around 4W, while its super-duper 7nm AMD counterparts eat around 18W
                    ...
                    Originally posted by Anandtech:
                    In both of these graphs, the package power when idle is around 16-17 W. I looked back through the data, and noticed that out of this power only 0.3 W was actually dedicated to cores, with the rest being towards the big IO die, the memory controllers, and the Infinity Fabric.
                    I suspect the X570 chipset is at least partly to blame here. It would be interesting to see the power consumption on a B450 board.
                    Also, the Intel CPUs you mention don't support PCIe 4.0, so it's not exactly an apples-to-apples comparison.

                    Originally posted by birdie:
                    Oh, and to run these Ryzen CPUs you need a dedicated GPU, so... it looks like the Core i3 10100 is a clear winner here.
                    There are plenty of gamers out there who love high-performance budget CPUs like this so they can spend money on other components (usually the GPU). So "clear winner" is in the eye of the beholder...
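The AnandTech quote above can be turned into a quick breakdown. The 16.5 W midpoint is an assumption taken from the quoted 16-17 W range.

```python
# Breakdown of AnandTech's quoted idle package power for Zen 2.
package_idle_w = 16.5   # W: midpoint of the quoted 16-17 W idle package power
core_idle_w = 0.3       # W: the portion AnandTech attributes to the cores

uncore_w = package_idle_w - core_idle_w   # IO die, memory controllers, Infinity Fabric
uncore_share = uncore_w / package_idle_w * 100
print(round(uncore_share, 1))  # ~98.2: nearly all idle draw is uncore, not the cores
```

This is why a chipset or board swap (X570 vs. B450) could plausibly move the idle numbers so much: the cores themselves are a rounding error at idle.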



                    • #30
                      Originally posted by thxcv:

                      Wow, you're fast with calling people fanboys. I just provided a source; that always helps. Now to your sources, I quote
                      Guru3D: "Note: This round we used an X570 ROG Crosshair VIII Formula, opposed to earlier process tests on other motherboards, this mobo is responsible for slightly higher (>10W) wattage, likely all the RGB LEDs are responsible for that."
                      Also, the specs of the other systems are not shown (at least from a quick scan), so yeah...

                      The TechPowerUp data is OK; they list everything. You can see that X470 is used for Zen 1 and X570 for Zen 2. It was already known that the latter chipset consumes more power; I hope that gets better.
                      Still, on Hexus the total power difference is 2 W, on TechPowerUp it is 10 W, and the Guru3D number is useless.
                      So much for the original claim of a 14 W difference from the CPU alone... yeah...

                      PS: "slightly higher (>10W) wattage"
                      You haven't checked AnandTech yet.

                      Anyway, I'm probably just being nit-picky about that, as most people couldn't care less about an extra 10 watts. From what I've seen, average people never change their smartphone/monitor brightness, which is just crazy to me (and painful for my eyes). And there's a huge difference between a display at 80% and at 10% brightness. Peace out.
