AMD Announces The Ryzen 3 3100 + Ryzen 3 3300X Processors


  • #51
    Originally posted by squash View Post
    I realize you're solving a problem for a computer for your bedroom, but: ... . Nobody is going to build a run of workstations using $10 ebay video cards if they can get an Intel with on-chip video instead.
    I'm solving the problem for the typical consumer, i.e. you. Vendor support agreements, economies of scale, procurement vehicles: it's a whole different calculus when a business is buying 1000 workstations vs. a consumer shopping for a PC. If you're purchasing a run of commercial workstations rather than a single personal PC, this is the first you've mentioned it. Are you shopping for a run of workstations?

    If you are, I'd recommend the Picasso 3200G/3400G APUs. These are Zen+ but are still current SKUs and are plenty powerful for standard office-type work. There's really no reason to choose Intel in this scenario.

    Originally posted by squash View Post
    . PCIE x16 slots will happily operate an Optane, 10 gig NIC, RAID controller, etc.
    OK, none of which *require* an x16 slot. Mobos have a plethora of slots in various lane widths specifically to handle all of these less intensive cards. Ergo, the x16 slot is wasted when using an iGPU. Why on earth would anyone put an x1 USB card into an x16 slot? Makes no sense.
    Last edited by torsionbar28; 21 April 2020, 07:00 PM.

    Comment


    • #52
      Originally posted by Jedibeeftrix View Post
      it has a long life ahead of it.
      think ryzen 4000 chips selling in 2022 in the same way people are still buying 2600x chips today when zen2 is out.
      For sure. Just for grins, I decided to spec out a previous-gen Ryzen build, thinking I could save a few bucks on an "older" CPU. Amazing how previous-gen Ryzens still command big bucks, both at retail and on the second-hand market.

      Comment


      • #53
        Originally posted by Luke_Wolf View Post

        Um sorry but you were wrong in 2017, you're even more wrong now. Back when I got my Ryzen 7 1700 it was rare to see a newer AAA game that wasn't taking advantage of all 16 threads... Now everyone is using Vulkan and so this becomes even less true now. Just because a quad core can run games doesn't mean it's optimal, and based on the jitter testing it's anything but.
        I am not aware of any AAA game that currently utilizes 16 threads; I mean really uses them. What typically happens is that as the workload gets split across more threads, each core is utilized only up to a fraction of its capacity, not 100%. You almost never see the full number of cores at 100% capacity in any game.

        Also, you rarely see any actual performance gains from more cores. Most tests I have seen that show higher performance are done at extremely low resolutions/settings, which are meaningless for the resolutions and settings people actually play at. The real-world difference in gaming is small.
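
        That "each core busy only to a fraction" claim is easy to check for yourself. Here is a minimal sketch (Linux-only, stdlib-only, assuming the usual /proc/stat field layout) that samples per-core busy fractions over a short interval:

        ```python
        import time

        def percore_busy(interval=0.2):
            """Sample /proc/stat twice and return one busy fraction (0..1) per core."""
            def snap():
                cores = []
                with open("/proc/stat") as f:
                    for line in f:
                        # per-core lines look like "cpu0 user nice system idle iowait ..."
                        if line.startswith("cpu") and line[3].isdigit():
                            vals = list(map(int, line.split()[1:]))
                            idle = vals[3] + vals[4]          # idle + iowait jiffies
                            cores.append((sum(vals), idle))
                return cores

            before = snap()
            time.sleep(interval)
            after = snap()
            # busy fraction = (delta total - delta idle) / delta total, per core
            return [(t2 - t1 - (i2 - i1)) / max(1, t2 - t1)
                    for (t1, i1), (t2, i2) in zip(before, after)]

        if __name__ == "__main__":
            for core, busy in enumerate(percore_busy()):
                print(f"cpu{core}: {busy:.0%}")
        ```

        Run it while a game is loaded; every core sitting at a middling percentage rather than 100% is exactly the pattern described above.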

        Comment


        • #54
          Originally posted by TemplarGR View Post

          I am not aware of any AAA game that currently utilizes 16 threads; I mean really uses them. What typically happens is that as the workload gets split across more threads, each core is utilized only up to a fraction of its capacity, not 100%. You almost never see the full number of cores at 100% capacity in any game.

          Also, you rarely see any actual performance gains from more cores. Most tests I have seen that show higher performance are done at extremely low resolutions/settings, which are meaningless for the resolutions and settings people actually play at. The real-world difference in gaming is small.
          Um lol? Your "proof" that games aren't taking advantage of 16 threads is that.... it's taking advantage of 16 threads but not running the CPUs at 100%. Just who do you think you're kidding here? What kind of a joke are you? The only sense in which running up the CPU matters as opposed to how far the game spreads across cores is how even the core spread is. Which in most games is pretty damn even.

          As far as performance goes, stop looking at maximum FPS and start looking at the minimums and you'll see a world of difference, having 16 threads eliminates most of the CPU related issues with jitter. Which has been demonstrated in blind testing and frame comparison in numerous videos since Ryzen first launched.
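
          For what it's worth, the average-vs-minimum distinction being argued here is easy to make concrete. A hedged sketch (the 50 ms spike numbers are made up purely for illustration) computing average FPS against the "1% low" FPS from per-frame render times:

          ```python
          def one_percent_low(frame_times_ms):
              """Return (average FPS, '1% low' FPS) from per-frame render times in ms.
              The 1% low averages the slowest 1% of frames -- the stutter metric."""
              n = len(frame_times_ms)
              avg_fps = 1000.0 * n / sum(frame_times_ms)
              worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
              low_fps = 1000.0 / (sum(worst) / len(worst))
              return avg_fps, low_fps

          # A steady 16.7 ms stream vs. the same stream with a few 50 ms spikes:
          # the average barely moves, but the 1% low collapses. That collapse
          # is the jitter under discussion, invisible in average-FPS charts.
          smooth = [16.7] * 100
          spiky  = [16.7] * 97 + [50.0] * 3
          ```

          The point of the metric: two CPUs can post nearly identical average FPS while one of them delivers visibly worse 1% lows.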

          Comment


          • #55
            Originally posted by Luke_Wolf View Post

            Um sorry but you were wrong in 2017, you're even more wrong now. Back when I got my Ryzen 7 1700 it was rare to see a newer AAA game that wasn't taking advantage of all 16 threads... Now everyone is using Vulkan and so this becomes even less true now. Just because a quad core can run games doesn't mean it's optimal, and based on the jitter testing it's anything but.
            Which jitter testing are you referring to?

            For gaming in general, single-core performance is still king.
            And if high frame latency is a concern, simply using a soft-realtime Linux kernel solves the problem exceptionally well!
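
            For anyone curious, you don't necessarily need a full PREEMPT_RT kernel to experiment: Linux exposes real-time scheduling classes per process. A minimal sketch (Linux-only; actually switching class needs root or CAP_SYS_NICE, hence the fallback):

            ```python
            import os

            def try_realtime(priority=10):
                """Attempt to move the current process into the SCHED_FIFO
                real-time scheduling class; fall back to the default policy
                if we lack the privilege. Returns the policy now in effect."""
                try:
                    os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))
                except PermissionError:
                    pass  # needs CAP_SYS_NICE or root; keep the default policy
                return os.sched_getscheduler(0)

            if __name__ == "__main__":
                policy = try_realtime()
                print("scheduling policy:", policy)  # SCHED_OTHER unless elevated
            ```

            Same idea as launching a game under `chrt`: a real-time class keeps the game's threads from being preempted by background work, which is one way to attack frame latency.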

            Comment


            • #56
              I pack a 3200G with its Wraith cooler, a B540 board, 16GB of 3200MHz RAM, a 1TB NVMe drive, 22TB of storage spread over 4 HDDs, and a standard ATX PSU into a Coolermaster Elite 110A case with just the 120mm front fan for airflow (on top of the Wraith and PSU fans). It's almost winter but we're still getting 8+ hours a day of 30+ degrees Celsius heat where I live, and the machine runs fine when I crank up Borderlands on Proton while serving files at 100MB/s. It boots to functional in about 5 seconds, faster if I enable the instant-boot option in the BIOS/UEFI.

              Pretty sure that's a far from useless system.

              Comment


              • #57
                David Kanter of RWT fame reckons that 4c/8t is optimal for client systems.

                https://www.realworldtech.com/forum/...rpostid=191571

                I've decided to jump from an FX-6300 to a cheap Athlon 3000G system, and I'll probably hold out a couple of months until Zen 4 releases; the prices of either the 1600 AF or the 3100X should drop. If not, I'll pick up the Zen 4 equivalent of a 3100X.

                Comment


                • #58
                  BTW, these seem to be an interim solution until the Ryzen 4000 parts arrive. The 3400G was getting unpleasantly expensive and even the 3200G needed a bump, which this provided.
                  Better power scheme, more cache, slightly faster cores, better IF, a notch higher memory speeds, and probably more OC potential.
                  Nice, but probably only a sign that the upcoming desktop APUs will be thunderous.

                  Comment


                  • #59
                    Originally posted by StandaSK View Post

                    The i7-7700K makes that 3 years.
                    The 7700K was trash even then. Only gullible gamers bought it, following advice from "bright" techtubers like GamersNexus.

                    Comment


                    • #60
                      Originally posted by Luke_Wolf View Post

                      Um lol? Your "proof" that games aren't taking advantage of 16 threads is that.... it's taking advantage of 16 threads but not running the CPUs at 100%. Just who do you think you're kidding here? What kind of a joke are you? The only sense in which running up the CPU matters as opposed to how far the game spreads across cores is how even the core spread is. Which in most games is pretty damn even.

                      As far as performance goes, stop looking at maximum FPS and start looking at the minimums and you'll see a world of difference, having 16 threads eliminates most of the CPU related issues with jitter. Which has been demonstrated in blind testing and frame comparison in numerous videos since Ryzen first launched.
                      Dude, you are "loling" because you don't understand hardware and coding. Yet another gamer kid on the internet... If the cores aren't fully exploited, then it doesn't matter that the code is multithreaded. In fact, it is BETTER to have fewer cores (and fewer threads) in that case, because the cost of coordinating those threads is not worth it.

                      The reason you are witnessing an even core spread (at low utilization for most cores) is precisely that they can't go higher, due to all the sync problems involved. It is a gimmick. Multiple threads can run on the same core taking turns, you know... That is how threads have worked since time immemorial. As long as the core and RAM are fast enough to serve those threads, you just calculate one after the other. It is how single-core systems used to operate; we didn't have only 1 thread in the OS back in the Pentium 4 days...

                      In fact, you are oftentimes better off calculating multiple threads on the same core or cluster of cores, if they share memory, in order to utilize the caches more.
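
                      The coordination-cost point can be sketched in a few lines. This toy example (made up for illustration, not from any game engine) splits a sum across worker threads; the answer is identical whether you use 1 thread or 16, and the lock is the synchronization cost that grows with thread count:

                      ```python
                      import threading

                      def parallel_sum(data, n_threads):
                          """Split a sum across n_threads workers. The result is the same
                          for any thread count; only the coordination overhead changes,
                          so more threads is not automatically faster."""
                          total = 0
                          lock = threading.Lock()
                          chunk = (len(data) + n_threads - 1) // n_threads

                          def worker(part):
                              nonlocal total
                              s = sum(part)       # the real work
                              with lock:          # the synchronization cost
                                  total += s

                          threads = [threading.Thread(target=worker,
                                                      args=(data[i*chunk:(i+1)*chunk],))
                                     for i in range(n_threads)]
                          for t in threads:
                              t.start()
                          for t in threads:
                              t.join()
                          return total

                      data = list(range(100_000))
                      ```

                      Whether the extra threads pay for their locks, queues, and cache traffic depends entirely on the workload, which is the crux of the disagreement here.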

                      So no, you aren't seeing improved performance due to 16 threads. That is a hoax, and only incompetent tech "youtubers" and other junk types that you gamers watch claim these things.

                      The main reason jitter can be reduced with multiple cores is that they can use the extra threads to load more stuff into memory for later use. It is actually how most cores are going to be used in the next gen consoles. Instead of putting more expensive RAM and video RAM with more bandwidth and more size, they are going to utilize extra threads + the SSD in order to push more data into the RAM faster. Consoles won't actually execute game code faster, they are just going to use smart techniques to make up for the fact that their RAM is limited and SLOW.
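
                      That "spare threads feed RAM" technique is essentially a producer-consumer prefetcher. A minimal sketch, with a hypothetical fake_load standing in for a slow SSD read and a bounded queue standing in for the streaming budget:

                      ```python
                      import threading
                      import queue
                      import time

                      def prefetcher(load_asset, asset_ids, depth=4):
                          """Background thread that streams assets into memory ahead of
                          the consumer, yielding them in order as they become ready."""
                          q = queue.Queue(maxsize=depth)

                          def pump():
                              for aid in asset_ids:
                                  q.put(load_asset(aid))   # blocks when the buffer is full
                              q.put(None)                  # end-of-stream sentinel

                          threading.Thread(target=pump, daemon=True).start()
                          while (item := q.get()) is not None:
                              yield item

                      # Hypothetical loader standing in for a slow SSD read.
                      def fake_load(aid):
                          time.sleep(0.001)
                          return ("asset", aid)
                      ```

                      The consumer (the render loop, say) never waits on the disk as long as the prefetch thread stays ahead, which is the mechanism described above: spare threads hide I/O latency rather than speeding up game code.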

                      Assuming you have a GPU with 12GB of VRAM or more, say 1 TB/s of bandwidth, and PCIe 4+, you will be able to run most AAA ports on a modern (that means AVX2 and high clocks) quad core with better performance than the next-gen consoles, at the same graphical settings, because you won't need all those extra threads to cover for a limited RAM system. You will see when these things get released. Keep this post in mind when that happens.

                      Comment
