AMD Announces New Zen 3 Desktop APUs, FidelityFX Super Resolution + More

  • #21
    Originally posted by skeevy420 View Post
    I wonder if we keep getting Vega iGPUs because anything better on AM4 would run about the same?
    I suspect so. When they increased the GPU frequency in the 4000G series APUs, they even cut the CU count to balance things out. Things are probably bottlenecked by DDR4 bandwidth already.
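    To put rough numbers on that bandwidth argument, here is a quick back-of-the-envelope sketch; the transfer rates and the discrete-card comparison are my own assumed figures, not anything AMD has published:

```python
# Peak theoretical DRAM bandwidth shared by the CPU and iGPU on AM4.
# Figures below are assumptions for illustration, not vendor specs.

def ddr_bandwidth_gbs(transfer_rate_mts, channels=2, bus_width_bits=64):
    """Peak bandwidth in GB/s: transfers/s x channels x bytes per transfer."""
    return transfer_rate_mts * 1e6 * channels * (bus_width_bits / 8) / 1e9

print(f"DDR4-3200 dual channel: {ddr_bandwidth_gbs(3200):.1f} GB/s")  # ~51.2
print(f"DDR5-4800 dual channel: {ddr_bandwidth_gbs(4800):.1f} GB/s")  # ~76.8

# Even a modest discrete card has well over 100 GB/s of dedicated VRAM
# bandwidth, so a bigger iGPU fed only by DDR4 would mostly be starved for data.
```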



    • #22
      I'm not holding my breath for FidelityFX to come to Linux any time soon. Best-case scenario, it might arrive next year.

      Overall, getting a 5700G makes more sense than a 5800X: the MSRP is $100 lower, you get integrated graphics along with it, and it's only slightly slower.



      • #23
        Originally posted by zakhrov View Post
        I hope Michael gets his hands on that Asus Scar G15 with the 5800HX and the RX 6800M. It would be awesome to see how it performs under Linux.
        LTT benchmarked it on Windows and said it sucked. Then again, he has a real bias against anything AMD.



        • #24
          Originally posted by atomsymbol

          In my opinion, the announcement of Ryzen CPUs with 64MB of 7nm SRAM cache (3D V-Cache) (L4 cache?; 96MB L3/L4 cache per CCD in total) stacked vertically atop a CCD with a bandwidth of 2TB/s is the most important announcement. Unfortunately, DDR5 has approximately the same 1st word latency in nanoseconds as DDR4/DDR3/DDR2/DDR/SDRAM, so it is logical for future CPUs with 3D stacking to start having a large L4 cache.

          Wikipedia: First word latencies of SDRAM/DDR[1234]
          I agree it has the potential to be the biggest news. My concern is with the thermals. They announced some changes to help with that, but I am still skeptical. That said, the fact that they will be producing them before the end of the year took me by surprise, and I look forward to the benchmarks.

          On the flip side, I was disappointed there was no announcement about Van Gogh. I thought I had seen code pushed to the kernel a while back in preparation, but all signs are that they have dropped it.
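          As a quick sanity check of the first-word latency claim quoted above, here is a small sketch using typical, assumed JEDEC-style timings rather than any specific memory kit:

```python
# First-word (CAS) latency in nanoseconds across DDR generations.
# Timings are typical, assumed values for illustration only.

def first_word_latency_ns(data_rate_mts, cas_latency_cycles):
    """CAS cycles divided by the memory clock (half the data rate)."""
    memory_clock_mhz = data_rate_mts / 2
    return cas_latency_cycles / memory_clock_mhz * 1000

for name, rate, cl in [("DDR3-1600", 1600, 11),
                       ("DDR4-3200", 3200, 22),
                       ("DDR5-4800", 4800, 40)]:
    print(f"{name} CL{cl}: {first_word_latency_ns(rate, cl):.1f} ns")

# All three land around 13-17 ns, which is why a large stacked L3/L4
# (V-Cache) is attractive: it hides DRAM latency instead of waiting for
# a new DDR generation to improve it.
```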



          • #25
            Originally posted by kneekoo View Post
            Finally something to look forward to as a way to move away from Intel. Hopefully it will be good enough on Linux to buy it soon. I look forward to seeing lots of benchmarks.
            Don't buy it "soon"; wait for the early adopters to take the arrows.



            • #26
              Originally posted by pal666 View Post
              You would embarrass yourself less if you ignored Apple propaganda.
              Apple hasn't produced any propaganda; if anything, they undersold the M1! It is a far better processor than the x86 world wants to acknowledge. I really don't see the x86 world being able to compete as long as they carry all the baggage that comes with x86. One of Apple's smartest moves was forcing developers to move to 64-bit code years ago. This allowed them to completely delete the 32-bit baggage, allowing for very efficient processors. x86 has more modes than a politician.



              • #27
                While all of this stuff is very interesting, I really don't see anything worth buying this year. At this point I see it as foolish to invest in any system that doesn't support DDR5, especially for APU-type chips. Of course, this assumes you have the option of holding off; if you don't, the new APUs are not that bad. I just see the combo of DDR5, RDNA2 and other new tech in an APU as something worth waiting for. It isn't like this new tech will be going into old motherboards.

                So early next year is probably the best time to consider what is on offer and make a solid computing upgrade.

                What I find really interesting is the 3D stacking. This will likely be huge in platforms I will never be able to justify buying. Imagine a Threadripper with these complexes. If AMD has the balls to put a next-gen Threadripper together with these 3D assemblies and a direct port to just one of their coming CDNA chips from the supercomputing projects, they will have one of the best desktop compute stations on the market. Of course, AMD's software would need to be addressed, but contrary to popular opinion that is coming along as well. Basically, there is a lot happening that could give AMD a lock on the desktop performance compute box for a few years.



                • #28
                  Originally posted by CochainComplex View Post

                  [Embedded video from LinusTechTips]
                  Nice, though it looks like this stupid laptop does NOT have a webcam, which is an immediate no-buy for maybe 90% of potential buyers. Seriously, what were they thinking?



                  • #29
                    Been waiting for this. My new NAS/cloud/Plex APU.



                    • #30
                      Originally posted by wizard69 View Post
                      Apple hasn't produced any propaganda
                      Really? If you can't even recognize the enormous amount of marketing and hype that's gone into the launch of the M1, I'm not sure there's much hope for a reasonable conversation on the topic.

                      The M1 is fantastic, but it's not the ultimate CPU like some would have you believe. It's unquestionably king in terms of efficiency, but in terms of raw power x86 is still faster, and that's while being manufactured on older nodes. Apple will be ramping up their M chips over the next year as they try to compete in the higher-end markets, and I expect we'll see continued excellence from them once they have a workstation chip out, but there's a reason they started with the low-power/laptop market: that's where they excel the most.

