
Hands On With The AMD Radeon RX 6600 XT


  • #21
    Originally posted by MadeUpName View Post
    Does it really need 3 fans or are they just flexing?
Devil's Advocate says the more fans you use, the smaller the heat sink you can get away with. But I expect it's really about the 'Phantom Gaming' branding: they have to make the customer think they're getting something for the extra money. Even a 3090 can be properly cooled with two fans, and AMD's reference design for the 6600 XT has only one, so draw your own conclusions.
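To put rough numbers on that trade-off, here is a minimal back-of-envelope sketch (my own illustration, not from the review): in steady state, GPU temperature is roughly ambient plus board power times the cooler's overall thermal resistance, and extra airflow lets a smaller fin stack reach the same resistance. All figures below are assumptions.

```python
# Back-of-envelope steady-state cooling model (all values are assumptions):
# T_gpu = T_ambient + P * R_theta, where R_theta is the cooler's overall
# heatsink-to-air thermal resistance in C per watt.

def gpu_temp(power_w, r_theta_c_per_w, ambient_c=25.0):
    """Steady-state GPU temperature for a given board power and cooler."""
    return ambient_c + power_w * r_theta_c_per_w

# Hypothetical coolers for a ~160 W card (the 6600 XT's ballpark TBP).
# More fans (more airflow) let a smaller heatsink hit the same R_theta.
coolers = {
    "big heatsink, 2 fans":   0.35,  # assumed C/W
    "small heatsink, 3 fans": 0.35,  # assumed C/W - same target, less metal
    "small heatsink, 2 fans": 0.45,  # assumed C/W - less airflow, runs hotter
}

for name, r_theta in coolers.items():
    print(f"{name}: ~{gpu_temp(160, r_theta):.0f} C")
```

Under these assumed numbers, a third fan buys back the thermal resistance a smaller heatsink loses, which is the Devil's Advocate argument in a nutshell.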

    Comment


    • #22
      Originally posted by M@GOid View Post

Oh look, Nvidia puts Linux dead last, like a leper they pretend doesn't exist. Thanks for pointing it out, Birdie. People in this forum usually don't like you, but this shows you are indeed a nice guy.
      Please ignore birdie... He is just frustrated about the 26 likes he didn't get.

      Comment


      • #23
        Originally posted by DanL View Post
        You all missed the point of my post, so I'll be more clear. Where is the Radeon RX6400 with 75W TBP? Or even better, an RX6200 with 35-50W TBP?
AMD is never going to make a card like that again, because such cards are pointless.
If all you need is a low-end GPU, their integrated GPUs have that covered.

        Comment


        • #24
          Originally posted by CaptainLugnuts View Post

AMD is never going to make a card like that again, because such cards are pointless.
If all you need is a low-end GPU, their integrated GPUs have that covered.
Actually, that's not correct. Sub-75W TDP cards can be far more powerful than iGPUs. For example, back in 2012 I had the Radeon HD 7750, at around 60-70W TDP, and it could easily run many AAA games of the time at high settings, well beyond what iGPUs could achieve back then. There's no reason Nvidia or AMD couldn't make a sub-75W card today that is far better than an iGPU; it's simply that both companies have completely neglected the midrange and entry-level markets. I mean, after 4+ years there is still no worthy successor to the midrange RX 580 / GTX 1060.
          Last edited by user1; 06 August 2021, 05:42 PM.
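For context on why 75W is the recurring threshold in this thread: a standard PCIe x16 slot supplies up to 75W on its own, so a card under that figure needs no auxiliary power connector. A trivial sketch of that budget check follows; the board-power figures are published ballpark numbers, so treat the output as illustrative.

```python
# PCIe power budget: a standard x16 slot supplies up to 75 W by itself,
# so any card above that needs a 6-pin (+75 W) or 8-pin (+150 W) connector.
PCIE_SLOT_W = 75

def needs_aux_power(board_power_w: int) -> bool:
    """True if the card's board power exceeds what the slot alone provides."""
    return board_power_w > PCIE_SLOT_W

# Approximate published board-power figures (ballpark, for illustration).
cards = {"Radeon HD 7750": 55, "RX 580": 185, "RX 6600 XT": 160}

for card, tbp in cards.items():
    status = "aux connector required" if needs_aux_power(tbp) else "slot-powered"
    print(f"{card} ({tbp} W): {status}")
```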

          Comment


          • #25
            Originally posted by user1 View Post

Actually, that's not correct. Sub-75W TDP cards can be far more powerful than iGPUs. For example, back in 2012 I had the Radeon HD 7750, at around 60-70W TDP, and it could easily run many AAA games of the time at high settings, well beyond what iGPUs could achieve back then. There's no reason Nvidia or AMD couldn't make a sub-75W card today that is far better than an iGPU; it's simply that both companies have completely neglected the midrange and entry-level markets. I mean, after 4+ years there is still no worthy successor to the midrange RX 580 / GTX 1060.
            You'll see low-margin cards when there is available manufacturing capacity.

It makes zero sense for AMD to sell a GPU they make $1 on when they can instead sell a higher-end one and make $100.

I'm assuming you don't count the 5500 XT as a "worthy" successor to the 580/1060, but that was AMD's last attempt at one. I'm guessing we won't see one based on RDNA2 until after RDNA3 is released.
            Last edited by smitty3268; 06 August 2021, 05:49 PM.

            Comment


            • #26
              Originally posted by CaptainLugnuts View Post

AMD is never going to make a card like that again, because such cards are pointless.
If all you need is a low-end GPU, their integrated GPUs have that covered.
Uhm ... you do realize that there are quite a few processors without integrated GPUs, right? Mostly from AMD (where integrated graphics are actually scarce), but even Intel has some of those again.
I'd really like an extremely low-powered GPU for my home server (currently running on the unaccelerated AST2500 VGA adapter included in the BMC), just to use a graphical desktop and browser.
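On a setup like that, one quick way to confirm the desktop is falling back to software rendering is to check the OpenGL renderer string. Here's a minimal sketch, assuming Mesa's glxinfo utility (from mesa-utils / mesa-demos) is installed and a display session is running; Mesa reports "llvmpipe" or "softpipe" when it is rasterizing in software.

```python
# Minimal sketch: detect software rendering by asking glxinfo for the
# OpenGL renderer string (requires mesa-utils and a running X session).
import subprocess

def opengl_renderer() -> str:
    out = subprocess.run(
        ["glxinfo", "-B"],  # -B prints only the brief summary
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        if line.startswith("OpenGL renderer string"):
            return line.split(":", 1)[1].strip()
    return "unknown"

renderer = opengl_renderer()
is_software = any(s in renderer.lower() for s in ("llvmpipe", "softpipe"))
print(f"Renderer: {renderer} "
      f"({'software' if is_software else 'hardware'} rendering)")
```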

              Comment


              • #27
                Originally posted by smitty3268 View Post

                You'll see low-margin cards when there is available manufacturing capacity.

It makes zero sense for AMD to sell a GPU they make $1 on when they can instead sell a higher-end one and make $100.
But didn't both companies neglect that market long before the chip shortage? It started with the release of Nvidia's Turing.

                Comment


                • #28
                  Originally posted by DanL View Post
                  You all missed the point of my post, so I'll be more clear. Where is the Radeon RX6400 with 75W TBP? Or even better, an RX6200 with 35-50W TBP?
                  Originally posted by CaptainLugnuts View Post
AMD is never going to make a card like that again, because such cards are pointless.
If all you need is a low-end GPU, their integrated GPUs have that covered.
                  Originally posted by user1 View Post
Actually, that's not correct. Sub-75W TDP cards can be far more powerful than iGPUs. For example, back in 2012 I had the Radeon HD 7750, at around 60-70W TDP, and it could easily run many AAA games of the time at high settings, well beyond what iGPUs could achieve back then. There's no reason Nvidia or AMD couldn't make a sub-75W card today that is far better than an iGPU; it's simply that both companies have completely neglected the midrange and entry-level markets.
Agree that the HD 7750/7770 (Cape Verde, IIRC) was faster than the iGPUs of its time, but I expect the integrated GPUs in Renoir and Cezanne would be faster than an HD 7750 these days, even with shared and slightly slower memory.

                  There has been a lot of discussion about the lowest tier cards in other threads so I won't repeat it all but:

#1 - yes, there is still a slot between an APU and a "great 1080p dGPU"

#2 - the problem is that the market for those cards seems fairly small these days, essentially "entry-level DIY gaming PC", since even the OEMs don't seem to be doing much in that area.

The product would basically be a "laptop dGPU on a PCIe card", but that seems to be a tough sell to the board partners, at least while there are shortages in many of the components they need.

                  Originally posted by user1 View Post
                  I mean after 4+ years there is still no worthy successor to the midrange RX 580 / GTX 1060.
Have to disagree there - the RX 5500 XT was a bit faster and a bit cheaper than either of those, and used less power. Unfortunately, even the 5500 XT is selling for 4x MSRP or higher right now.

One thing that muddies the water a bit is that games have become a lot more demanding over the last 5 years, so a "great 1080p card" today has maybe twice the performance of the 1080p cards of 2016 - more like a GTX 1080 Ti than a GTX 1060.

                  Originally posted by Luzipher View Post
Uhm ... you do realize that there are quite a few processors without integrated GPUs, right? Mostly from AMD (where integrated graphics are actually scarce), but even Intel has some of those again. I'd really like an extremely low-powered GPU for my home server (currently running on the unaccelerated AST2500 VGA adapter included in the BMC), just to use a graphical desktop and browser.
                  That's an even smaller market, unfortunately... while the cost of making new chips is going through the roof.

What kind of use cases do you see for graphics acceleration on the server? My impression was that the GPUs integrated into BMCs were doing a decent job of covering user needs.
                  Last edited by bridgman; 07 August 2021, 03:35 PM.

                  Comment


                  • #29
                    Originally posted by user1 View Post

But didn't both companies neglect that market long before the chip shortage? It started with the release of Nvidia's Turing.
As has been mentioned, the increasingly competent nature of integrated GPUs has really cut into the market for low-end discrete GPUs. There just aren't a lot of people who want to buy them.

At the same time, AMD's GPU group has been in shambles for the last few years. They've just been trying to hang on long enough to get out new architectures that are halfway competitive with Nvidia. They've finally done so, but they now have very limited ability to supply the market, since their constrained silicon allocation is much better spent on Epyc CPUs and other higher-margin parts.

Nvidia, I think, has been happy to be seen as a premium brand, and as long as people have been willing to keep paying the higher prices it asks, it has decided it doesn't really care about the low end anymore. There's still the GT 1030 for those people, but Nvidia isn't going to care much about that market.

I think hope may be coming for you next year in the form of Intel. They are supposedly going to flood the market with cheap GPUs in order to gain market share. It remains to be seen how AMD or Nvidia will respond, if at all.
                    Last edited by smitty3268; 06 August 2021, 06:04 PM.

                    Comment


                    • #30
A 1080p GPU with 3 fans... When will I get a chance to replace my R9 380 ITX Compact?

                      Comment
