AMD Radeon RX 6600 Linux Performance


  • #11
    Originally posted by TemplarGR View Post
    MSRP $329, real-world price $600. In Greece only the 6600 XT is available and it costs around 700 at the lowest on online stores. EUROS, not dollars... In Greece the basic monthly salary is less than 600 euros, for reference.

    Seriously, there is no point in buying anything until Intel arrives and destroys both gpu makers simply on price and availability. Plus nice open source drivers. Who would have thought.
    I really hope Intel can make a dent. I worry that any Intel cards will be made of pure unobtainium as well though. They are using TSMC and there are no attempts to throttle mining. A good 1440p card from them at a decent price would be nice, but at this rate I'd be thrilled with a solid 1080p card for <=$250 USD.



    • #12
      On the Gigabyte card itself: talk about an overkill cooler.

      On the other hand, people living in very hot climates without air conditioning are thinking: "nope, that seems about right for my needs".



      • #13
        Originally posted by user1 View Post
        As long as it has PCIe 4.0 x8 instead of PCIe 4.0 x16, it's a no-go for me, regardless of the price. I have an i7 8700K and 32 GB of RAM. It's still a beast of a system and I intend to use it for at least a few more years. The problem is, I'm limited to PCIe 3.0, which means with an x8 card I potentially lose up to 25% performance in some games (this is already proven with the 6600 XT). I'm currently happy with my RX 580, but I'm afraid AMD will continue this trend of releasing mid-range cards with PCIe x8 instead of x16. That means, if I want to upgrade my GPU, I'll have to choose the lowest-end AMD card that still has PCIe x16, if it has a price I'm willing to pay (with today's prices that's highly unlikely). If not, I'll probably switch to an Intel discrete GPU.
        Isn't it the same bandwidth, since it works with PCIe 3.0 x16?



        • #14
          Originally posted by andre30correia View Post

          Isn't it the same bandwidth, since it works with PCIe 3.0 x16?
          No, it will work in PCIe 3.0 x8 mode, which is why some games may see up to a 25% performance loss. If it had PCIe x16 like most other cards, the performance loss under PCIe 3.0 would have been negligible. I've seen someone compare RTX 3090 performance between PCIe 4.0 and 3.0, and the difference was negligible (like 188 vs. 190 FPS in some games); in other games, no difference at all.
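To make the x8-vs-x16 argument concrete, here is a rough sketch of per-direction link bandwidth. The per-lane figures come from the PCIe spec's signaling rates minus 128b/130b encoding overhead; the function name is mine, and real-world throughput is lower still due to protocol overhead:

```python
# Approximate usable per-lane throughput, per direction, in GB/s
# (8 GT/s and 16 GT/s signaling with 128b/130b encoding).
PER_LANE_GBPS = {
    "3.0": 0.985,
    "4.0": 1.969,
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe link in GB/s (one direction)."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card loses nothing in a PCIe 4.0 slot, but drops to roughly half
# the bandwidth of an x16 link when placed in a PCIe 3.0 board:
print(round(link_bandwidth("4.0", 8), 1))   # ~15.8 GB/s
print(round(link_bandwidth("3.0", 16), 1))  # ~15.8 GB/s
print(round(link_bandwidth("3.0", 8), 1))   # ~7.9 GB/s
```

That halving only hurts when the game actually saturates the link, e.g. when it spills past VRAM, which matches the 6600 XT results mentioned above.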



          • #15
            This is not a great performer; we get similar (sometimes lower) performance than the 4-year-old Vega 56.
            It's not great on power consumption either (NVIDIA has older GPUs that consume less).
            The real-world price is going to be high (probably around 400 USD).
            Why bother, then?



            • #16
              Originally posted by birdie View Post
              Pros: Energy efficiency is great but to be honest could be better. GTX 1660 Ti based on a 12nm node and launched almost 3 years ago is not far off.
              Indeed, I've had to replace the case a few times to improve the ventilation. I guess if TDP values keep growing, the next case will need to be a mesh case with 4 x 140 mm fans.
              Last edited by caligula; 13 October 2021, 12:49 PM.



              • #17
                Thankfully I got a Vega 56 for 209 EUR in 2019, which I still use. In retrospect that was the perfect time to grab a then highly underrated card for a fair price (looking at the used market now, Vega is much more appreciated today). Even though I had several Vega models in my hands and initially saw quite a bit of trouble due to the poor quality of several board-partner models, I still have plenty of fun squeezing more performance out of it with undervolting. And 1080p performance is still great in most games on decent settings.



                • #18
                  Originally posted by user1 View Post
                  As long as it has PCIe 4.0 x8 instead of PCIe 4.0 x16, it's a no-go for me, regardless of the price. I have an i7 8700K and 32 GB of RAM. It's still a beast of a system and I intend to use it for at least a few more years. The problem is, I'm limited to PCIe 3.0, which means with an x8 card I potentially lose up to 25% performance in some games (this is already proven with the 6600 XT). I'm currently happy with my RX 580, but I'm afraid AMD will continue this trend of releasing mid-range cards with PCIe x8 instead of x16. That means, if I want to upgrade my GPU, I'll have to choose the lowest-end AMD card that still has PCIe x16, if it has a price I'm willing to pay (with today's prices that's highly unlikely). If not, I'll probably switch to an Intel discrete GPU.
                  Stop conflating bus bandwidth requirements with VRAM overcommit... the only time the 6600 XT slows down is when its VRAM is overcommitted, i.e. you are running at settings too high for the VRAM. Even with more bus bandwidth, it would still be a laggy mess.



                  • #19
                    This GPU and its price are a joke!



                    • #20
                      Originally posted by ezekrb5 View Post
                      It's not great on power consumption either (NVIDIA has older GPUs that consume less).
                      NVIDIA has older GPUs that consume less power, but they are also much slower. I didn't see any older GPUs that were even close on perf/watt in general; on a few benchmarks they got closer, but still no better than maybe 3/4. Am I missing something?
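For what it's worth, perf/watt here is just average frame rate divided by board power. A toy sketch of the comparison (the figures below are made-up placeholders for illustration, not measurements from any review):

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical example: a newer card at 100 FPS / 130 W
# vs. an older card at 60 FPS / 120 W.
newer = perf_per_watt(100, 130)  # ~0.77 FPS/W
older = perf_per_watt(60, 120)   # 0.5 FPS/W
print(round(older / newer, 2))   # older card at ~0.65x the newer card's efficiency
```

With numbers in that ballpark, an older card "consuming less" in absolute watts can still sit well behind on efficiency, which is the point being made above.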
                      Last edited by bridgman; 13 October 2021, 02:00 PM.

