Hands On With The AMD Radeon RX 6600 XT


  • #41
    If the silicon is the problem, maybe AMD could at least update the video encode/decode block on the Polaris cards while keeping the 3D part more or less the same. That way, they could still use GloFo for those cards.

    Comment


    • #42
      Originally posted by DanL View Post
      If the silicon is the problem, maybe AMD could at least update the video encode/decode block on the Polaris cards while keeping the 3D part more or less the same. That way, they could still use GloFo for those cards.
      Yep, that seems like the best way to get another fab process into use... or save 6-9 months and fab them as-is to distract the mining market and free up more gaming cards for gamers. Mining is the real problem here AFAICS; take that away and I think industry capacity comes fairly close to meeting current demand.

      That said, my impression is that GloFo is fully booked for the foreseeable future as well.
      Last edited by bridgman; 07 August 2021, 10:32 PM.

      Comment


      • #43
        Originally posted by MadeUpName View Post
        As a side note, I saw a YT clip where someone was showing that, at least in the Steam survey, NVIDIA has a higher share of 3090s than all of AMD's 6000-series cards combined. I'm not entirely sure what to make of that. Could it be true? If so, why is it true?
        It could be that the survey isn't counting them properly. There have been reports of people on Windows getting the Steam survey and seeing RDNA2 cards identified as "AMD Radeon Graphics" rather than the actual GPU model, and the "AMD Radeon HD 8800 Series" entry has been growing for a few months.
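        As a minimal illustration of how that could happen, here is a Python sketch of a survey-style lookup that maps PCI IDs to marketing names. The ID table and fallback logic are assumptions for the example, not Valve's actual survey code, but a stale table would produce exactly this kind of generic label:

        ```python
        # Hypothetical sketch: how a hardware survey might map PCI IDs to
        # marketing names. The table and fallback here are assumptions for
        # illustration, not the actual Steam survey logic.

        # (vendor_id, device_id) -> marketing name; 0x1002 is AMD's PCI vendor ID.
        KNOWN_GPUS = {
            (0x1002, 0x73BF): "AMD Radeon RX 6800/6900 Series",  # Navi 21
            (0x1002, 0x73DF): "AMD Radeon RX 6700 Series",       # Navi 22
        }

        def gpu_label(vendor_id: int, device_id: int) -> str:
            """Return a marketing name, or a generic fallback for unknown IDs."""
            name = KNOWN_GPUS.get((vendor_id, device_id))
            if name:
                return name
            if vendor_id == 0x1002:
                # A stale table lumps every unrecognized AMD GPU into one
                # generic bucket, consistent with newer RDNA2 cards showing
                # up as "AMD Radeon Graphics" instead of their real model.
                return "AMD Radeon Graphics"
            return "Unknown GPU"

        # A device ID missing from the table gets the generic label:
        print(gpu_label(0x1002, 0x73FF))  # Navi 23-class ID -> "AMD Radeon Graphics"
        ```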

        It could also be that AMD just aren't making many of them. Their priorities for the limited fab capacity are likely:
        1. Consoles, as the deals made with Sony and Microsoft would require a certain volume of product. The profit per unit is probably quite low; on the other hand, without those sales being guaranteed years ago, AMD wouldn't have been able to fund the R&D needed to get where it is now.
        2. CPU chiplets. The profit per wafer for a batch of chiplets going into Ryzens will be much higher than using that same area to make one GPU, and Epyc is much more again (see the rough sketch after this list).
        3. GPUs, and even here there is more money in workstation and AI cards than in gaming.
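        As a rough back-of-the-envelope sketch of that wafer math: every die size, yield, and dollar figure below is an assumed ballpark value, not an AMD number, but the scale of the gap is the point.

        ```python
        # Back-of-the-envelope wafer economics. All die sizes, yields, and
        # per-die dollar figures are rough assumptions for illustration.

        WAFER_AREA_MM2 = 70_000  # usable area of a 300 mm wafer, roughly

        def dies_per_wafer(die_area_mm2, yield_frac=0.85):
            # Ignores edge loss and defect-density modelling; fine for scale.
            return int(WAFER_AREA_MM2 / die_area_mm2 * yield_frac)

        ccds = dies_per_wafer(81)    # Zen 3 CCD is roughly 81 mm^2
        gpus = dies_per_wafer(520)   # Navi 21 is roughly 520 mm^2

        # Assumed revenue contribution per good die (illustrative only).
        ccd_revenue = ccds * 120     # e.g. ~$120 of a Ryzen's price per CCD
        gpu_revenue = gpus * 400     # e.g. ~$400 per big GPU die

        print(f"{ccds} CPU chiplets/wafer -> ~${ccd_revenue:,}")   # 734 -> ~$88,080
        print(f"{gpus} big GPU dies/wafer -> ~${gpu_revenue:,}")   # 114 -> ~$45,600
        ```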

        Comment


        • #44
          Originally posted by birdie View Post

          Vega graphics in its Ryzen APUs is good enough, and that's my opinion as well. Now, tons of people don't have a built-in GPU, and that's a valid concern, but such people normally know what they're doing and buy beefy GPUs instead.
          Nearly everyone in the developed world can afford a low-end dGPU, but in the developing world, Ryzen APUs are very popular with gamers. They offer acceptable performance in most titles if you turn down the graphics settings. The next step up in gaming performance is a low-end dGPU, but that can double the cost of a system.

          Comment


          • #45
            I wonder if it would be feasible to integrate a tiny but still 3D- and video-decode-capable GPU into the I/O die of Ryzen CPUs once that die gets a node shrink to (for example) 7 nm, so there would be more transistors to spare. No idea when that might happen or whether it's actually possible, but if it is, every Ryzen CPU could be an APU with (hopefully) little extra manufacturing cost.

            Comment


            • #46
              Originally posted by CaptainLugnuts View Post
              AMD is never going to make a card like that again because such cards are pointless.
              If all you need is a low-end GPU, their integrated GPUs have that covered.
              Their APUs are very capable, no doubt. The challenge is that the integrated GPU shares the TDP envelope with the CPU cores. A dedicated GPU, even a low-power one, frees the CPU from that shared power budget, and the two together can make for a potent combination. There's also the fact that most games don't need much CPU: you can play all but the latest AAA titles just fine on a 5+ year old CPU, provided you have a modern dGPU. That lets you decouple the GPU purchase from the CPU/motherboard purchase, which is compelling for those on a budget. Of course, with the current silicon shortage, manufacturers and vendors aren't going to bother with low-margin SKUs, so you won't be seeing anything like an RX 460 or a GTX 550 Ti any time soon.
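              To put rough numbers on the shared-TDP point, here's a quick sketch; every wattage below is a nominal, assumed figure for a hypothetical build (65 W desktop APU, 65 W CPU, RX 460-class card), not a measurement:

              ```python
              # Quick power-budget comparison. All TDP numbers are nominal,
              # assumed class figures, not measurements of a specific system.

              APU_TDP = 65           # CPU cores and iGPU share this one envelope
              CPU_TDP = 65           # a dedicated CPU keeps its envelope to itself
              DGPU_POWER = 75        # board power of an RX 460-class card
              CORES_UNDER_LOAD = 45  # assumed draw of the CPU cores while gaming

              # In an APU, the iGPU only gets whatever the cores leave over;
              # with a dGPU, the two budgets are independent.
              igpu_budget = APU_TDP - CORES_UNDER_LOAD
              combo_budget = CPU_TDP + DGPU_POWER

              print(f"APU iGPU budget under CPU load: ~{igpu_budget} W")   # ~20 W
              print(f"CPU + low-end dGPU combined:    {combo_budget} W")   # 140 W
              ```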

              Comment


              • #47
                Originally posted by Tomin View Post
                I wonder if it would be feasible to integrate a tiny but still 3D- and video-decode-capable GPU into the I/O die of Ryzen CPUs once that die gets a node shrink to (for example) 7 nm, so there would be more transistors to spare. No idea when that might happen or whether it's actually possible, but if it is, every Ryzen CPU could be an APU with (hopefully) little extra manufacturing cost.
                I like how I made this comment before we knew that the Zen 4 I/O die would contain a GPU.

                Comment
