AMD Announcements From CES 2020: Ryzen 4000 Mobile Series, Radeon RX 5600 XT

  • #31
    Originally posted by c117152 View Post

    The maxed-out textures are for 4K.
    It's pretty hard to get to the point of diminishing returns. There's no excuse for putting so little VRAM in a GPU. I thought AMD wasn't playing the planned obsolescence game like Nvidia does, where they release new GPUs that can't even max out something as basic as textures in already released games.

    • #32
      Originally posted by atomsymbol View Post
      It is interesting that in 2015 the R9 390/390X (5 TFLOPS FP32) were considered 4K gaming cards, while in 2020 the RX 5600 XT (7 TFLOPS FP32) is considered a 1080p gaming card.
      The thing about graphics is that it's not just the resolution that goes up over time; you also get higher-quality assets (higher-poly models, higher-res textures, etc.), new and higher-quality effects, more objects on screen at once and more expansive play areas (which require more assets to be constantly shifted in and out of memory).

      Also, AMD's "4K" card back in 2015 was the Fury X with 8.6 TFLOPS FP32, not the warmed-over R9 290 from two years earlier.

      Originally posted by pegasus View Post
      Way too much in my opinion. It is configurable down to a 12 W TDP, but that's still twice the TDP of my current desktop, which is three years old. Come on AMD, you're practically forcing me to buy Intel again.
      Well, considering this is not an Atom/i3-tier part in terms of performance, it's a bit much to expect it to be within the same power envelope as something with a fraction of the performance. There will probably eventually be something like the embedded Ryzen APUs found in the Smach Z handheld and the Atari VCS console that'll be the equivalent of Intel's stuff in that power envelope, but to expect an 8C/16T part within the power envelope of a thin client is unrealistic to say the least.
      Last edited by L_A_G; 01-07-2020, 10:08 AM.
      "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

      • #33
        Originally posted by L_A_G View Post

        The thing about graphics is that it's not just the resolution that goes up over time; you also get higher-quality assets (higher-poly models, higher-res textures, etc.), new and higher-quality effects, more objects on screen at once and more expansive play areas (which require more assets to be constantly shifted in and out of memory).

        Also, AMD's "4K" card back in 2015 was the Fury X with 8.6 TFLOPS FP32, not the warmed-over R9 290 from two years earlier.
        I completely agree.

        Another interesting thought: if 7 TFLOPS FP32 is what's being recommended for 1080p gaming this year (assuming the Ultra in-game quality setting), then the next-generation PS5/Xbox consoles, which are being talked about as "finally" enabling 4K (2160p) gaming, would require 4×7 = 28 TFLOPS GPUs, which is of course impossible to achieve at the expected price points of those consoles (300-500 EUR/USD).
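
        A quick sanity check of that 4×7 arithmetic, as a minimal sketch: it assumes the required GPU throughput scales linearly with pixel count, which ignores geometry and other per-frame costs.

        ```python
        # Scale the 1080p TFLOPS figure by the 2160p/1080p pixel ratio.
        # Assumption: required throughput grows linearly with pixel count.
        pixels_1080p = 1920 * 1080            # 2,073,600 pixels
        pixels_2160p = 3840 * 2160            # 8,294,400 pixels
        ratio = pixels_2160p / pixels_1080p   # exactly 4.0

        tflops_1080p = 7.0                    # RX 5600 XT class, per the post above
        print(f"{ratio:.0f}x the pixels -> ~{ratio * tflops_1080p:.0f} TFLOPS for 2160p")
        ```

        In practice the scaling is sublinear (geometry and per-frame work don't grow with resolution), so real 4K requirements tend to come in under the naive 4× figure.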

        • #34
          Originally posted by bachchain View Post
          If only that six was an eight, I might care.
          Or if that six was priced like a six ($160)... Then it might make sense...

          • #35
            Originally posted by pegasus View Post
            Way too much in my opinion. It is configurable down to a 12 W TDP, but that's still twice the TDP of my current desktop, which is three years old. Come on AMD, you're practically forcing me to buy Intel again.
            Twice the (announced) TDP, but everyone knows what TDP means to Intel...
            Also, how many times the performance for that TDP? And how much cheaper?
            Those are also valid points.

            • #36
              Originally posted by atomsymbol View Post

              It makes sense to expect it only if you are going to buy it. In terms of performance, isn't Radeon VII sufficient?
              No, it isn't, due to the older architecture (and hence the older video coding block).

              • #37
                Originally posted by DoMiNeLa10 View Post
                It's pretty hard to get to the point of diminishing returns. There's no excuse for putting so little VRAM in a GPU. I thought AMD wasn't playing the planned obsolescence game like Nvidia does, where they release new GPUs that can't even max out something as basic as textures in already released games.
                What does any of this have to do with you wanting to load textures into RAM that are too high-resolution to ever be resolved on the target display? It's not just diminishing returns; there are absolutely no returns, only downsides. More RAM means more lanes, power and cooling. Textures bigger than the target mean you need to actively downscale them.
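
                To put rough numbers on the memory side, here's a minimal sketch; it assumes uncompressed RGBA8 textures, whereas real games mostly ship block-compressed formats that cut these figures several-fold.

                ```python
                # VRAM cost of square RGBA8 textures, with full mip chains.
                # A texture larger than the display can resolve still pays its full
                # base-level cost, even if the sampler mostly reads smaller mips.
                def texture_bytes(width: int, height: int,
                                  bytes_per_texel: int = 4) -> int:
                    total, w, h = 0, width, height
                    while True:
                        total += w * h * bytes_per_texel
                        if w == 1 and h == 1:
                            break
                        w, h = max(w // 2, 1), max(h // 2, 1)
                    return total

                for size in (1024, 2048, 4096):
                    mib = texture_bytes(size, size) / 2**20
                    print(f"{size}x{size} RGBA8 with mips: ~{mib:.1f} MiB")
                # 1024 -> ~5.3 MiB, 2048 -> ~21.3 MiB, 4096 -> ~85.3 MiB
                ```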

                • #38
                  Originally posted by tildearrow View Post
                  No, it isn't, due to the older architecture (and hence the older video coding block).
                  I am not sure whether you mean accelerated video decoding or encoding. Accelerated video encoding with AMD GPUs has little support in Linux, as far as I know.

                  • #39
                    My GTX 1070 (6.5 TFLOPS) works fine at 4K for all the games I care about.

                    • #40
                      Originally posted by c117152 View Post

                      What does any of this have to do with you wanting to load textures into RAM that are too high-resolution to ever be resolved on the target display? It's not just diminishing returns; there are absolutely no returns, only downsides. More RAM means more lanes, power and cooling. Textures bigger than the target mean you need to actively downscale them.
                      All bitmap textures are too low-resolution if you look closely enough. It's obvious that games are designed with more VRAM in mind these days, so it's disgusting that a manufacturer would design another DOA card, especially after the flop of Nvidia's lower-end Turing cards struggling with high details at 1080p (which this card is supposed to handle well).
