Radeon Vulkan Driver Adds Option Of Rendering Less For ~30% Greater Performance

  • #11
    Originally posted by phoronix_is_awesome View Post
    So, after using all the nasty tactics such as forcing the 8GB GA104 to swap at 1440p, pricing a 192-bit GPU at a 256-bit competitor's price, and forcing media not to talk about raytracing performance, the lack of dedicated matrix-multiply units, or the lack of ROCm compute support, it still doesn't solve the problem that the 3060's MSRP is nearly 30% less. So now that 30% burden needs to come from somewhere; I guess 30% less image quality is a lot harder to detect. Great work, David and Scott. It's a shame that all the 16Gb Samsung GDDR6 16Gbps dies are wasted on this piece of shit lineup.

    If Nvidia would realize that ECC/P2P RDMA is the barrier to professional AI computing, start putting the Samsung 16Gb dies on GA104/GA102 chips, and make those cards unavailable to miners and available to university CUDA compute people, it would do the greatest good for society.
    How did you come to the conclusion that AMD is controlling the mass media? Do you really believe Userbenchmark claims? Do you really think AMD GPUs are cheap knockoffs and marketing is the only thing that is keeping the company alive?

    I also find it very interesting how you talk about pricing in MSRP terms, as if ANYTHING would sell at MSRP nowadays. Not to mention how you compare bus widths between different architectures as if you had forgotten about Infinity Cache. Did you?

    And speaking about that "30% less image quality" you're talking about: That's VRS. AMD GPUs have it. Consoles have it. And believe it or not, Nvidia GPUs also have it.

    Yeah, you're probably a troll. But whatever.
    Last edited by EvilHowl; 10 April 2021, 06:12 AM.



    • #12
      I am not a fan of significantly noticeable quality sacrifices for performance. Only if it weren't noticeable would that be okay. They should instead put more effort into extracting more performance out of the hardware people actually own today (such as trying to get GFX9 NGG to work); RDNA2 is still not widespread, and that will continue this year thanks to the shortages.



      • #13
        Originally posted by coder View Post
        I do find it ironic that people are buying 4k monitors, then turning around and enabling a bunch of features that degrade image quality, just to get their framerates back up.
        It's not, because 1440p-to-4K DLSS 2.0 has better quality than native 1440p. Also, you can still use your 4K monitor for better text rendering outside of games while enjoying 1440p-like (or better) quality while gaming.
        ## VGA ##
        AMD: X1950XTX, HD3870, HD5870
        Intel: GMA45, HD3000 (Core i5 2500K)



        • #14
          Originally posted by fafreeman View Post
          Anyone know how mesh shader development is going in radv? That's another big performance-boosting feature.
          I've been looking into it a little, but it's not a priority.

          Are you aware of any game or app that actually uses mesh shaders? I haven't seen one yet. There are also no proper test cases, only a couple of demos from NVidia.
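For anyone who wants to see what their own driver currently reports, here is a minimal sketch, assuming a working Vulkan loader and headers: it lists each GPU and whether the driver advertises a mesh shader device extension (VK_NV_mesh_shader, or the cross-vendor VK_EXT_mesh_shader that appeared later). The names are matched as plain strings, so it builds even against older headers; the file name and structure are just illustrative, not anything from radv itself.

```c
/*
 * check_mesh.c - hedged sketch: does this Vulkan driver expose mesh shaders?
 * Error handling is kept to a minimum.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .apiVersion = VK_API_VERSION_1_1,
    };
    VkInstanceCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS)
        return 1;

    uint32_t gpu_count = 0;
    vkEnumeratePhysicalDevices(instance, &gpu_count, NULL);
    VkPhysicalDevice *gpus = calloc(gpu_count, sizeof(*gpus));
    vkEnumeratePhysicalDevices(instance, &gpu_count, gpus);

    for (uint32_t i = 0; i < gpu_count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);

        /* Enumerate the device extensions the driver advertises. */
        uint32_t count = 0;
        vkEnumerateDeviceExtensionProperties(gpus[i], NULL, &count, NULL);
        VkExtensionProperties *exts = calloc(count, sizeof(*exts));
        vkEnumerateDeviceExtensionProperties(gpus[i], NULL, &count, exts);

        int has_mesh = 0;
        for (uint32_t e = 0; e < count; e++) {
            if (!strcmp(exts[e].extensionName, "VK_NV_mesh_shader") ||
                !strcmp(exts[e].extensionName, "VK_EXT_mesh_shader"))
                has_mesh = 1;
        }
        printf("%s: mesh shaders %s\n", props.deviceName,
               has_mesh ? "exposed" : "not exposed");
        free(exts);
    }

    free(gpus);
    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Build with something like `gcc check_mesh.c -lvulkan` and run it; an unexposed extension here simply means the feature hasn't landed in the driver you're running.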



          • #15
            You guys realize that forcing 2x2 VRS may boost the frame rate but is not its intended use at all, right? The working principle is that the game decides where full shading detail is required and where it can be omitted; typical cases are clear blue skies or areas that are subject to motion blur anyway. There is absolutely no reason to argue about this beyond purely academic interest...
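To make that intended use concrete, here is a hedged sketch of the per-draw path through VK_KHR_fragment_shading_rate, which is roughly what an in-game decision like the one described above looks like in code. It assumes the extension and the pipelineFragmentShadingRate feature were enabled at device creation and that the function pointer was fetched with vkGetDeviceProcAddr; the record_scene helper is an illustrative name, not anything from radv or the article.

```c
/*
 * Hedged sketch of per-draw VRS via VK_KHR_fragment_shading_rate: the game
 * itself picks where coarse shading is acceptable, instead of a blanket
 * 2x2 override. Assumes pfnSetRate was loaded with vkGetDeviceProcAddr and
 * that the extension/feature were enabled on the device.
 */
#include <vulkan/vulkan.h>

void record_scene(VkCommandBuffer cmd,
                  PFN_vkCmdSetFragmentShadingRateKHR pfnSetRate)
{
    /* KEEP for both combiners: use exactly the rate set here, ignoring any
     * per-primitive or attachment-based rates. */
    const VkFragmentShadingRateCombinerOpKHR keep[2] = {
        VK_FRAGMENT_SHADING_RATE_COMBINER_OP_KEEP_KHR,
        VK_FRAGMENT_SHADING_RATE_COMBINER_OP_KEEP_KHR,
    };

    /* Low-detail content (clear sky dome, heavily motion-blurred geometry):
     * shade once per 2x2 block of pixels. */
    const VkExtent2D coarse = { .width = 2, .height = 2 };
    pfnSetRate(cmd, &coarse, keep);
    /* ... vkCmdDrawIndexed() calls for the sky / blurred geometry ... */

    /* Everything the player actually looks at: back to full 1x1 shading. */
    const VkExtent2D full = { .width = 1, .height = 1 };
    pfnSetRate(cmd, &full, keep);
    /* ... vkCmdDrawIndexed() calls for the rest of the scene ... */
}
```

A driver-side override, by contrast, applies a coarse rate across the board regardless of content, which is the distinction being drawn here.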



            • #16
              Originally posted by coder View Post
              Your other grievances notwithstanding, they are simply trying to catch up with the rest of the industry on this one, as the original blog post tried so hard (yet apparently without much success) to state.

              I do find it ironic that people are buying 4k monitors, then turning around and enabling a bunch of features that degrade image quality, just to get their framerates back up. But I see the point that these techniques each try to sacrifice quality in ways and places that you're less likely to notice. So you could say it's a similar concept to video compression.
              In my case I use my TV as a monitor; unexpectedly, I got a 55" 4K HDR set last Black Friday. What I do is either game at 1080p on the same settings or, when applicable, at 4K with 50%-75% resolution scaling. I prefer the latter because alt-tabbing between windows and switching resolutions is jarring. Since I'm only aiming for high-quality 1080p and the TV was a gift, I do what I can to make it work.

              Also, I totally get a person wanting 4K+ for everything but gaming. That describes me, and IMHO past 2K the returns in games are less and less apparent. For desktop tasks, however, 2K to 4K is very noticeable, with more working area in programs and sharper text.



              • #17
                Originally posted by ms178 View Post
                I am not a fan of significantly noticeable quality sacrifices for performance. Only if it weren't noticeable would that be okay. They should instead put more effort into extracting more performance out of the hardware people actually own today
                Kinda stating the obvious there. That's like saying "if only healthy foods tasted better, I would eat more of them".
                You can't just magically make games perform better without sacrificing detail. The whole reason there's demand for more powerful GPUs is that people want to play games with more detail. The driver devs have been extracting more performance out of the hardware people already own, and have been for years. If you don't like the results, well, that's because there's only so much that can be done.

                You can't have broccoli taste like the best cake you've ever had without drowning it in other flavors. You can't have your old GPU run 4K at playable frame rates without sacrificing detail.



                • #18
                  Originally posted by darkbasic View Post

                  It's not, because 1440p-to-4K DLSS 2.0 has better quality than native 1440p. Also, you can still use your 4K monitor for better text rendering outside of games while enjoying 1440p-like (or better) quality while gaming.
                  Only versus native 1440p with TAA, because TAA blurs the image. And only when you're standing still or moving forward; when you strafe, DLSS is trash because the previous frames aren't compensated for motion relative to the scene.



                  • #19
                    Originally posted by ms178 View Post
                    I am not a fan of significantly noticeable quality sacrifices for performance. Only if it weren't noticeable would that be okay.
                    The idea is that it's supposed to be barely noticeable, assuming it's well-implemented. You can find lots of impressions of Nvidia's and even Intel's implementations. Here's what they say about it:



                    • #20
                      Originally posted by darkbasic View Post
                      It's not, because 1440p-to-4K DLSS 2.0 has better quality than 1440p. Also you can still use your 4K monitor for better text rendition outside of games while still enjoying 1440p-like (or better) quality while gaming.
                      Yeah, I get it. I'm not saying the net-effect isn't better, just that it seems a little ironic.
