HITMAN 3 Runs Well On Linux With Steam Play - Open-Source Radeon Performance Especially Good


  • #21
    Originally posted by birdie View Post

    This was the case almost 20 (!) years ago when NVIDIA sort of cheated on anisotropic filtering. Would be great if people stopped perpetuating this BS.

    Many people over the past decade have compared NVIDIA and AMD in terms of image quality and nothing has been found, e.g.

    No, there's quite a bit more to it. On GeForce FX GPUs Nvidia cheated by using lower-precision pixel shader calculations, resulting in visibly worse image quality. They did that because high precision was not fully implemented in hardware and was very slow (the keyword is "partial precision"). They had to remove that cheat in later driver versions, resulting in huge performance drops. I think they also cheated in some other cases simply by using lower-precision buffers.
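    The "partial precision" effect is easy to demonstrate outside of any GPU. Below is a rough sketch (plain Python, not the actual GeForce FX shader pipeline) that round-trips values through IEEE-754 half precision via the struct module, showing how FP16 drifts where full precision stays accurate:

    ```python
    import struct

    def fp16(x: float) -> float:
        """Round-trip x through IEEE-754 half precision (FP16)."""
        return struct.unpack('<e', struct.pack('<e', x))[0]

    # Sum 0.1 a thousand times: once in Python's double precision
    # (a stand-in for full precision) and once clamped to FP16 after
    # every operation ("partial precision").
    full = 0.0
    half = 0.0
    for _ in range(1000):
        full += 0.1
        half = fp16(half + fp16(0.1))

    print(full)  # essentially 100.0
    print(half)  # drifts visibly away from 100.0 (10-bit mantissa rounding)
    ```

    Accumulated per pixel over many shader operations, error of this kind shows up as visible shading artifacts.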

    Comment


    • #22
      If we get about 100 fps at FHD Ultra quality with an RX 590 on an unknown CPU (but likely better than my AMD FX-8350), can I assume I'll get at least 60 fps with my RX 580 and AMD FX-8350?

      Comment


      • #23
        Originally posted by birdie View Post

        Yeah, both did but for some reason only "NVIDIA's image is worse" is repeated over and over again.
        I think it's mostly because the Nvidia driver (at least on Windows) sets the color space to limited RGB by default in some cases, so people start assuming that AMD delivers a better image before even realizing that you just need to switch to full RGB.
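        For anyone unfamiliar with the difference: limited RGB maps each 0-255 component into 16-235. A minimal sketch (the function name is made up for illustration):

        ```python
        # Limited ("TV") RGB squeezes the full 0-255 range into 16-235. If the
        # display expects full range but receives limited-range values, pure
        # black is shown as dark grey and white is dimmed - a washed-out image.
        def full_to_limited(v: int) -> int:
            """Map a full-range 0-255 component onto the limited 16-235 range."""
            return round(16 + v * (235 - 16) / 255)

        print(full_to_limited(0))    # 16: black becomes dark grey
        print(full_to_limited(255))  # 235: white is dimmed
        ```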

        Comment


        • #24
          When testing games, I would like to see Steam Deck-comparable hardware (maybe a 4750G at 1280x800).

          I'll definitely give Hitman 3 a try once my unit arrives.

          Comment


          • #25
            Originally posted by birdie View Post

            This was the case almost 20 (!) years ago when NVIDIA sort of cheated on anisotropic filtering. Would be great if people stopped perpetuating this BS.

            Many people over the past decade have compared NVIDIA and AMD in terms of image quality and nothing has been found, e.g.

            There was that quality setting in the Nvidia control panel that did default to ugly.

            Comment


            • #26
              Originally posted by flower View Post
              When testing games, I would like to see Steam Deck-comparable hardware (maybe a 4750G at 1280x800).

              I'll definitely give Hitman 3 a try once my unit arrives.
              AMD never supplied me with any 4000 series APU.
              Michael Larabel
              https://www.michaellarabel.com/

              Comment


              • #27
                Originally posted by lucrus View Post
                If we get about 100 fps at FHD Ultra quality with an RX 590 on an unknown CPU (but likely better than my AMD FX-8350), can I assume I'll get at least 60 fps with my RX 580 and AMD FX-8350?
                Michael used a Ryzen 9 5950X, see comparison table at the beginning of the article.

                The worst that could happen in modern games is that they simply refuse to run on your CPU due to lack of extensions such as AVX2.
                But, since the minimum requirement on the AMD side is a Phenom II, I suppose you'll be just fine.

                Steam lets you refund the game within 14 days of purchase if your playtime is under two hours, so you should be able to test it safely.
                Happy gaming.

                Comment


                • #28
                  Originally posted by user1 View Post

                  I think it's mostly because the Nvidia driver (at least on Windows) sets the color space to limited RGB by default in some cases, so people start assuming that AMD delivers a better image before even realizing that you just need to switch to full RGB.
                  This was fixed/changed a long time ago, and it affected only some HDMI devices - not even all of them. But yeah, it was a recurring topic on many forums; NVIDIA just wanted to play it safe, since limited color space is what works best on TVs.

                  Comment


                  • #29
                    Originally posted by brent View Post

                    No, there's quite a bit more to it. On GeForce FX GPUs Nvidia cheated by using lower-precision pixel shader calculations, resulting in visibly worse image quality. They did that because high precision was not fully implemented in hardware and was very slow (the keyword is "partial precision"). They had to remove that cheat in later driver versions, resulting in huge performance drops. I think they also cheated in some other cases simply by using lower-precision buffers.
                    Are you talking about the FX 5000 series? Yeah, it was one huge fiasco, basically the entire lineup was horrible.

                    Comment


                    • #30
                      Hi Michael, why not use the 510.39.01 driver? 495 is too old; maybe a new test is needed.

                      Comment
