AMD Radeon RX 5600 XT Linux Gaming Performance


  • #21
    Originally posted by CochainComplex View Post

Mh, the difference is AMD's gimmicks are usually open... TressFX, OpenCL, FreeSync... ah, and there was Mantle, which set the groundwork for DX12 and Vulkan...
NVIDIA GameWorks libraries have been open source for ages and don't use any proprietary NVIDIA features - they use standard D3D11/12 features.



    • #22
      Originally posted by atomsymbol View Post

      It isn't true that Nvidia "invented tessellation" or "first implemented tessellation in HW". See http://rastergrid.com/blog/2010/09/h...-tessellation/
You're quite right. NVIDIA was, however, the first company to offer a really fast implementation (which AMD couldn't match for many years) and to popularize it among game developers.

      Originally posted by JPFSanders View Post

Mate, Nvidia has a long, long history of playing shenanigans in the video-game industry, playing dirty with game developers, and pushing proprietary crap into games (ahem, PhysX) that doesn't contribute in the slightest to making the games themselves better.

But the issue with Nvidia is not that; that's not important. The issue with Nvidia is that they keep degrading the Linux experience (ahem! EGL), or offer no support at all for some of their chipsets under Linux (Optimus). Also, good luck trying to make any classic Nvidia card no longer supported by their proprietary driver work on Linux.

Fortunately we're reaching the point where they're becoming a non-problem, as AMD cards behave better and better every day under Linux. Sure, their cards may do 10-20 fps more in games, and you and anybody else are free to buy an Nvidia card.

Me, I'm not supporting them until they produce a proper OSS driver and give Linux users the opportunity to run Mesa on their cards.

      Note, I do not hate Nvidia, I just want the best experience on Linux and Nvidia is not it.
NVIDIA will treat Linux as a first-class OS when it becomes one. As for playing shenanigans: almost no commercial company is immune to that. I vividly remember how AMD released $1000 AMD64 CPUs when Intel was stuck with their Pentium 4 architecture. AMD fans forget that so easily.
      birdie
      Senior Member
      Last edited by birdie; 21 January 2020, 12:51 PM.



      • #23
        Originally posted by birdie View Post
        I vividly remember how AMD released $1000 AMD64 CPUs when Intel was stuck with their Pentium 4 architecture. AMD fans forget that so easily.
There is a difference between charging a high price for a genuinely superior product and deliberately rigging benchmarks to merely make a product look that way. Intel even lost in court multiple times over their shenanigans, which made them pay AMD billions in damages. One such example is this, and there were also the Intel compiler issues.



        • #24
          Originally posted by birdie View Post
NVIDIA will treat Linux as a first-class OS when it becomes one.
          Why are you here birdie? Just to suffer?

          Originally posted by birdie View Post
As for playing shenanigans: almost no commercial company is immune to that. I vividly remember how AMD released $1000 AMD64 CPUs when Intel was stuck with their Pentium 4 architecture. AMD fans forget that so easily.
Birdie, go read my comment again: the market shenanigans are not what prevents me from having an Nvidia card; it's the poor support, because they refuse to implement their drivers the way they're supposed to be implemented on Linux.

          https://www.youtube.com/watch?v=IVpOyKCNZYw

Why doesn't Nvidia help this poor girl have a good experience on her laptop? Why, Birdie? Why?



          • #25
            Originally posted by birdie View Post
            and don't use any proprietary NVIDIA features - they use standard D3D11/12 features.
Doesn't change the fact that they are inefficient as fuck. Enabling HairWorks can still reduce performance in The Witcher 3 by over 30% for no particularly good reason, whereas Rise of the Tomb Raider has hair rendering that not only looks better than HairWorks does in any game that supports it, but also manages to do so without a noticeable performance impact (it's about 5% on vs. off).

            Turns out that using 64x tessellation together with geometry shaders and transform feedback all in one go isn't the best thing to do on any GPU.
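
As a rough illustration of why (the patch count below is an arbitrary placeholder, and the N-squared triangle count is an approximation for equal spacing), the cost of cranking the tessellation level grows quadratically:

def approx_triangles(patches: int, tess_level: int) -> int:
    # With equal spacing, a patch tessellated at level N emits on the order
    # of N*N triangles, so geometry grows quadratically with the level.
    return patches * tess_level * tess_level

PATCHES = 10_000  # arbitrary placeholder for a hair/fur mesh

for level in (8, 16, 64):
    print(f"level {level:2d}: ~{approx_triangles(PATCHES, level):,} triangles")

# 64x emits ~16x the geometry of 16x and ~64x that of 8x, and every one of
# those extra, mostly sub-pixel triangles still has to pass through the
# geometry shader and be streamed out via transform feedback.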



            • #26
              Originally posted by Michael
              the budget RX 5500 XT
              That's not a budget card. It's a midrange (maybe low midrange) card. Unfortunately, AMD doesn't make budget (<=$100) cards anymore. They have nothing to compete with the GT 1030, not counting using an APU.



              • #27
                Originally posted by birdie View Post

NVIDIA GameWorks libraries have been open source for ages and don't use any proprietary NVIDIA features - they use standard D3D11/12 features.
https://en.wikipedia.org/wiki/Nvidia_GameWorks - 50% of the article is about criticism of its closed nature.



                • #28
                  Originally posted by DanL View Post

                  That's not a budget card. It's a midrange (maybe low midrange) card. Unfortunately, AMD doesn't make budget (<=$100) cards anymore. They have nothing to compete with the GT 1030, not counting using an APU.
The APU is kinda the whole point. I'm using a Vega 10 right now and it annihilates everything discrete Nvidia has on the low end. I don't have one, but even Vega 8 puts Nvidia's low-end offerings to shame.

EDIT: To be perfectly fair, I'm not so sure I'll ever need another discrete GPU again. I'm just so pleased with the Vega 10's performance; it plays every game I throw at it, and most of them (though not all) at max settings.
                  duby229
                  Senior Member
                  Last edited by duby229; 21 January 2020, 04:50 PM.



                  • #29
                    Overclocking RX 5600 to RX 5700 voltages and frequencies should be fairly straightforward, assuming the RX 5600 is powered by at least one 8-pin connector (8-pin = max 150 Watts, 6-pin = max 75 Watts).
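
On Linux, the clock and power limits can be poked through the amdgpu overdrive files in sysfs. A minimal sketch, assuming the card shows up as card0, overdrive is unlocked via amdgpu.ppfeaturemask, and the script runs as root; the clock and power numbers below are placeholders, not validated RX 5700 values:

import glob

DEVICE = "/sys/class/drm/card0/device"   # assumes the Radeon is card0
OD_FILE = f"{DEVICE}/pp_od_clk_voltage"

def write(path: str, value: str) -> None:
    # Each overdrive command is a single short write to the sysfs file.
    with open(path, "w") as f:
        f.write(value)

# Print the current overdrive table and allowed ranges before changing anything.
with open(OD_FILE) as f:
    print(f.read())

write(OD_FILE, "s 1 1750")   # max core clock in MHz (placeholder value)
write(OD_FILE, "m 1 875")    # max memory clock in MHz (placeholder value)
write(OD_FILE, "c")          # commit the new table to the driver

# Optionally raise the board power limit; the hwmon value is in microwatts.
for cap in glob.glob(f"{DEVICE}/hwmon/hwmon*/power1_cap"):
    write(cap, str(160 * 1_000_000))  # 160 W stays within 8-pin + PCIe slot budget

Whether a given RX 5600 chip actually sustains RX 5700 clocks at sane voltages is another matter, of course.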



                    • #30
                      Originally posted by DanL View Post
                      That's not a budget card. It's a midrange (maybe low midrange) card. Unfortunately, AMD doesn't make budget (<=$100) cards anymore. They have nothing to compete with the GT 1030, not counting using an APU.
What about something like the RX 550? IIRC it's a bit faster than the GT 1030 and in the same price range.

                      Any smaller than that and you're in integrated graphics territory.

