Radeon GPUs Are Increasingly Competing With NVIDIA GPUs On Latest RadeonSI/RADV Drivers


  • #31
    Not meaning to dampen proceedings... but some efficiency data would have been nice to qualify the "performance" comparisons.



    • #32
      Originally posted by zanny View Post
      The year of open-source graphics is finally upon us - AMD finally has a first-class, high-end GPU that can go neck and neck with Nvidia in price/performance.

      And it's out of stock everywhere, and when it is in stock, it's at least twice its MSRP thanks to miners.

      What a buzzkill.
      Yeah, I was going to build a new rig some months ago, but I've given up hope and refuse to give in to the price hike. It's pretty much killing the PC-building market.
      Honestly, it's gotten to the point where I'm confused as to why GPU manufacturers don't just flood the market with GPUs to try to meet demand. The miners aren't going to stop any time soon, and the manufacturers would probably make a killing off of it.
      Last edited by computerquip; 19 March 2018, 10:48 PM.



      • #33
        Originally posted by entropy View Post
        Remember when J. Bridgman said many times that AMD folks expect the FOSS driver to eventually deliver about 70% of the performance of a closed-source driver (with all the magic)?
        It may still be true, though.

        Contrary to popular belief, AMD hardware is much better than similarly priced (MSRP) Nvidia hardware. Vega is a great architecture, more advanced than Pascal.

        "but but why it loses on muh gaemz on Windows?"

        Mostly because of Gimpworks(tm). And over-tessellation (though that is less of an issue these days). Notice that on Linux, various proprietary Nvidia black boxes are missing. So even though the open-source RadeonSI driver may not have all the cool optimizations of a mature closed driver, it is also not gimped by graphical effects and settings that are specifically designed to run better on Nvidia's DX11 drivers without providing any real benefit to graphical fidelity. So Polaris and Vega can finally stretch their TFLOP muscles and prove how good they are. And that's while I still suspect Mesa itself isn't that fast yet.

        I suspect Gimpworks(tm) is also the primary reason we don't see more Linux ports. I mean, when companies like Ubisoft have stopped creating game engines of their own and instead just create a few libraries to complement Gimpworks(tm), it makes sense that they can't port their games to Linux...



        • #34
          Originally posted by entropy View Post
          Remember when J. Bridgman said many times that AMD folks expect the FOSS driver to eventually deliver about 70% of the performance of a closed-source driver (with all the magic)?
          What I said (almost 10 years ago now) was that with a simple shader translator rather than a real shader compiler and without per-app performance investigation and optimization we should be able to reach 60-70% of closed source driver performance. In hindsight that was a bit optimistic - performance actually plateaued at more like 50-60%.

          IIRC the first crack in those numbers was when Vadim's "Shader Backend" (SB) optimizer started being enabled by default. The current driver has a "real" LLVM-based shader compiler plus a couple of years of per-app performance investigation and optimization by Marek, Nicolai and others.

          BTW I'm pretty sure I actually only said it once but then spent a few years correcting people who had misquoted me.
          Last edited by bridgman; 19 March 2018, 11:29 PM.



          • #35
            Originally posted by computerquip View Post
            Yeah, I was going to build a new rig some months ago, but I've given up hope and refuse to give in to the price hike. It's pretty much killing the PC-building market.
            Honestly, it's gotten to the point where I'm confused as to why GPU manufacturers don't just flood the market with GPUs to try to meet demand. The miners aren't going to stop any time soon, and the manufacturers would probably make a killing off of it.
            If they could, they would. They are in the business of selling products, after all. The problem is that the cards are getting bought up as fast as they can make them. I've read they are hesitant to invest in increasing their manufacturing output, because if the "using consumer GPUs for cryptocurrency mining" bubble bursts, they would be left with a glut of both product and manufacturing capacity. It makes more business sense to keep producing at current volume, knowing it will all sell quickly and at full price.



            • #36
              Originally posted by torsionbar28 View Post
              There is really no reason for any self-respecting open-source enthusiast to run Nvidia hardware any more. I sold my GTX Titan last year, replaced it with an RX 480, and have not looked back. All my Steam games play really smoothly on Ultimate settings. Everything just works, and works really well - no more binary blobs or proprietary kernel modules to compile. AMD FTW!
              I bought an RX 480 for home as well, and I am blown away by the performance, considering I don't have to fiddle with the drivers at all. It just works. It's remarkable, really.



              • #37
                Originally posted by computerquip View Post

                Honestly, it's to the point I'm confused as to why GPU manufacturers don't just flood the market with GPUs, to try and meet demand.
                It seems that the main reason for AMD is the memory shortage - at least according to their statement during the earnings call.

                AnandTech has an article about it: https://www.anandtech.com/show/12380...imiting-factor


                "Relative to just where we are in the market today, for sure the GPU channel is lower than we would like it to be, so we are ramping up our production. At this point we are not limited by silicon per se, so our foundry partners are supplying us, there are shortages in memory and I think that is true across the board, whether you are talking about GDDR5, or you’re talking about high bandwidth memory. We continue to work through that, with our memory partners and that will be certainly one of the key factors as we go through 2018."



                • #38
                  Originally posted by vito View Post
                  It seems that the main reason for AMD is the memory shortage...
                  GDDR5 should die as soon as possible; I think that's the real story here - there is no shortage, it's just that no one wants to use it at this point.

                  At the beginning of this "meltdowned" year, Samsung announced the start of large-scale production of GDDR6 memory... so they're probably all waiting for that.

                  http://www.tomshardware.com/news/sam...ory,36358.html
                  Last edited by dungeon; 20 March 2018, 12:41 AM.



                  • #39
                    Originally posted by dungeon View Post

                    GDDR5 should die as soon as possible; I think that's the real story here - there is no shortage, it's just that no one wants to use it at this point.

                    At the beginning of this "meltdowned" year, Samsung announced the start of large-scale production of GDDR6 memory... so they're probably all waiting for that.

                    http://www.tomshardware.com/news/sam...ory,36358.html


                    Interesting article, but I'm not sure how it affects the production of Vega (unless I'm misunderstanding what you're trying to say).

                    AMD's Vega cards use high-bandwidth memory (HBM), not GDDR5 or GDDR6. AMD has explicitly said that the memory shortage is the problem, and they point out that there is a shortage of HBM too, which I thought was pretty clear in the quote:
                    "Relative to just where we are in the market today, for sure the GPU channel is lower than we would like it to be, so we are ramping up our production. At this point we are not limited by silicon per se, so our foundry partners are supplying us, there are shortages in memory and I think that is true across the board, whether you are talking about GDDR5, or you’re talking about high bandwidth memory. We continue to work through that, with our memory partners and that will be certainly one of the key factors as we go through 2018."



                    • #40
                      Originally posted by torsionbar28 View Post
                      If they could, they would. They are in the business of selling products after all. The problem is they are getting bought up as fast as they can make them. I've read they are hesitant to invest in increasing their manufacturing output, because if the "using consumer GPU for cryptocurrency mining" bubble bursts, they now have a glut of both product and manufacturing capacity. Makes more business sense to keep producing at current volume knowing they will all sell quickly and at full price.
                      The problem with this approach is that they are losing mindshare. AMD are selling all the cards they make at full price, but those cards don't end up in actual consumers' hands - they end up in mining farms. That means less real application market share, less bug fixing, less user mindshare, less advertisement, and less incentive for game and application developers to actually optimise for AMD hardware.

                      In the end, they may be selling well in the short term but hurting their long-term position. Increasing volume in the short term, if only to help get more cards into gamers' hands, is worth the risk in my opinion. The alternative - surrendering the whole gaming market to Nvidia - is much worse than the bubble bursting and leaving some inventory unsold.

