How The Radeon OpenGL Performance Has Evolved From The HD 2900XT To RX Vega

  • #21
    Michael, thanks for the test.

    I think the power consumption could be reduced at a small performance hit by setting the BIOS switch to the lower position. I have bought a Vega 64 that should arrive next week, so I'll be able to try this and also port the patch I made for Polaris to underclock the card.
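
    In the meantime, here is a minimal sketch of capping the board power from userspace, assuming the kernel's amdgpu driver exposes the standard hwmon power1_cap file (values in microwatts) for the card; the card0 index and the 20% cut are illustrative, and writing the cap requires root:

        import glob

        # Locate the card's hwmon directory (card0 is illustrative; adjust as needed)
        hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

        def read_microwatts(name):
            with open(f"{hwmon}/{name}") as f:
                return int(f.read())

        cap = read_microwatts("power1_cap")          # current power limit
        cap_max = read_microwatts("power1_cap_max")  # board maximum
        print(f"cap: {cap / 1e6:.0f} W (max: {cap_max / 1e6:.0f} W)")

        # Drop the limit by ~20%; the driver lowers clocks/voltage to stay under it
        with open(f"{hwmon}/power1_cap", "w") as f:
            f.write(str(int(cap * 0.8)))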

    • #22
      I get the impression that AMD had to work around a lot of hardware bugs in these Vega cards, and that the hardware should really be capable of more if they had turned out to work as expected. Still, maybe driver optimizations will help a lot of that as they figure out what can be done.

      • #23
        Originally posted by smitty3268 View Post
        I get the impression that AMD had to work around a lot of hardware bugs in these Vega cards, and that the hardware should really be capable of more if they had turned out to work as expected. Still, maybe driver optimizations will help a lot of that as they figure out what can be done.
        If you look at performance per watt, and factor in what it will cost you to run the card over a year of ownership, there are three standouts: the R9 290, the RX 480, and the Vega 56. If you were going to own the card for exactly one year, the RX 480 would probably give you the best bang for the buck, and this bears out what I've said several times before: you're probably best off buying year-old GPUs from AMD if you care about price/performance (and open drivers).

        The R9 290 remains competitive because its drivers had a long time to mature, which has given it higher efficiency than its architectural age would suggest. That hurts AMD's profitability: there's much less of a difference between their two-generation-old hardware and their latest than there would be if they launched with higher-quality drivers, and that dampens consumers' motivation to upgrade.

        Now compare NVidia's offerings: their latest architecture, and the higher models within it, are almost always the performance-per-watt winners. When AMD has the same quality of drivers as NVidia at launch, AMD will be able to command the same or better pricing than NVidia does, they'll be more profitable, and we'll end up settling for lower-quality hardware with higher-quality drivers at the same price/performance point... (hmm... like what we do when we choose NVidia today!) I'm not really sure whether that will be a consumer win lol, so I'm glad AMD is differentiating their products in this way.

        Today I would probably buy the Vega 56: it is giving very decent performance/watt numbers, it will only get better, in a year when it comes time to think about a new card it will probably be beating the next generation, and even a year after that it will still be competitive, like the R9 290 is now! :-)
        Last edited by linuxgeex; 16 August 2017, 05:51 AM.

        • #24
          linuxgeex
          On performance per watt: people have done the calculations assuming an average of 4 h of GPU usage (gameplay) per day, and they ended up with differences of roughly $2 to $20 per year on the electricity bill. So performance per watt still matters a lot for professional use: a company running, say, 100 of those GPUs 24/7, or at least two shifts (16 h out of 24), sees a huge difference in aggregate. For us regular Joes it makes no real difference in the end, a few beers less per year, so maybe it's even better for your health to buy the more power-hungry hardware. (A quick worked example is below.)
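
          To make that concrete, here is a back-of-the-envelope sketch; the 100 W delta, 4 h/day, and $0.15/kWh price are illustrative assumptions, not measured numbers:

              watts_delta = 100     # extra draw of a hungrier card vs. an efficient one
              hours_per_day = 4     # casual daily gaming
              price_per_kwh = 0.15  # USD; varies a lot by region

              home_kwh = watts_delta * hours_per_day * 365 / 1000
              print(f"home: {home_kwh:.0f} kWh/year -> ${home_kwh * price_per_kwh:.2f}/year")  # 146 kWh -> $21.90

              # The same delta across 100 cards running two shifts (16 h/day)
              farm_kwh = watts_delta * 16 * 365 / 1000 * 100
              print(f"farm: {farm_kwh:,.0f} kWh/year -> ${farm_kwh * price_per_kwh:,.0f}/year")  # 58,400 kWh -> $8,760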

          • #25
            Originally posted by vipor29 View Post

            i have a core i7-7700k which is pretty high end lol
            Well, then the only explanation is that in this scenario Nvidia sucks, isn't it?

            • #26
              Originally posted by smitty3268 View Post
              I get the impression that AMD had to work around a lot of hardware bugs in these Vega cards, and that the hardware should really be capable of more if they had turned out to work as expected. Still, maybe driver optimizations will help a lot of that as they figure out what can be done.
              Is this just a personal impression, or do you have a source?

              Considering that Vega feels like it is fairly late to market, you would think that the AMD Radeon Technology Group would have had time to respin the silicon if they found an unexpected number of bugs...?

              Alternatively, it could just be that the architecture is so 'new' that RTG hasn't yet found a good way to map current graphics software onto its drivers and hardware?

              But sure, it could also be a combination of both.

              Whatever the case, I'm hoping that yields improve and that more experience with the Vega uarch will make its performance rise to between 1080 and 1080Ti levels in older games and to 1080Ti levels in newer (DX12/Vulkan) games. But I readily admit that this may be nothing more than pure, untempered blue sky optimism on my part...

              • #27
                Originally posted by ermo View Post

                Is this just a personal impression, or do you have a source?
                No real source, just a feeling. It's an impression I get from the late launch, and if you read Anandtech's review you can see that things were really down to the very last minute there. Plus the architecture seems less efficient than I would have thought: it doesn't seem like they've made the power-efficiency improvements I would have expected from the architectural changes. And there was some commit in Mesa that said something like "the official limit of waves is now 128 to avoid hangs" (I'm making the number and terms up, but it was something like that; they were clearly hoping to go higher but had to turn it down). I expect that's all typical and probably happens to all hardware at launch, it's just a bit more visible with OSS drivers, but combined with everything else it makes me think there may have been more problems than they had hoped for.
