Radeon RX Vega On Linux: High-Performance GPUs & Open-Source No Longer An Oxymoron

  • #51
    Honestly, I'm kinda disappointed here. I feel like the AMD cards should be around $50 cheaper, if not more. They go for about the same price as Nvidia; at least looking around in Canada, the difference is less than twenty bucks between a 1080 and a Vega 64. Except the Nvidia seems to be showing more reliable performance and much better power usage. This isn't exactly blowing up my skirt here; this is lukewarm at best.

    That said, open-source drivers are much nicer, and FreeSync is much cheaper than G-Sync, so... that's probably enough to make me prefer it over Nvidia. It's just nothing that excites me in any way. I realize the drivers will get better over time and these are the worst possible numbers, yes... but still, this is nothing special after all that hype. Yawn.


    • #52
      bridgman or agd5f, can you say something about "Primitive Shaders"? Can you confirm what I assumed in post #8?
      I assumed it is related to this:



      Starting with GFX9 chips, some shader stages are merged
      I assumed there are fewer hardware stages compared to earlier GCN GPUs, so the compiler merges stages, and there might be performance benefits from fewer state changes and improved data management.

      Marketing describes the new "Primitive Shader" (which I thought was a new hardware stage equivalent to the vertex + geometry API stages) as follows:


      I was convinced this means that e.g. cluster culling could be done more efficiently in the geometry shader compared to doing it via compute shaders. But developers would have to implement it in their geometry/"primitive" shaders ("Enables early primitive culling in shaders"). Calling it a "programmable shader-based approach" fits that theory.



      Now I'm not entirely sure, because some reviewers claim that it is currently disabled and that the "NGG Fast Path" (which I assumed is some reference implementation of a culling shader) is something to be enabled via the driver for older games.



      And Rys Sommerfeldt said that the NGG Fast Path would be enabled transparently: https://twitter.com/ryszu/status/896413294021529600
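      The "early primitive culling" idea above can be sketched as a toy in plain Python. This is purely illustrative under my own assumptions (2D screen-space triangles, back-face culling as the example test), not AMD's actual primitive-shader pipeline:

```python
# Toy illustration of early primitive culling: throw away back-facing
# triangles per primitive, before any rasterization or pixel shading
# happens. The 2D setup and all names here are illustrative assumptions,
# not AMD's actual primitive-shader implementation.

def signed_area(tri):
    """Twice the signed area of a 2D triangle (counter-clockwise > 0)."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    return (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)

def cull_primitives(triangles):
    """Keep only front-facing (counter-clockwise) triangles, saving all
    downstream per-pixel work for the discarded ones."""
    return [t for t in triangles if signed_area(t) > 0]

tris = [
    [(0, 0), (4, 0), (0, 4)],  # counter-clockwise: front-facing, kept
    [(0, 0), (0, 4), (4, 0)],  # clockwise: back-facing, culled
]
visible = cull_primitives(tris)
```

      The point of doing this in a shader stage rather than in fixed-function hardware later is that the discarded triangle never costs any rasterizer or pixel-shader work at all.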


      • #53
        Good job, Michael, getting straight to the point instead of repeating the same blabla for the millionth time. It's good to see the card performing so well, but even though I waited for Vega to release before buying my new card, I guess I'll go for the 580 (and wait a million years till it's finally available) because of the TDP. I prefer a quieter card when idle over the best performance while gaming.


        • #54
          Originally posted by Shevchen View Post

          And the reason for the poor scaling is the mostly brute-force approach with which we render games these days. The first benchmark shows that we can have the opposite effect - and this is even without the ad-hoc improvements we get later on "simply" by switching some features on..
          Right. The secret behind Nvidia's power efficiency is the use of tile-based rasterization, which they acquired from the shell of 3dfx. The technology itself is not new, as it was used on the PowerVR PCX, the Neon 250, and the Kyro series on desktop PCs.
          Vega supposedly has a similar feature, but it appears to be disabled in the driver. Note that it is amazing what AMD did with a much tighter budget.
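          The first step of tile-based rasterization can be sketched as a toy binning pass. This is a minimal illustration under assumed names and an assumed tile size, not any vendor's real design:

```python
# Toy illustration of tile-based rasterization's binning step: assign each
# triangle to the fixed-size screen tiles its bounding box overlaps, so a
# tile's pixels can later be shaded entirely from fast on-chip memory
# instead of repeated DRAM traffic. Tile size and names are assumptions.

TILE = 16  # tile edge length in pixels (illustrative)

def overlapped_tiles(tri):
    """Yield (tile_x, tile_y) indices covered by the triangle's bounding box."""
    xs = [x for x, _ in tri]
    ys = [y for _, y in tri]
    for ty in range(min(ys) // TILE, max(ys) // TILE + 1):
        for tx in range(min(xs) // TILE, max(xs) // TILE + 1):
            yield (tx, ty)

def bin_triangles(triangles):
    """Map each tile to the indices of the triangles that may touch it."""
    bins = {}
    for i, tri in enumerate(triangles):
        for tile in overlapped_tiles(tri):
            bins.setdefault(tile, []).append(i)
    return bins

bins = bin_triangles([
    [(0, 0), (20, 0), (0, 20)],      # bbox spans tiles (0,0)..(1,1)
    [(40, 40), (44, 40), (40, 44)],  # bbox fits entirely in tile (2,2)
])
```

          Once binned, the GPU can rasterize and shade one tile at a time, which is where the bandwidth (and hence power) savings come from.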


          • #55
            Originally posted by salsadoom View Post
            Honestly, I'm kinda disappointed here. I feel like the AMD cards should be around $50 cheaper, if not more. They go for about the same price as Nvidia; at least looking around in Canada, the difference is less than twenty bucks between a 1080 and a Vega 64. Except the Nvidia seems to be showing more reliable performance and much better power usage. This isn't exactly blowing up my skirt here; this is lukewarm at best.

            That said, open-source drivers are much nicer, and FreeSync is much cheaper than G-Sync, so... that's probably enough to make me prefer it over Nvidia. It's just nothing that excites me in any way. I realize the drivers will get better over time and these are the worst possible numbers, yes... but still, this is nothing special after all that hype. Yawn.
            Think about AMD's current position and consider their long-term strategy. With time, you will notice the difference once the open-source driver matures. There is a reason why AMD GPUs are nicknamed FineWine.


            • #56
              Seems foolish with the heat and power consumption. People will probably opt for Intel + Nvidia. AMD has really poor pricing at retailers due to lower sales, so unless you're a die-hard fan you'd probably just avoid AMD. I simply don't get why people can be so biased towards AMD, which is always playing catch-up. The competition will simply always have an extra gear.


              • #57
                Congratulations to AMD on the Vega launch! These are some very impressive numbers. Of course, there are always a few bumps in the road at launch; that happens to nearly all products. But the free driver performance in particular is very impressive.
                Stop TCPA, stupid software patents and corrupt politicians!


                • #58
                  Great review. It's interesting that Vega 64 can't come close to the 1080 Ti on Windows (at least in games), yet on GNU/Linux it beats it in some cases with the free/open-source driver. That's just amazing. I really don't know if AMD contributes to Mesa (there was talk before, and I think it did; I don't know now...), but they should seriously rethink their support model if they don't contribute. AMD users are blessed on Linux for sure, with the many advantages of free/open-source drivers.

                  As for the GPU itself, well, it is a step forward; at the right price, I'm sure people will forgive the power consumption for now.


                  • #59
                    Originally posted by leipero View Post
                    and I really don't know if AMD contributes to Mesa (there was talk before, and I think it did; I don't know now...),
                    They contribute to Mesa quite a bit.


                    • #60
                      Originally posted by salsadoom View Post
                      Honestly, I'm kinda disappointed here. I feel like the AMD cards should be around $50 cheaper, if not more.
                      That's crazy. If you don't want to put out your $$$ for something new, then just wait. I bought my R9 290 long after it came out, after it had been used in a mining rig, and it's still running about as well as much more expensive cards. AMD has some incredible long-term value that I often don't see in the technology market.

                      Besides that, this card is already great at games, and it's fantastic for machine learning, mining, and, I'm assuming, other OpenCL workloads. It's already a great value for the performance, and it's brand new. The gaming performance can only get better.
