AMD Announces Radeon RX 7900 XTX / RX 7900 XT Graphics Cards - Linux Driver Support Expectations


  • #91
    Originally posted by finalzone View Post
Current implementations of real-time ray tracing in games are unimpressive, to the point that a well-crafted rasterized version can match them with hardly any visual difference and a smaller performance hit. A high-quality game on an old console like the Nintendo Switch, compared with, say, its PlayStation 5 version, proves what a skilled game developer can do with the available tools. Consoles and mid-to-low-end systems are the larger market, where game developers make their money.


The site basically compares an MSRP $1100 older-generation card against an MSRP $1600 newer one for just 1.7 times the performance, while seemingly overlooking that the MSRP $1000 Radeon 7900 XTX comes within about 20% of the latter. AMD basically outsmarted Nvidia on performance per price, as Nvidia has yet to release a similarly priced card.
AMD made some comments about their RDNA3 cards being more future-proof. I disagree: as ray-tracing adoption continues, it is Nvidia's cards that will have the longer longevity, similar to how the GTX 400 series outlived the HD 5000/6000 series once tessellation was widely adopted.

You have to remember that Nvidia's consumer cards are GPGPUs, while with RDNA, AMD switched to making gaming GPUs. It's very rare to see an AMD RDNA card mentioned in publications, whereas you'll see plenty of RTX cards used; the same goes for content creation, thanks to Nvidia's Studio drivers. At the end of the day, Nvidia's GPUs target multiple markets, while AMD's consumer cards, with their poor performance in those other markets, are restricted to gaming.



    • #92
      Originally posted by coder View Post
      The problem is China's IP laws, where the government can compel any employee to divulge any technical details for "national security" reasons, though China has a broad definition of national security that even sometimes includes economic interests. To protect your IP from that, you basically have no choice but to limit how much of your IP makes it into China. Or, would you prefer to see China rob the world of all its IP and then run all non-Chinese companies out of business? The national security aspect merely ups the stakes.

While I agree with what you have said about China, the USA also "has a broad definition of national security that even sometimes includes economic interests", and actually "sometimes" should be replaced with "always", for both countries.

The economic sanctions that the USA has previously imposed on Huawei, and now on the Chinese manufacturers of SSDs and GPUs, are unlikely to have had any significant impact on the Chinese military, but they have been extremely successful in completely changing the competitive markets for several key consumer products, e.g. smartphones and SSDs, allowing US companies like Qualcomm and Micron to maintain their dominant positions in those markets, exactly when they were on the verge of losing to Chinese competitors (immediately prior to the recent sanctions, Apple had qualified Chinese SSDs for its products, but this supplier replacement was obviously cancelled after the sanctions).

By removing their competitors from the global markets, the USA guarantees higher purchase prices for many consumer products all over the world, which is not a goal that can attract the sympathy of the citizens of other countries, who care much less about the national security interests of the USA than about their personal budgets.



      • #93
        Originally posted by WannaBeOCer View Post

        The RTX 4090 also gets 50%+ more performance per watt compared to a RX 6950 XT: https://www.guru3d.com/articles-page...review,30.html

The only special feature the 7000 series has over Nvidia is DP 2.1, which is useless unless people use FSR to get beyond 240 FPS at 4K, since DSC already handles 4K/240Hz and 8K/120Hz. Looking at actual power consumption when not overclocked, the RTX 4090 competes normally: https://tpucdn.com/review/nvidia-gef...wer-gaming.png

The main special feature of the Radeon 7000 series, at least for those who do not care about games (because the gaming performance remains to be seen in future reviews), is much better performance per dollar than the NVIDIA GPUs offer. DisplayPort 2.1 is just a bonus.
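
As a rough sanity check on the DP 2.1 vs. DSC point quoted above, here is a back-of-the-envelope bandwidth comparison; the blanking overhead is an assumed reduced-blanking figure and the link payload rates are the nominal ones, so treat the numbers as approximate:

Code:
# Rough sketch: uncompressed bandwidth needed for 4K/240 vs. DisplayPort payload rates.
# Blanking figures are assumptions (CVT-RB-style reduced blanking); real timings vary.
def video_gbps(h, v, hz, bpp, h_blank=80, v_blank=90):
    pixel_clock = (h + h_blank) * (v + v_blank) * hz   # pixels per second
    return pixel_clock * bpp / 1e9                     # Gbit/s

links = {                                    # effective payload after line coding
    "DP 1.4a HBR3 (8b/10b)": 25.92,
    "DP 2.1 UHBR13.5 (128b/132b)": 52.4,
    "DP 2.1 UHBR20 (128b/132b)": 77.6,
}

for bpp, label in [(24, "4K/240 8-bit RGB"), (30, "4K/240 10-bit RGB")]:
    need = video_gbps(3840, 2160, 240, bpp)
    print(f"{label}: ~{need:.1f} Gbit/s uncompressed")
    for name, cap in links.items():
        print(f"  {name}: {'fits uncompressed' if cap >= need else 'needs DSC'}")

Under those assumptions, both statements are compatible: DSC is what makes 4K/240 possible over DP 1.4a, while DP 2.1 has enough raw link bandwidth to do it without compression.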


If AMD had not offered better performance per dollar, there would not have been any reason to buy an AMD GPU, because the NVIDIA GPUs have much better software support, including on Linux, where the NVIDIA drivers, settings, CUDA libraries, SDKs etc. work very well.

Both AMD and NVIDIA have chosen prices for their second-best card so that their top card has much better performance per dollar, even if at a higher price. In this way they can milk both the buyers who can afford whatever provides the best return for their money and those with a limited budget, who will be content with the slower but less expensive card.

        The price and performance (measured as FP32 throughput) for Radeon 7900XT have been chosen so that it matches RTX 4080 in performance, but it matches RTX 4090 in performance per dollar.

        For Radeon 7900XTX, the performance is 77% of that of RTX 4090, but the price is only 62%, so the performance per dollar is higher by 25% for AMD.
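
A quick check of that arithmetic, using the launch MSRPs ($999 vs. $1599) and taking the 77% relative-performance figure above as given:

Code:
# Quick check of the perf-per-dollar claim above.
perf_ratio = 0.77                 # 7900 XTX vs. RTX 4090 peak FP32, per the post
price_ratio = 999 / 1599          # launch MSRPs, ~0.62

advantage = perf_ratio / price_ratio - 1
print(f"price ratio: {price_ratio:.0%}")     # ~62%
print(f"perf/$ advantage: {advantage:.0%}")  # ~23%, roughly the 25% quoted above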


So, for non-gaming applications, the buying decisions are already clear: one can choose NVIDIA GPUs to minimize the cost, in time or in money, of software development, or one can choose AMD GPUs to minimize the hardware cost for a desired performance target.


The performance per watt seems to be similar for the Radeon 7900 XTX and the RTX 4090. NVIDIA has simply chosen to make a bigger and more power-hungry GPU to obtain proportionally greater performance. If, however, someone is not willing to provide an adequate case and power supply for the huge RTX 4090, the RTX 4080 has such bad performance per dollar compared with any AMD GPU that it cannot be considered a wise purchase.





        • #94
          I'm more interested in NAVI 32.



          • #95
Listen people, just take the numbers: 12K AMD shaders = 18K Nvidia ones, because AMD's ALUs are triple-issue (2 FP + 1 INT) rather than 2 FP or 2 INT. Then each AMD shader group can handle about 50% more dedicated RT work by moving BVH traversal into fixed-function hardware, which is what Nvidia does. They will run at around the same frequency, except for the front end, which AMD users can overclock to around 3 GHz permanently, regardless of the shader clock and power savings. AMD will have equivalent software features: upscaling, frame generation, low latency, video processing. AMD's bandwidth is unprecedented. AMD will have better vector and scalar throughput, while Nvidia will still have better matrix calculations.

So it's about the same GPU, differing only in size, with whatever that means for price and power consumption: AMD is 58 billion transistors while Nvidia is 76 billion, about 30% heavier. An equivalently sized AMD part would be one with 16K shaders, but that would be a lot faster and still cheaper thanks to the multi-chip design.
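
To put those shader counts in perspective, here is a rough sketch of how peak FP32 throughput falls out of ALU count, issue width and clock; the dual-issue factor for RDNA3 and the boost clocks are the commonly quoted figures, not measured numbers:

Code:
# Rough sketch: peak FP32 TFLOPS = ALUs x FMAs-per-ALU-per-clock x 2 FLOPs/FMA x clock.
def peak_tflops(alus, fma_per_clock, ghz):
    return alus * fma_per_clock * 2 * ghz / 1000

print("7900 XTX:", round(peak_tflops(6144, 2, 2.5), 1), "TFLOPS")   # ~61, counting dual-issue
print("RTX 4090:", round(peak_tflops(16384, 1, 2.52), 1), "TFLOPS") # ~83

How much of the dual-issue capacity real shader code can actually use is a separate question, so both the "12K vs. 18K shaders" framing and the paper TFLOPS should be taken loosely.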



            • #96
              Originally posted by AdrianBc View Post
While I agree with what you have said about China, the USA also "has a broad definition of national security that even sometimes includes economic interests", and actually "sometimes" should be replaced with "always", for both countries.
              That's really not true. Trump was the first to invoke national security powers to impose tariffs for transparently economic reasons. Furthermore, the US has long prioritized free trade, in spite of the damage it's done to several key domestic industries.

              Originally posted by AdrianBc View Post
The economic sanctions that the USA has previously imposed on Huawei, and now on the Chinese manufacturers of SSDs and GPUs,
Those are extremely recent developments, yet you say "always"?

              Originally posted by AdrianBc View Post
              are unlikely to have had any significant impact on the Chinese military, but they have been extremely successful in completely changing the competitive markets for several key consumer products, e.g. smartphones and SSDs, allowing US companies like Qualcomm and Micron to maintain their dominant positions in those markets,
              I can't say much about the latest round of trade restrictions. I think we can probably agree that if they're not part of a coherent strategy, they'll ultimately be short-lived in their impact and probably even self-defeating (i.e. by forcing China to become more self-sufficient in fab capability and supply chain). They do seem to have gotten Xi's attention.

              Originally posted by AdrianBc View Post
exactly when they were on the verge of losing to Chinese competitors (immediately prior to the recent sanctions, Apple had qualified Chinese SSDs for its products,
              You can probably find many such points, in the past couple decades, when applying such sanctions might've seemed suspicious. There's a bigger picture and that's what matters. The thing I can't say with much certainty is exactly how it's being framed.

              Originally posted by AdrianBc View Post
By removing their competitors from the global markets, the USA guarantees higher purchase prices for many consumer products all over the world,
That would be rather short-sighted, don't you think? I really doubt that's the goal.



              • #97
                Originally posted by AdrianBc View Post
Both AMD and NVIDIA have chosen prices for their second-best card so that their top card has much better performance per dollar, even if at a higher price. In this way they can milk both the buyers who can afford whatever provides the best return for their money and those with a limited budget, who will be content with the slower but less expensive card.
Perhaps the bigger concern they have is to minimize the pricing impact on previous-gen products they still have in inventory and in the sales channel. Once those inventories dwindle, they can always re-price their second-tier models lower.

                I think it's pretty rare for the top-tier card to offer the best perf/$, because most people willing to spend that much are out for the best performance at (almost) any cost.

                Originally posted by AdrianBc View Post
                The price and performance (measured as FP32 throughput) for Radeon 7900XT have been chosen so that it matches RTX 4080 in performance, but it matches RTX 4090 in performance per dollar.

                For Radeon 7900XTX, the performance is 77% of that of RTX 4090, but the price is only 62%, so the performance per dollar is higher by 25% for AMD.
                You're reading waaay too much into fp32 TFLOPS, here. They have traditionally correlated poorly with actual gaming performance.

                Originally posted by AdrianBc View Post
If, however, someone is not willing to provide an adequate case and power supply for the huge RTX 4090, ...
                Don't forget to include an active fire-suppression system! Those 16-pin power connectors are a disaster!



                • #98
                  Originally posted by WannaBeOCer View Post

AMD made some comments about their RDNA3 cards being more future-proof. I disagree: as ray-tracing adoption continues, it is Nvidia's cards that will have the longer longevity, similar to how the GTX 400 series outlived the HD 5000/6000 series once tessellation was widely adopted.

You have to remember that Nvidia's consumer cards are GPGPUs, while with RDNA, AMD switched to making gaming GPUs. It's very rare to see an AMD RDNA card mentioned in publications, whereas you'll see plenty of RTX cards used; the same goes for content creation, thanks to Nvidia's Studio drivers. At the end of the day, Nvidia's GPUs target multiple markets, while AMD's consumer cards, with their poor performance in those other markets, are restricted to gaming.
                  Indeed!
Even for gaming, AMD GPUs suck: on Linux you cannot count on playing every game, since we cannot possibly have 100% or near-100% compatibility with all games, and AMD still refuses to support even a simple feature like SR-IOV.
As for compute, the support has always sucked, and sucked big time: that awful ROCm does not come installed by default on any distro the way the Mesa drivers do, and if you tried to install it yourself it was very hard, with many errors and problems.
So the whole "good for gaming" marketing that AMD keeps pushing looks to me like false advertising, because there is no gaming at all when you cannot even start a game that is not supported by WINE.
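
For what it's worth, once ROCm is installed the quickest sanity check I know of is from a ROCm build of PyTorch (this is just a sketch assuming such a build is present; it is not an installation guide):

Code:
# Minimal sketch: check whether a ROCm/HIP-enabled PyTorch build can actually see a GPU.
import torch

print("PyTorch:", torch.__version__)
print("HIP runtime:", getattr(torch.version, "hip", None))  # None on CUDA/CPU-only builds
if torch.cuda.is_available():                # ROCm builds reuse the torch.cuda API
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No usable GPU device found")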



                  • #99
                    Originally posted by coder View Post
                    That's really not true. Trump was the first to invoke national security powers to impose tariffs for transparently economic reasons. Furthermore, the US has long prioritized free trade, in spite of the damage it's done to several key domestic industries.
The USA's sanction wars to protect its own companies have a long history. I remember back in the 90s, when I was much more into politics, they were already playing that game.

Originally posted by coder View Post
I can't say much about the latest round of trade restrictions. I think we can probably agree that if they're not part of a coherent strategy, they'll ultimately be short-lived in their impact and probably even self-defeating (i.e. by forcing China to become more self-sufficient in fab capability and supply chain).
This has already happened. If the USA hadn't cut them off from TSMC, they would never have bothered to build their own fabs, because competing with TSMC is technically nearly impossible, and financially as well.
Now they have their own CPUs and GPUs and a quarter of the world's population to sell that shit to. As soon as the West makes a mistake, whoosh, China is there to dominate the market with no dependency on the West.

Originally posted by coder View Post
You can probably find many such points, in the past couple decades, when applying such sanctions might've seemed suspicious. There's a bigger picture and that's what matters. The thing I can't say with much certainty is exactly how it's being framed.
Do you still believe what politicians tell you? The reasons for almost all their wars were lies. If you want to get a feel for the real reasons, you need to follow the money.



                    • Originally posted by AdrianBc View Post


The main special feature of the Radeon 7000 series, at least for those who do not care about games (because the gaming performance remains to be seen in future reviews), is much better performance per dollar than the NVIDIA GPUs offer. DisplayPort 2.1 is just a bonus.


If AMD had not offered better performance per dollar, there would not have been any reason to buy an AMD GPU, because the NVIDIA GPUs have much better software support, including on Linux, where the NVIDIA drivers, settings, CUDA libraries, SDKs etc. work very well.

Both AMD and NVIDIA have chosen prices for their second-best card so that their top card has much better performance per dollar, even if at a higher price. In this way they can milk both the buyers who can afford whatever provides the best return for their money and those with a limited budget, who will be content with the slower but less expensive card.

                      The price and performance (measured as FP32 throughput) for Radeon 7900XT have been chosen so that it matches RTX 4080 in performance, but it matches RTX 4090 in performance per dollar.

                      For Radeon 7900XTX, the performance is 77% of that of RTX 4090, but the price is only 62%, so the performance per dollar is higher by 25% for AMD.


So, for non-gaming applications, the buying decisions are already clear: one can choose NVIDIA GPUs to minimize the cost, in time or in money, of software development, or one can choose AMD GPUs to minimize the hardware cost for a desired performance target.


The performance per watt seems to be similar for the Radeon 7900 XTX and the RTX 4090. NVIDIA has simply chosen to make a bigger and more power-hungry GPU to obtain proportionally greater performance. If, however, someone is not willing to provide an adequate case and power supply for the huge RTX 4090, the RTX 4080 has such bad performance per dollar compared with any AMD GPU that it cannot be considered a wise purchase.


Performance per dollar would only apply to gaming where the 7000 series is concerned. In rendering and deep learning, the RX 6000 series was getting wrecked by Nvidia's mid-range GPUs, which cost less than the RX 6900/6950. We still have no clue how well the 7000 series does in deep learning. At first I thought AMD's AI accelerators were the matrix cores from CDNA, but reading other articles it seems they are just inference-oriented cores meant to accelerate FSR3. I bought a Titan RTX back in 2018 because of the Titan drivers and the unlocked tensor-core performance, while the GeForce cards are limited to 50% of it. $2500 seems like a lot for a GPU, but it paid off, since I've had that performance since 2018. The same can be true for ray-traced games: pay a bit extra now and have a card that lasts longer.
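
As an aside, the tensor-core gap mentioned above is easy to estimate yourself; a minimal sketch, assuming a CUDA (or ROCm) build of PyTorch and any GPU with FP16 support, that times a large half-precision matmul:

Code:
# Rough FP16 matmul throughput estimate; 2*M*N*K FLOPs per matmul.
import time
import torch

M = N = K = 8192
a = torch.randn(M, K, device="cuda", dtype=torch.float16)
b = torch.randn(K, N, device="cuda", dtype=torch.float16)

for _ in range(3):            # warm-up, lets the backend pick its kernels
    a @ b
torch.cuda.synchronize()

iters = 20
start = time.time()
for _ in range(iters):
    a @ b
torch.cuda.synchronize()

print(f"~{2 * M * N * K * iters / (time.time() - start) / 1e12:.1f} FP16 TFLOPS")

On Turing, the GeForce cards run FP16 with FP32 accumulation at half the rate of the Titan RTX, so a number like this is where that 50% limit would show up, depending on which accumulation mode the backend picks.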

                      https://techgage.com/article/nvidia-...ring-champion/


                      Old article with the RTX 30 series/Titan RTX

                      https://techgage.com/article/best-gp...-blender-more/
                      Last edited by WannaBeOCer; 05 November 2022, 06:08 PM.

