
Radeon Vega 12 Support Called For Pulling Into Linux 4.17 Kernel


  • #11
    Originally posted by oooverclocker View Post
    I would consider a Vega 10 (64-CU Vega) refresh advertised as 12nm with slightly higher boost.
    I don't think so. The refreshed Polaris was, if I remember right, Polaris 20/21, so the Vega refresh would be Vega 20, which has been spotted in the past.



    • #12
      Mesa 18.1-devel Git, but from that perspective it's a trivial addition and could be easily back-ported to Mesa 18.0 too
      Will there be a Mesa 18.0?
      We have 18.0.0-rc5, but there are release notes for 18.1.0.



      • #13
        Originally posted by Nille_kungen View Post
        Mesa 18.1-devel Git, but from that perspective it's a trivial addition and could be easily back-ported to Mesa 18.0 too
        Will there be a Mesa 18.0?
        We have 18.0.0-rc5, but there are release notes for 18.1.0.
        18.0 should be out today... https://www.phoronix.com/scan.php?pa...esa-18.0-Today but it looks like it may have slipped, as Emil is on European time and has yet to release it.
        Michael Larabel
        https://www.michaellarabel.com/



        • #14
          Originally posted by GruenSein View Post
          Assuming that this is true, Vega 12 would be an even smaller part - possibly a replacement for the RX 560 series, which coincidentally uses the Polaris 12 chip.
          Close. RX 560 uses Polaris 11. RX 550 uses Polaris 12.

          https://videocardz.com/68586/amd-lau...-rx-500-series

          Now, if we consider that Polaris 11 is a 16-CU part and Polaris 12 is an 8-CU part, and regard the 24-CU Vega as "Vega 11", then perhaps "Vega 12" might actually feature half that, or 12 CUs?
          Last edited by coder; 23 March 2018, 05:06 PM.



          • #15
            Originally posted by cybertraveler View Post
            Is there any good info out there on what the "Vega 12" products will be? I've been searching and can't find anything.
            The product has not been announced yet, so there should not be any good info out there about what it will be.

            Sometimes stuff leaks out, but I haven't seen anything for Vega 12 yet.



            • #16
              I wonder if the 7nm Vega chips are a thing, and if 7nm can even help the architecture. We can only hope there will be a consumer product out to compete realistically with Volta or whatever NVIDIA launches sometime this year.

              The primary reason I didn't like Vega 64 was the crazy 300+ W TDP (according to tests), and there was no hope in hell of an ITX version existing with such a TDP. I have a Zotac 1080 Ti Mini card at the moment which is happy in my ITX system; very acceptable TDP/performance ratio!



              • #17
                Originally posted by theriddick View Post
                I wonder if the 7nm Vega chips are a thing, and if 7nm can even help the architecture. We can only hope there will be a consumer product out to compete realistically with Volta or whatever NVIDIA launches sometime this year.

                The primary reason I didn't like Vega 64 was the crazy 300+ W TDP (according to tests), and there was no hope in hell of an ITX version existing with such a TDP. I have a Zotac 1080 Ti Mini card at the moment which is happy in my ITX system; very acceptable TDP/performance ratio!
                TDP is not power consumption. It is ridiculous that across the whole internet we keep reading the same misconception, and even "tech sites" and YouTubers make this mistake.

                TDP is a specification for the cooling solution: how much heat it has to be able to dissipate. Most chips rarely, if ever, reach the TDP value in power consumption, especially AMD ones, since AMD in general tends to use higher TDP values for its products than, let's say, Intel and Nvidia.
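
                A minimal way to sanity-check this yourself on Linux: the amdgpu driver exposes the measured board power through hwmon, so you can compare actual draw against the TDP figure on the box. A small sketch, assuming a recent kernel that provides power1_average (the exact hwmon path varies per system):

                import glob

                # Scan hwmon nodes for the amdgpu sensor; power1_average is reported
                # in microwatts on kernels that expose it (exact path varies per system).
                for node in glob.glob("/sys/class/hwmon/hwmon*"):
                    try:
                        with open(node + "/name") as f:
                            if f.read().strip() != "amdgpu":
                                continue
                        with open(node + "/power1_average") as f:
                            watts = int(f.read()) / 1e6
                        print("%s: %.1f W measured board power" % (node, watts))
                    except OSError:
                        continue  # no power sensor on this node, or not readable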



                • #18
                  While true, either way the 1080 Ti made it to ITX form factor while the Vega 64 has not. And I have seen heat analysis of the Vega 64, and it does output a lot more heat.



                  • #19
                    Originally posted by SvenK View Post
                    I don't think so. The refreshed Polaris was, if I remember right, Polaris 20/21, so the Vega refresh would be Vega 20, which has been spotted in the past.
                    You take these names quite seriously. The numbers seem to be mostly derived from when development started on them. Even if Polaris 20 might indicate a version-2 improvement over Polaris 10, it doesn't mean anything for the Vega naming scheme. Since these are internal code names, it would even make a lot of sense to change the naming scheme every iteration to confuse outsiders.

                    It would be useful to have Vega in many gaming systems to make developers support its features better, but smaller Vegas would just land in mining farms anyway. That makes me quite unsure whether it makes much sense to replace the RX 500 GPUs, which already deliver high frame rates in already-released games - so people who just compare FPS and don't look at any additional features already consider them a good buy (at non-mining prices).

                    Originally posted by theriddick View Post
                    While true, either way the 1080 Ti made it to ITX form factor while the Vega 64 has not. And I have seen heat analysis of the Vega 64, and it does output a lot more heat.
                    Firstly, you should note that the Vega architecture is not only made for people who run games. It's not only in discrete GPUs but also in Raven Ridge APUs and in professional GPUs that are just used as co-processors. The raw compute power is about the same as that of a Titan Xp, and we have seen benchmarks showing that you can make use of this power, but games must also get some optimizations for the GPU.
                    Thirdly, Vega offers more features than the comparable Nvidia GPUs. For example, the ability to compute 16-bit floating-point numbers at twice the rate (Rapid Packed Math) is currently barely used by game developers, and the same goes for the primitive shaders, which people do not yet really know how to use correctly.
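
                    To put rough numbers on both points: peak shader throughput is usually estimated as shader count x 2 ops (FMA) x clock, and Rapid Packed Math doubles that rate for FP16. A back-of-the-envelope sketch, using the advertised shader counts and nominal boost clocks (sustained clocks on real cards differ):

                    def peak_tflops(shaders, clock_ghz):
                        """Peak throughput in TFLOPS: shaders * 2 ops (FMA) * clock."""
                        return shaders * 2 * clock_ghz / 1000.0

                    vega64_fp32 = peak_tflops(4096, 1.546)    # ~12.7 TFLOPS
                    titanxp_fp32 = peak_tflops(3840, 1.582)   # ~12.1 TFLOPS
                    vega64_fp16 = 2 * vega64_fp32             # Rapid Packed Math: 2x FP16 rate

                    print("Vega 64 FP32:", round(vega64_fp32, 1), "TFLOPS")
                    print("Titan Xp FP32:", round(titanxp_fp32, 1), "TFLOPS")
                    print("Vega 64 FP16 (packed):", round(vega64_fp16, 1), "TFLOPS")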

                    And lastly, AMD has a wonderful technique in its Windows driver called "Radeon Chill", which is very smart because you get a better gaming experience (e.g. less input latency) even though you also get fewer FPS and less power dissipation: when the movement on your screen is slow enough, the frame rate is reduced without any impact on the visual gaming experience. Which shows how misguided it is to weigh the FPS you get in games against the wattage or the price alone.
                    But because people tend to keep things simple, they rely on this crude measurement, and so nobody measures power dissipation with Radeon Chill enabled.
                    BTW: I wish this Radeon Chill feature were in the open-source drivers as well and enabled by default, so people could just set an environment variable to start a game in benchmark mode, or use VSync to keep a stable rate of 120 FPS, for example.
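
                    This is not what Radeon Chill actually implements internally, but the basic idea of a motion-aware frame limiter is simple enough to sketch: when little is moving on screen, sleep longer between frames, which saves power and keeps the render queue short. A hypothetical sketch (render_frame, measure_activity and running are made-up callbacks):

                    import time

                    FPS_MIN, FPS_MAX = 40, 120

                    def target_frame_time(activity):
                        """activity in [0, 1]: 0 = static scene, 1 = fast motion/input."""
                        return 1.0 / (FPS_MIN + activity * (FPS_MAX - FPS_MIN))

                    def frame_loop(render_frame, measure_activity, running):
                        while running():
                            start = time.monotonic()
                            render_frame()
                            budget = target_frame_time(measure_activity())
                            remaining = budget - (time.monotonic() - start)
                            if remaining > 0:
                                time.sleep(remaining)  # idle instead of drawing frames nobody notices
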
                    Full Review Up Now: https://www.hardocp.com/article/2017/08/14/amd_radeon_rx_vega_64_video_card_review


                    Yes, gaming on Windows does, for example, lead to higher measured FPS. But compared to how smoothly games run on Linux with our wonderful Mesa drivers at 25 to 30 FPS, on Windows they just run like crap even at 40 FPS. How much time there is between individual frames matters much more than how many you get in a given time span, like one second in this case.
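
                    A rough illustration of that point, with made-up frame-time traces purely to show the calculation: a trace with a higher average FPS can still feel worse if individual frames take much longer than the rest.

                    def summarize(frame_times_ms):
                        avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
                        return avg_fps, max(frame_times_ms)

                    # Hypothetical traces: evenly paced ~30 FPS vs ~40 FPS average with stutter spikes.
                    steady = [33, 33, 33, 33, 33, 33, 33, 33]
                    stuttery = [17, 17, 17, 80, 17, 17, 17, 17]

                    for name, trace in (("steady", steady), ("stuttery", stuttery)):
                        avg_fps, worst = summarize(trace)
                        print("%s: avg %.0f FPS, worst frame %d ms" % (name, avg_fps, worst))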

                    As a result, in my opinion the AMD GPUs are much better optimized for the usual stuff you like to do on a computer than the competition, and normally you get more for your money when miners don't buy up all the cards.

                    The main reason you often don't get Titan Xp performance with Vega is that many parts of it remain "unused" in gaming situations but still consume energy.
                    Edit: And the other point is that the GPU might not have an ideal clock/voltage curve with the standard BIOS. In my experience you can usually make very big GPUs like an R9 390X consume just half the energy with optimal settings if you give up about 10-20% of performance.
                    But the thing is that most people don't want this!
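
                    The rough physics behind that claim: dynamic power scales approximately with frequency times voltage squared, so a modest clock reduction combined with a lower stable voltage compounds quickly. The 15%/20% figures below are purely illustrative, not measured values for any particular card:

                    def relative_dynamic_power(freq_scale, volt_scale):
                        """Approximate dynamic power scaling: P ~ f * V^2 (leakage ignored)."""
                        return freq_scale * volt_scale ** 2

                    # Illustrative undervolt/underclock: -15% clocks, -20% voltage.
                    scale = relative_dynamic_power(0.85, 0.80)
                    print("about %.0f%% of stock dynamic power for roughly 15%% less raw performance" % (scale * 100))
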
                    Last edited by oooverclocker; 24 March 2018, 04:01 AM.



                    • #20
                      Mate, honestly I don't care.

                      I will care when AMD can:

                      Release a card that outperforms the 1080 Ti
                      Release a card that is not a single watt higher than the 1080 Ti in power consumption
                      Release a card that can fit in an ITX case.

                      Everything else is empty chest pounding as far as I'm concerned!

