Radeon Vega 12 Support Called For Pulling Into Linux 4.17 Kernel
-
Originally posted by Nille_kungen: Mesa 18.1-devel Git, but from that perspective it's a trivial addition and could easily be back-ported to Mesa 18.0 too
Will there be a Mesa 18.0?
We have 18.0.0-rc5 but there's release notes for 18.1.0.
Michael Larabel
https://www.michaellarabel.com/
-
Originally posted by GruenSein: Assuming that this is true, Vega 12 would be an even smaller part - possibly a replacement for the RX 560 series, which coincidentally uses the Polaris 12 chip.
https://videocardz.com/68586/amd-lau...-rx-500-series
Now, if we consider that Polaris 11 is a 16-CU part and Polaris 12 an 8-CU part, and regard the 24-CU Vega as "Vega 11", then perhaps "Vega 12" might actually feature half that, or 12 CUs?
Last edited by coder; 23 March 2018, 05:06 PM.
-
Originally posted by cybertraveler: Is there any good info out there on what the "Vega 12" products will be? I've been searching and can't find anything.
Sometimes stuff leaks out, but I haven't seen any for Vega 12 yet.
-
I wonder if the 7nm Vega chips are a thing, and whether 7nm can even help the architecture. We can only hope there will be a consumer product out that competes realistically with Volta or whatever NVIDIA launches sometime this year.
The primary reason I didn't like the Vega 64 was the crazy 300+ W TDP (according to tests), and there was no hope in hell of an ITX version existing with such a TDP. I have a Zotac 1080 Ti Mini card at the moment, which is happy in my ITX system - a very acceptable TDP/performance ratio!
-
Originally posted by theriddick: I wonder if the 7nm Vega chips are a thing, and whether 7nm can even help the architecture. We can only hope there will be a consumer product out that competes realistically with Volta or whatever NVIDIA launches sometime this year.
The primary reason I didn't like the Vega 64 was the crazy 300+ W TDP (according to tests), and there was no hope in hell of an ITX version existing with such a TDP. I have a Zotac 1080 Ti Mini card at the moment, which is happy in my ITX system - a very acceptable TDP/performance ratio!
TDP is a specification for the cooling solution: how much energy the cooler has to be able to dissipate for the chip to run within spec. Most chips rarely, if ever, reach the TDP value in actual power consumption - especially AMD ones, since AMD in general tends to quote higher TDP values for its products than, say, Intel and Nvidia.
-
Originally posted by SvenK: I don't think so. The new Polaris was, if I remember right, Polaris 20/21, so the refresh of Vega would be Vega 20, which was spotted in the past.
It would be useful to have Vega in many gaming systems so that developers support its features better, but smaller Vegas would just land in mining farms anyway. That makes me quite unsure whether it makes much sense to replace the RX 500 GPUs, which already deliver high frame rates in released games - so people who just compare FPS and don't look at any additional features already consider them a good buy (at non-mining prices).
Originally posted by theriddick: While true, either way the 1080 Ti made it to the ITX form factor while the Vega 64 has not. And I have seen heat analysis of the V64, and it does output a lot more heat.
Third, Vega has more features than the analogous Nvidia GPUs. For example, the ability to process 16-bit floating-point numbers at twice the rate (Rapid Packed Math) is currently almost unused by game developers, as are the primitive shaders, which people do not yet really know how to use correctly.
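To make the "packed" part of Rapid Packed Math concrete, here is a minimal sketch of the underlying idea in Python: two FP16 values fit into one 32-bit word, which is what lets the hardware operate on both lanes in a single instruction. This only illustrates the packing; it is not the actual GPU instruction.

```python
import struct

def pack_half2(a: float, b: float) -> bytes:
    """Pack two FP16 values into one 32-bit word, the layout that
    packed-math hardware operates on two lanes at a time."""
    return struct.pack("<ee", a, b)  # 'e' = IEEE 754 half precision

def unpack_half2(word: bytes) -> tuple:
    return struct.unpack("<ee", word)

word = pack_half2(1.5, -2.25)       # both values exactly representable in FP16
assert len(word) == 4               # two halves occupy one 32-bit slot
assert unpack_half2(word) == (1.5, -2.25)
```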
And lastly, AMD has a wonderful technique in their Windows driver called "Radeon Chill". It is very smart: you get a better gaming experience (e.g. less input latency) even though you also get fewer FPS and less power dissipation, because the frame rate is only reduced when the movement on your screen is slow enough that it has no impact on the visual experience. This shows how misguided it is to compare the FPS you get in games against wattage or price.
But because people tend to keep things simple, they rely on this crude measurement, and so no one measures power dissipation with Radeon Chill enabled.
BTW: I wish this Radeon Chill feature were in the open source drivers as well, and enabled by default, so people could just set an environment variable to start a game in benchmark mode, or use VSync to keep a stable rate of, for example, 120 FPS.
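The frame-cap part of that wish can be sketched in a few lines: a render loop that reads a target rate from an environment variable and sleeps away any leftover frame budget. The variable name FRAME_CAP is made up here for illustration; it is not an actual Mesa or Radeon driver knob.

```python
import os
import time

# Hypothetical knob: cap the loop at a rate taken from the environment.
target_fps = float(os.environ.get("FRAME_CAP", "120"))
frame_budget = 1.0 / target_fps          # ~8.33 ms per frame at 120 FPS

def run_frames(n: int) -> float:
    """Run n capped frames, return total elapsed seconds."""
    start = time.perf_counter()
    for _ in range(n):
        frame_start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # wait out the remaining budget
    return time.perf_counter() - start

total = run_frames(12)
assert total >= 12 * frame_budget * 0.9  # loop is paced, not free-running
```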
Full review up now: https://www.hardocp.com/article/2017/08/14/amd_radeon_rx_vega_64_video_card_review
Yes, gaming on Windows does, for example, lead to higher measured FPS. But compared to how smoothly games run on Linux with our wonderful Mesa drivers at 25 to 30 FPS, on Windows they just run like crap even at 40 FPS. How much time passes between each frame matters much more than how many frames you get in a given time span, like one second in this case.
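The frame-time point above can be shown with a toy calculation (the FPS samples are made-up numbers, not measurements): a run with a higher average FPS can still contain a far worse worst frame, which is exactly the stutter an average hides.

```python
# Average FPS hides stutter; per-frame times (ms) expose it.
smooth  = [25.0] * 8                    # steady 25 FPS: every frame takes 40 ms
jittery = [40.0, 40.0, 40.0, 10.0] * 2  # higher average FPS, but it hitches

def frame_times_ms(fps_samples):
    """Convert instantaneous FPS samples into per-frame times in ms."""
    return [1000.0 / f for f in fps_samples]

print(max(frame_times_ms(smooth)))   # worst frame: 40.0 ms
print(max(frame_times_ms(jittery)))  # worst frame: 100.0 ms -> visible hitch
print(sum(jittery) / len(jittery))   # 32.5 "average FPS", yet it feels worse
```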
As a result, in my opinion the AMD GPUs are much better optimized for the usual things you like to do on a computer than the competition's, and normally you get more for your money - when miners don't buy out all the cards.
The main reason you often don't get Titan Xp performance from Vega is that many parts of it remain "unused" in gaming situations, yet they still consume energy.
Edit: The other point is that the GPU might not ship with an ideal clock/voltage ratio in the standard BIOS. In my experience you can usually make very big GPUs like an R9 390X consume just half the energy with optimal settings, if you give up about 10-20% of the performance.
But the thing is that most people don't want this!
Last edited by oooverclocker; 24 March 2018, 04:01 AM.
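The "half the energy for 10-20% performance" claim above lines up with the usual rule-of-thumb model for dynamic power, P ∝ C·V²·f: power falls with the square of voltage but only linearly with clock. The scaling factors below are illustrative guesses, not measured R9 390X figures.

```python
# Rule-of-thumb dynamic power model: P ~ C * V^2 * f.
def relative_power(v_scale: float, f_scale: float) -> float:
    """Power relative to stock, given voltage and clock scale factors."""
    return (v_scale ** 2) * f_scale

# Example: drop voltage 22% and clock 15% (hypothetical undervolt).
tuned = relative_power(v_scale=0.78, f_scale=0.85)
print(f"power: {tuned:.2f}x of stock, performance: roughly 0.85x")  # ~0.52x power
```

Because of the squared voltage term, a modest undervolt plus a small clock drop roughly halves power while performance falls only with the clock, which is the trade-off described above.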