It would be really interesting to have TensorFlow benchmark results for Vega as well.
NVIDIA GeForce RTX 2060 Linux Performance From Gaming To TensorFlow & Compute
-
Originally posted by pracedru: Wow... The 2060 is about 75% faster than the 1060.
That is progress.
And it matches the 1080 in some games.
I'm going to buy one of those, I think.
The average price for a 1060 6GB is currently around $270, so
360 / 270 ≈ 1.33, i.e. 33% more expensive.
Normalize for that higher price and the 2060 delivers roughly 1.75 / 1.33 ≈ 1.31, i.e. about 30% more performance per dollar.
Still a great improvement, but price/performance is now only head to head with the 580/590, which in my experience have more reliable Linux drivers.
If you compare products with the same performance, though, the 2060 is arguably slightly better than a Vega 64 thanks to better efficiency and value; but the proprietary driver and Nvidia's anti-competitive behavior are still flaws, so it's now a tough choice instead of a clear win for AMD.
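As a sanity check on the figures in this thread (assuming the ~75% speedup over the 1060 and the $270 / $360 street prices quoted above), the price/performance arithmetic can be sketched as:

```python
# Rough price/performance comparison from the figures quoted in the thread.
# Assumptions: RTX 2060 at $360 is ~1.75x the speed of a $270 GTX 1060 6GB.
gtx1060_price = 270.0
rtx2060_price = 360.0
speedup = 1.75  # 2060 performance relative to the 1060

price_ratio = rtx2060_price / gtx1060_price  # ~1.33x the price
perf_per_dollar = speedup / price_ratio      # ~1.31x performance per dollar

print(f"price ratio: {price_ratio:.2f}x")
print(f"performance per dollar: {perf_per_dollar:.2f}x")
```

So at these assumed prices the 2060 buys roughly 31% more performance per dollar, not a 1:1 match of the raw 75% speedup.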
-
Originally posted by Kemosabe: Thank you for supporting the company with questionable policies and limited proprietary drivers, which are a nightmare for developers who don't just develop games.
I am actively developing a CAD program and I have absolutely no problems developing on a system with an Nvidia gfx card. Developing on Linux and targeting Nvidia has consistently been a pleasure; I cannot say the same about Radeon/AMD. Nvidia has in fact delivered a professional and consistent platform for both Windows and Linux for a very long time. Their driver is not open source, though, and that might be a reason for me to switch to AMD.
But if Nvidia's midrange/low-end gfx card beats the best of the AMD line... well, hmm. I'll have to think about it.
-
Originally posted by Dedale: Interesting. I was a bit afraid by the growing TDP but the consumption seems to be contained. Should I understand that the tests measured the wattage of the whole system and not specifically the card?
Right, as said in the article it's the overall AC system power consumption.
Michael Larabel
https://www.michaellarabel.com/
-
Originally posted by pracedru: I was actually promising myself that I would buy an AMD gfx card next time, because of the driver issue. I was just amazed by the progress, and maybe I got a little carried away.
I am actively developing a CAD program and I have absolutely no problems developing on a system with an Nvidia gfx card. Developing on Linux and targeting Nvidia has consistently been a pleasure; I cannot say the same about Radeon/AMD. Nvidia has in fact delivered a professional and consistent platform for both Windows and Linux for a very long time. Their driver is not open source, though, and that might be a reason for me to switch to AMD.
But if Nvidia's midrange/low-end gfx card beats the best of the AMD line... well, hmm. I'll have to think about it.
I can imagine that you have as few issues with a CAD program as with a game, though.
I'm referring more to the scientific field, which includes the development of production-ready programs.
Nvidia's marketing strategy is highly aggressive, going so far that it is not even possible to purchase non-Nvidia (i.e. non-Quadro) products for the labs under a university supplier agreement.
It is also indeed a nightmare to develop in this setting: not only proprietary drivers in a heterogeneous environment of LTS Linux distributions with chronically slightly outdated software stacks, but also CUDA, which of course only runs on Nvidia hardware. Yet CUDA has such market dominance in the field that there is no way around it.
And before you keep telling me that Nvidia performs better: that might only be relevant in a one-man freelance company, but it is simply unrealistic to assume that the latest generation of dramatically overpriced hardware is available.
De facto, non-Nvidia was the more efficient and cheaper solution.
And for my private use NVIDIA is already a no-no due to EGLStreams.
-
Originally posted by pracedru: But if Nvidia's midrange/low-end gfx card beats the best of the AMD line... well, hmm. I'll have to think about it.
A $360 card is not mid-range; it's a high-end card. The whole reason Nvidia is pushing ray tracing so hard is that they want to make games more demanding so people pay more for their products. By pushing RTX they are trying to sell chips with a die size similar to some of their enterprise chips to consumers for incredible amounts of money. If you look at the Radeon Instinct MI60 you can see that AMD is clearly able to compete with Nvidia's Tesla V100 in terms of TFLOPS; they're just not as aggressive with their consumer marketing.
-
Price, performance and power consumption look like a card whose model number should end in 70. This isn't the super-cheap but competitive mainstream card for the masses, like the 1060 was before it, and the 960 and 760 before that. This should be called a 2060 Ti at the very least. Or a 2070, with the current 2070 becoming the 2070 Ti.
-
Originally posted by Michael: Right, as said in the article it's the overall AC system power consumption.
A well-regarded French website used to run tests that included the wattage of the cards themselves and sometimes infrared images of their thermal radiation taken with an IR camera. I guess IR cameras aren't cheap. They stopped when their main tester left for greener pastures.
If you are interested here is an example: https://www.hardware.fr/articles/957...photos-ir.html