NVIDIA GeForce GTX 680 To RTX 2080 Ti Graphics/Compute Performance
Originally posted by dungeon View Post
He is talking about the ROCm platform requirements:
https://rocm.github.io/ROCmInstall.h...rdware-support
Basically you can't use it without a specific CPU/GPU/mobo combo. Not even everything you can currently buy brand new on the market is supported, so *read carefully* right there.
When tuxd3v said it DOESN'T work, he meant specifically that ROCm doesn't work for him, because he didn't meet the particular hardware combo requirement; only specific, selected combos are supported.
When tuxd3v said "it's a lottery", he meant exactly that: maybe you will be able to use it, maybe not, just like Linux or anything else in its early days, since ROCm is also a relatively new project.
At the top of that page, don't miss "ROCm, a New Era in Open GPU Computing". These New Eras usually don't support past eras much, or in some cases even the current one.
Unlike CUDA, from a past era, which supports hardware from both past and current eras.
Originally posted by TemplarGR View Post
You ignorant amateurs should understand what ROCm is supposed to be before you spew your Nvidia propaganda. ROCm != mere OpenCL. ROCm provides deeper integration with the CPU and higher capabilities than simply using OpenCL on the dGPU; that is why it needs atomics. If you just want plain OpenCL, you can use the binary OpenCL library from the AMD driver; it works side by side with the rest of the free drivers just fine, and it provides more or less the same thing as CUDA.
People being how they are, they think "OK, so I remove this green card I have, put in a red card, and now things will work the same way", but it is not like that. No one would complain if it were that simple, just a matter of swapping cards, but it is not.
Last edited by dungeon; 21 September 2018, 04:46 AM.
Originally posted by TemplarGR View Post
You ignorant amateurs
Originally posted by TemplarGR View Post
Ok we get it, you are just an Nvidia shill...
Are you a troll or just a hater? Can't you just explain politely why people are wrong?
Originally posted by wizard69 View Post
In some ways this new card is impressive, but the power usage is just not acceptable. I know many don't care, but I'd buy a lower-power (as in watts) card simply to avoid the power bill.
You will never generate a big power bill from gaming, because we game for only a small fraction of the card's lifetime; nobody has time to play games all day. When you are watching movies, browsing the web and so on, the GPU is in lower power states.
The power bill becomes a concern when you do things like mining, which load the GPU 24/7.
Originally posted by humbug View Post
How is it unacceptable? It hasn't got any less efficient; the performance has gone up with the increased power draw.
You will never generate a big power bill from gaming, because we game for only a small fraction of the card's lifetime; nobody has time to play games all day. When you are watching movies, browsing the web and so on, the GPU is in lower power states.
The power bill becomes a concern when you do things like mining, which load the GPU 24/7.
My R9 Fury has the same TDP as the RTX 2080 Ti at 260 W. When I was mining Ethereum it would draw 190 W per card if I let it run at full speed. That's 0.190 × 24 × 30 × 0.11 ≈ 15.05 USD per month on the power bill. A gamer would only see a fraction of that, because gaming PCs don't game 24/7.
Let's assume you run a demanding game and average 220 W in power draw for the card alone. A 260 W TDP card doesn't average 260 W, because of various bottlenecks, V-sync and such things. Say you play 2 hours per day, every day for a month, and electricity costs 0.11 USD per kWh: 0.220 × 2 × 30 × 0.11 ≈ 1.45 USD per month. No big deal, and that is quite a lot of hours of gaming.
Last edited by Brisse; 21 September 2018, 06:53 AM.
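For anyone who wants to plug in their own wattage and electricity rate, here is a minimal sketch of the arithmetic above; the function name and sample figures are just illustrative, taken from the post rather than from any benchmark:

def monthly_cost_usd(draw_watts, hours_per_day, usd_per_kwh, days=30):
    # Energy used over the month, in kWh, times the price per kWh.
    kwh = draw_watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

print(monthly_cost_usd(190, 24, 0.11))  # mining 24/7: ~15.05 USD/month
print(monthly_cost_usd(220, 2, 0.11))   # gaming 2 h/day: ~1.45 USD/month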
The 2080 Ti's power consumption at load is lower here than what the Windows benchmarking people are seeing; there it's coming in at 405+ watts. That might be due to their open-bench setups or drivers, but this is the lowest power figure I've seen yet. It still sucks a good bit more juice than the Vega 64 in the Windows benchmarks.
The ignorant should take note of some facts: six years and only a 4x increase, while power draw went up (so this can't go on indefinitely). Let that sink in. Moore's Law would say that every 18 months you get double the performance with double the transistors. At least it used to.
If Moore's Law were still being followed, we'd see a 16x increase in performance over 6 years (four 18-month doublings), not 4x, and at the same power draw, not way higher (even if the card is more efficient per unit of performance).
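The doubling arithmetic, spelled out as a throwaway sketch (the function name is made up):

def moores_law_speedup(years, doubling_months=18):
    # Classic reading of Moore's Law: performance doubles every
    # `doubling_months` months.
    return 2 ** (years * 12 / doubling_months)

print(moores_law_speedup(6))  # 16.0, versus the ~4x actually delivered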
Yeah, definitely the evil crypto mining is the reason for this, and not that physical limits are being approached (and fast). /s