Looking at the performance per Watt, it looks like NVIDIA is just pumping up power consumption to get more frames, essentially linearly.
With the exception of Mad Max, the FPS per Watt is roughly equivalent for all cards (within about 0.05).
I am not sure this is a great advancement, apart from the chip (NVIDIA) being able to handle all that power (300+ Watts!).
Overall, I am also not sure that FPS per Watt alone, as currently calculated, means much.
TBH it would be great to see performance per Watt at different capped framerates (@phoronix), e.g. 30, 60, 90, 120 FPS.
That should reveal how power consumption scales; much like a torque curve for a car, it would show the real power draw of those cards (AMD & NVIDIA) across the usage spectrum.
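To be clear about the calculation I have in mind, here is a minimal sketch (the framerates and power readings below are made-up placeholders, not real benchmark data): measure average FPS and average board power at each cap, then divide.

```python
# Hypothetical measurements at each framerate cap:
# cap (FPS) -> (average FPS achieved, average board power in Watts)
measurements = {
    30:  (30.0,   95.0),
    60:  (60.0,  140.0),
    90:  (89.5,  205.0),
    120: (117.0, 290.0),
}

for cap, (fps, watts) in measurements.items():
    efficiency = fps / watts  # FPS per Watt at this cap
    print(f"cap {cap:>3} FPS: {fps:6.1f} FPS @ {watts:5.1f} W -> {efficiency:.3f} FPS/W")
```

Plotting that efficiency against the cap for each card would give the "power curve" I mean, instead of a single uncapped number.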
If I had to pick a card now for 1080p-1440p, looking at FPS and FPS per Watt I would pick a Radeon 5700 XT (often better minimum FPS than the 3080), but I am not sure what the power consumption would look like with, e.g., vsync on (60 FPS).