RadeonSI/RADV Mesa 17.3 + AMDGPU DC vs. NVIDIA 387.12 Linux Gaming Performance
-
Originally posted by theriddick
VEGA64 only 10% faster than an R9 Fury card, truly my brain struggles with this..... (in actual game performance)
This sort of issue really needs to be investigated by AMD; the Vega64 should be at minimum 30% faster than a standard R9 Fury, surely!
Right now it makes no sense to buy a new Vega when you can get a used Fury for 250€ less with little to no performance loss.
-
Originally posted by starshipeleven
4K makes sense on 43'' screens, as there it has a similar PPI (pixels per inch) to a normal full HD PC monitor, so you actually get more screen space.
If you buy a smaller screen you need to upscale your applications or they are too small to read, which defeats the point of a 4K screen.
I use a 28" screen and the quality compared to 1440p on a 28" is absolutely staggering.
We have full HD on these 4" phones for a reason.
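As a quick sanity check of the PPI comparison above, here is a minimal sketch (the panel sizes are just the ones mentioned in the thread; PPI is the diagonal pixel count divided by the diagonal in inches):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 43" 4K panel lands close to a classic 24" full HD monitor...
print(round(ppi(3840, 2160, 43)))  # ~102 PPI
print(round(ppi(1920, 1080, 24)))  # ~92 PPI
# ...while a 28" 4K panel is far denser, hence the need for UI scaling.
print(round(ppi(3840, 2160, 28)))  # ~157 PPI
```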
-
Originally posted by pal666
You've already been told that it is a useless metric because it doesn't say how often it happens. If it happened once, you can ignore it.
- Likes 2
-
VEGA64 only 10% faster than an R9 Fury card, truly my brain struggles with this..... (in actual game performance)
This sort of issue really needs to be investigated by AMD; the Vega64 should be at minimum 30% faster than a standard R9 Fury, surely!
Last edited by theriddick; 26 October 2017, 03:55 AM.
-
Originally posted by Leopard
Min fps is a major metric.
Have you ever encountered one of those dips in games like CS:GO, Rocket League, or Dota 2?
It is not that important in single-player games, but it is critical in multiplayer.
Also, go do your AMD fanboying somewhere else. I love AMD too, but I'm smart enough not to call bad results good ones.
Minimum fps doesn't tell you much, since it captures just one single moment. A GPU could perform smoothly overall except for a single hitch and still receive a harsh min-fps rating, while another GPU could post a better min fps but dip to it far more often, making the whole experience worse due to lack of smoothness.
What you really want is the 95th or 99th percentile, i.e. the fps that 95% or 99% of frames exceed. That removes outliers from the data and actually tells you something about smoothness.
Also, take a look at the first Mad Max result, where the Fury has a min fps of 88.25 and the 1070 has 1.67. Are you telling me the Fury is ~50x smoother than the 1070 there? I think that proves how useless a metric min fps is.
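The percentile argument can be sketched in a few lines; the per-frame fps samples below are made up purely to illustrate the outlier effect:

```python
def percentile_floor(fps_samples, keep=0.99):
    """Fps that the best `keep` fraction of frames stay at or above,
    i.e. discard the worst 1% of frames as outliers."""
    ordered = sorted(fps_samples)
    return ordered[int(len(ordered) * (1.0 - keep))]

card_a = [60] * 99 + [5]   # smooth, with one single hitch
card_b = [40] * 100        # no hitches, but slower overall

print(min(card_a), percentile_floor(card_a))  # 5 60
print(min(card_b), percentile_floor(card_b))  # 40 40
```

By min fps alone card_b looks far better, yet card_a delivers 60 fps for 99% of its frames; that is the same distortion seen in the Mad Max min-fps numbers.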
- Likes 1
-
bridgman nuetzel The problem is vertex buffers are in RAM, not VRAM. I can get 4x better FPS if I put them in VRAM. This is a driver decision, not app decision - it's fixable universally. That should get us to 18 FPS for Vega. The next bottleneck is the GPU vertex throughput. Not sure what we can do with that - maybe primitive shaders in Vega, which would be an insane amount of work.
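For context on why buffer placement is a driver decision here: under OpenGL the application cannot choose where a vertex buffer lives, whereas a Vulkan application explicitly picks a memory type and can request device-local (VRAM) storage. The sketch below is a simplified model of that selection, not the real Vulkan API; the flag names and the two-type device are stand-ins:

```python
# Simplified stand-in for Vulkan-style memory type selection.
DEVICE_LOCAL = "DEVICE_LOCAL"    # VRAM
HOST_VISIBLE = "HOST_VISIBLE"    # CPU-mappable system RAM

def pick_memory_type(memory_types, wanted):
    """Return the index of the first memory type covering all wanted
    flags, or None if the device offers no such type."""
    for index, flags in enumerate(memory_types):
        if wanted <= flags:
            return index
    return None

# Hypothetical device: type 0 is system RAM, type 1 is VRAM.
device_types = [{HOST_VISIBLE}, {DEVICE_LOCAL}]
print(pick_memory_type(device_types, {DEVICE_LOCAL}))   # 1 -> vertex buffers in VRAM
print(pick_memory_type(device_types, {HOST_VISIBLE}))   # 0 -> staging/upload buffers
```

OpenGL exposes no such knob, so the driver has to infer placement itself, which is why this is fixable universally in the driver rather than per application.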
- Likes 7
-
Not sure if this bit of info is worth anything as inspiration or for code-path comparison, but... I was originally pointed to ParaView for workstation graphics testing by the Intel guys working on the SWR rasterizer, as one of the workstation graphics tests where SWR vs. LLVMpipe can show a big difference in performance between their code paths.