Take a chill pill guys and/or girls.
Minimum framerate does matter; it is important. That's just plain old engineering, statistics be damned. If the minimum is getting low, developers will want to know where and why, because there's no point working on a section of the driver that is already nice (stable) and speedy.
@artvision:
Generally, some of the more high-end developers might like lower-level, close-to-the-metal access, but they definitely don't want to have to write separate code paths for separate devices, so you'll still need some kind of API layer. If you're quoting the words of a certain AMD employee, note that he clarified his position later on.
We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux
Originally posted by Qaridarium
there is no "LOL" and yes it is! for sure!
None of that matters: if level loading makes the system stutter, then your hard drive/SSD/ramdisk is bad; if a background app makes the system stutter, the system is more than bad for gaming; and if there is an I/O bottleneck, the system is just bad for gaming, full stop. All of that says one thing: only min FPS matters.
If a system has the best of the best hardware, the low FPS is high; if a system is crap, the low FPS is low; and if the hardware is the same, then only the software is good or bad.
Which means Windows is bad.
And again, only min FPS matters; all the other stuff is bullshit!
No, the OpenGL implementation is better, because it scores higher on the only thing that matters: min FPS.
There is no more to talk about.
Originally posted by artivision View Post
I just said before, developers want direct access to hardware, not APIs like DX11!!!
The Truth
I said it before: developers want direct access to hardware, not APIs like DX11!!! And I mean even to the ASIC, or to specific fixed functions!!! OpenCL has the capability to utilise these functions, even in an FPGA!!! Then they could produce, for example, Unreal Engine 4 with all the FX made from custom designs and custom algorithms: not shaders and triangles but, say, rays and spheres!!!
There is also the fact that my system uses some pretty damn old separate drives for the Windows system, which may or may not impact results slightly, along with any other variable (who knows, for all we know Adobe or MS or something else may have been running an update checker in the background). The differences are far too minute to say Linux 64 is faster than Win 32 or Linux 32. Hell, even a rogue Flash ad in the browser that was open could account for the low minimum.
To neutralize the effect of min or max outliers, the test is run and sampled a multitude of times, and then the weighted mean (instead of the arithmetic mean) is taken as trustworthy.
Which means, simplified, that deanjo should run the tests several times, record the min value each time, then calculate the weighted mean of those mins, report the highest min as well, and call that the end result.
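The procedure above can be sketched as follows. This is only an illustration of the idea; the function name, the sample data, and the equal-weight default (which reduces to the arithmetic mean) are my assumptions, not anything deanjo actually runs:

```python
def weighted_mean_min_fps(runs, weights=None):
    """Run-to-run min-FPS summary, as described above.

    runs: list of benchmark runs, each a list of per-second FPS samples.
    weights: optional per-run weights; defaults to equal weighting,
             which makes the weighted mean equal the arithmetic mean.
    Returns (weighted mean of the per-run minimums, highest minimum).
    """
    mins = [min(run) for run in runs]
    if weights is None:
        weights = [1.0] * len(mins)
    total = sum(weights)
    mean_min = sum(m * w for m, w in zip(mins, weights)) / total
    return mean_min, max(mins)

# Three hypothetical runs of the same benchmark; the min varies per run.
runs = [
    [60, 100, 100, 100, 140],
    [55, 100, 100, 100, 140],
    [65, 100, 100, 100, 140],
]
mean_min, highest_min = weighted_mean_min_fps(runs)
# mean_min -> 60.0, highest_min -> 65
```

With real data one would weight runs by, say, inverse variance; equal weights are used here only to keep the sketch short.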
I don't think it will change, though. The API is on par with DX, unlike what elanthis said.
His other points, whether DX is far easier to program with and whether OpenGL offers no multithreaded rendering, are still not disproved.
Meanwhile, Microsoft earns hate++. Paying money (without a choice, mind you) to a company that uses it to sabotage cross-platform standards and improve its own (so you have even less choice) is just plain stupid. Way to go, Bill & Steve.
Originally posted by Qaridarium
the only true point is that the min-fps rate is the only point that matters. and this fact makes openGL/linux a winner.
Min/max mean nothing of value, as there is absolutely no indication of how long the dip lasts or what was happening on the rest of the system.
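This objection, that a raw minimum says nothing about duration, is exactly what percentile-based "low" metrics (e.g. "1% lows") try to address: averaging the worst few percent of samples distinguishes a one-off stutter from a sustained slowdown. A minimal sketch, where the function name and sample data are purely illustrative:

```python
def low_percentile_fps(fps_samples, pct=1.0):
    """Average of the worst pct% of FPS samples (a '1% low'-style metric).

    Unlike the raw minimum, this reflects how long a dip lasted:
    one bad sample barely moves it, a sustained dip dominates it.
    """
    worst = sorted(fps_samples)
    k = max(1, int(len(worst) * pct / 100))
    return sum(worst[:k]) / k

one_dip = [30] + [100] * 99        # a single one-second stutter
long_dip = [30] * 10 + [100] * 90  # ten sustained seconds at 30 fps

# Both traces have min == 30, but the 5% low tells them apart:
# low_percentile_fps(one_dip, 5)  -> 86.0
# low_percentile_fps(long_dip, 5) -> 30.0
```

So a bare min FPS number alone cannot settle the argument either way; both traces above report the same minimum.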
Originally posted by Qaridarium
on your example the average of 108 fps makes it look like the game runs well in the real world... but in the real world the game runs like shit!
so now we do a really good min-fps example; the min fps should be 60 fps:
60 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
140 fps
----------
1000 frames / 10 seconds = 100 fps average
the average fps is lower than in your example, but because of the min fps the game runs very, very well in all situations.
which means your average fps is a lie and only min fps matters.
60 fps
60 fps
60 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
140 fps
----------
920 frames / 10 seconds = 92 fps average. The min and max are the same as in your example. Which number reflects that your example gives better performance?
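The arithmetic in both tables checks out and can be reproduced directly. The helper below is purely illustrative and assumes, as both posters do, that each listed value is the FPS observed over one second, so the frame count is the sum of the samples:

```python
def summarize(fps_per_second):
    """Average, min, and max FPS from per-second FPS samples.

    Each entry is the FPS during one second, so total frames is the
    sum of the entries and the true average is frames / elapsed time.
    """
    frames = sum(fps_per_second)
    seconds = len(fps_per_second)
    return frames / seconds, min(fps_per_second), max(fps_per_second)

qarid = [60] + [100] * 8 + [140]        # 1000 frames over 10 s
counter = [60] * 3 + [100] * 6 + [140]  # 920 frames over 10 s

# summarize(qarid)   -> (100.0, 60, 140)
# summarize(counter) -> (92.0, 60, 140)
```

Both traces share the same min (60) and max (140), which is the replier's point: min and max alone cannot distinguish them, while the average can.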