We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux
-
Originally posted by Qaridarium
but if you get 8 min-fps on DirectX and 12 min-fps [on Linux] this truly matters
No it doesn't, and here is why. A minimum is exactly that, a minimum, and it does not reflect the overall play; that is what the average is for. If your minimum is 8 fps on one system and lasts for one second, it is no big deal; if your minimum is 12 fps but lasts 10 seconds, that is a huge deal, and the average FPS will reflect that flaw. This is why averages, not minimums or maximums, are used for meaningful benchmarks.
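A quick sketch of that arithmetic (the run lengths and frame rates here are purely hypothetical, just to illustrate the point):

Code:
```python
# Two hypothetical 20-second runs, one fps reading per second.
system_a = [8] + [60] * 19        # a single 1-second dip to 8 fps
system_b = [12] * 10 + [60] * 10  # ten straight seconds stuck at 12 fps

for name, run in (("system_a", system_a), ("system_b", system_b)):
    print(f"{name}: min={min(run)} fps, avg={sum(run) / len(run):.1f} fps")

# system_a: min=8 fps,  avg=57.4 fps
# system_b: min=12 fps, avg=36.0 fps
# The higher minimum belongs to the run that plays worse; the average
# reflects how long the slowdown actually lasted.
```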
-
Originally posted by yogi_berra
Doesn't it stand to reason that, as you are just learning an API, you might not know as much about it as people who have used it, and that while AMD may support that fancy new extension you are learning about, NVIDIA or Intel may not, and vice versa?
The real issue is of course the OpenGL bugs issue, which in recent years has improved a lot and is already reasonably good with, e.g., NVIDIA 260.19.44. The GL bugs issue exists mostly because from the late '90s until around 2005 there was no real contender to Windows' dominance, so NVIDIA/ATI didn't bother much about cross-platform support. Now that the Mac keeps rising (Mac OS X 10.7 will feature at least GL 3.2, for instance), and now that Linux is king on mobile and even starting to shine on the desktop, both NVIDIA and AMD are clearly paying more attention to GL. It will take a few years until we get close to ideal OpenGL drivers, but as I said, the NVIDIA drivers in particular are already good enough. The multi-threading stuff is not as important as some folks imply, since it gets serialized anyway; it's much more about design than about speed, and even the GL design is good enough (I actually find core GL 3.3 quite clean) to create sophisticated, cool stuff.
We should persist with GL like we did with Linux. Had Linus said, "I quit developing/maintaining Linux because there's the much more mature BSD", Linux wouldn't be the king of mobile, (partially) of servers, and of supercomputers. Linux won only because people persisted, and only then did IBM and the like kick in. The same goes for OpenGL: we need to work around (temporary) bugs and keep creating quality software to push the Linux market share forward, which in turn will _force_ AMD/Intel/NVIDIA to make their drivers even better. Expecting a quick, easy victory is a bad attitude. Microsoft fights Linux and others by all dirty means; we must accept reality and the temporary but clearly improving state of GL graphics, which IMO will improve even faster in the near future.
-
Originally posted by Qaridarium
you are fully wrong. If you have a card with massive microstuttering, meaning the fps counter jumps every 10 seconds from 1 fps up to 100000 fps and then back down to 1 fps, you can't play, but your average fps is maybe 60 fps.
this may not be called stuttering, but it is microstuttering.
and this microstuttering is no fun.
Windows with 8 min-fps has MORE microstuttering than Linux with 12 min-fps.
that means Linux is faster!
Take a hypothetical 10-second run, one fps reading per second:
1 fps
4 fps
6 fps
5 fps
5 fps
7 fps
3 fps
5 fps
6 fps
10 fps
---------
52 frames over 10 seconds; minimum 1, maximum 10
Average frames per second = 52 / 10 = 5.2 fps
Now the next 10-second sample:
2 fps
2 fps
2 fps
2 fps
14 fps
5 fps
4 fps
14 fps
5 fps
2 fps
------------
52 frames over 10 seconds; minimum 2, maximum 14
Average frames per second = 52 / 10 = 5.2 fps
Now, this is a very small sample (using an extended benchmark such as Unigine Heaven will lessen the min/max aberrations, since their impact shrinks as the number of samples grows), and by your reasoning the better run is the second one, because its minimum is higher, even though its slowdown lasts much longer. Reporting a single sample of the min/max means absolutely nothing when benchmarking.
It is basic statistical analysis 101.
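A minimal sketch that reproduces the arithmetic above (the two sequences are just the made-up 10-second samples from this post):

Code:
```python
# Per-second frame counts from the two made-up 10-second samples above.
sample_1 = [1, 4, 6, 5, 5, 7, 3, 5, 6, 10]
sample_2 = [2, 2, 2, 2, 14, 5, 4, 14, 5, 2]

def summarize(per_second_fps):
    """Return (min, max, average) fps for a list of per-second frame counts."""
    total_frames = sum(per_second_fps)
    seconds = len(per_second_fps)
    return min(per_second_fps), max(per_second_fps), total_frames / seconds

for name, sample in (("sample 1", sample_1), ("sample 2", sample_2)):
    lo, hi, avg = summarize(sample)
    print(f"{name}: min={lo} max={hi} avg={avg:.1f} fps")

# Both samples total 52 frames over 10 seconds and therefore both average
# 5.2 fps, even though their minimums and maximums differ.
```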
-
Originally posted by deanjo
In fact, here are some updated results using Heaven 2.5. Total deviation between the 4 tests (Win 7 DX11, Win 7 OpenGL, Linux 32-bit, and Linux 64-bit) is 0.5%, which falls well within a negligible difference, since no two results will ever be identical because of things like background apps/services, spread-spectrum deviation, clock drift, etc.
My hardware spec differs: an Intel quad-core CPU (C2Q [email protected]).
-
Originally posted by Qaridarium
hey, nice try, but you do it wrong, because in your benchmark the max fps is not 10, it's 120, and other benchmarks in PTS, like the Q3 benchmarks, hit 350+ fps.
a REAL example of what I mean:
1 fps
1 fps
1 fps
1 fps
1 fps
1 fps
1 fps
1 fps
1 fps
1000 fps
------------
1009 frames over 10 seconds; minimum 1, maximum 1000
Average frames per second = 1009 / 10 = 100.9 fps
which means in the real world your average frames per second count is bullshit!
1 fps
10 fps
10 fps
10 fps
10 fps
10 fps
10 fps
10 fps
10 fps
1000 fps
----------
1081 frames / 10 seconds; average 108.1 fps, which is still performing better no matter how you cut it, and the average shows that it is a better experience. (Not saying that the average is an accurate measure of performance, but it at least reflects a change; min/max doesn't.)
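Putting the two 1000-fps examples side by side makes the point; a small sketch using the two hypothetical lists above:

Code:
```python
# The two hypothetical per-second sequences from the exchange above.
outlier_run  = [1] * 9 + [1000]          # 1009 frames / 10 s
smoother_run = [1] + [10] * 8 + [1000]   # 1081 frames / 10 s

for name, run in (("outlier_run", outlier_run), ("smoother_run", smoother_run)):
    avg = sum(run) / len(run)
    print(f"{name}: min={min(run)} max={max(run)} avg={avg:.1f} fps")

# Both runs share the same minimum (1 fps) and maximum (1000 fps); only the
# average (100.9 vs 108.1 fps) moves when the eight middle seconds improve
# from 1 fps to 10 fps.
```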
-
Originally posted by blacknova
Wow, it seems NVIDIA had some serious performance issues fixed in their OpenGL driver in the newer Windows driver. On my rig, in OpenGL mode I get less than half of the DirectX 11 performance.
My hardware spec differs: an Intel quad-core CPU (C2Q [email protected]).
-
Originally posted by Qaridarium
in your example the average of 108 fps makes it look like the game runs well in the real world... but in the real world the game runs like shit!
so now let's do a really good min-fps example; the min-fps should be 60 fps.
60 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
140 fps
----------
1000 frames / 10 seconds; average 100 fps
the average fps is lower than in your example, but because of the min-fps the game runs very, very well in all situations.
which means your average fps is a lie and only min-fps matters.
60 fps
60 fps
60 fps
100 fps
100 fps
100 fps
100 fps
100 fps
100 fps
140 fps
----------
920 frames / 10 seconds = 92 fps average. The min and max are the same as in your example, so what number reflects that your example is giving better performance?
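The same comparison in code form (the two runs are just the hypothetical lists from this post):

Code:
```python
# Hypothetical per-second fps from the last two examples above.
run_first  = [60] + [100] * 8 + [140]       # 1000 frames / 10 s
run_second = [60] * 3 + [100] * 6 + [140]   # 920 frames / 10 s

for name, run in (("first run", run_first), ("second run", run_second)):
    print(f"{name}: min={min(run)} max={max(run)} "
          f"avg={sum(run) / len(run):.0f} fps")

# Both runs have the same minimum (60 fps) and maximum (140 fps); only the
# average (100 vs 92 fps) shows that the first run spends more time at the
# higher frame rate.
```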
-
Originally posted by Qaridarium
the only true point is that the min-fps rate is the only point that matters.
Min / max mean nothing of value, as there is absolutely no indication of how long the slowdown lasts or what was happening on the rest of the system.
and this fact makes OpenGL/Linux a winner.
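If you did want a single number for "how long the slowdown lasts", it would have to be something beyond a bare minimum. A rough sketch of one such measure (the 30 fps threshold and the two sample runs are purely illustrative assumptions, not from any real benchmark):

Code:
```python
def longest_slow_stretch(per_second_fps, threshold=30):
    """Length in seconds of the longest consecutive run below `threshold` fps."""
    longest = current = 0
    for fps in per_second_fps:
        current = current + 1 if fps < threshold else 0
        longest = max(longest, current)
    return longest

# Two runs with the same minimum (1 fps) but very different slowdown durations.
print(longest_slow_stretch([1] + [60] * 9))      # -> 1 (a one-second hiccup)
print(longest_slow_stretch([1] * 5 + [60] * 5))  # -> 5 (five seconds of crawling)
```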