Open-Source ATI R600/700 Mesa 3D Performance
-
Originally posted by perpetualrabbit: Also, is there a way to fix the framerate in games to a certain amount (say 30 Hz) and do benchmarking based on (lesser is better) CPU and GPU load instead?
I'm really puzzled by this...
However, if your game is just barely able to hit 60 FPS (I'm going to assume that as the monitor's refresh rate for the rest of this post), then the moment any frame requires a little too much calculation, it misses the vblank. That frame then takes twice as long before it is displayed, because it has to wait for the next vblank. That's the part most people miss. 60 FPS means 16.666 ms per frame. If a scene is complex, it won't just be one frame that takes a little longer, but every frame. If frames are taking even 16.668 ms to render, you are missing the vblank on every single frame, and so you're rendering at half the refresh rate (i.e., 30 FPS). Dropping from 60 FPS to 30 is freakin' huge.
30 FPS is acceptable for many games. Quite a few games are actually designed for 30 and not 60 FPS. However, very action-heavy games aim for 60, because a large number of gamers can "feel" the difference in frame rates up to around 40 (some are even more sensitive, some less, but 40 is around the average). It's not just about vision, but also the jerkiness of motion and controls. There's been a lot of discussion on that already, and I'm not going to get involved, other than to say that 30 FPS simply isn't sufficient for many people in a game with a lot of motion, and when sync-to-vblank is enabled your options are 30 or 60, nothing in between.
Even if you aim for 30, if your game is just barely doing 30 and the scene gets complex or the PC slows down, you miss that 30 mark and go from 1/2 refresh to 1/3 refresh, or 20 FPS... which is very noticeably low, and starts to feel jittery even to casual gamers. If the game can't handle 20, you end up at 15 FPS (1/4 refresh), which is at the edge of the unplayable range.
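Here's a minimal sketch of that quantization (Python; the 60 Hz refresh rate and the frame times are just example numbers, not measurements): with sync-to-vblank on, a finished frame is held until the next vblank, so its display time is the render time rounded up to a whole number of refresh intervals. That's why the only frame rates you can get at 60 Hz are 60, 30, 20, 15, and so on.

    import math

    def effective_fps(render_ms, refresh_hz=60.0):
        # Display time = render time rounded UP to whole refresh intervals,
        # because the frame has to wait for the next vblank.
        interval_ms = 1000.0 / refresh_hz          # 16.666... ms at 60 Hz
        intervals = math.ceil(render_ms / interval_ms)
        return refresh_hz / intervals

    print(effective_fps(16.600))  # 60.0 -- just made the vblank
    print(effective_fps(16.668))  # 30.0 -- just missed it, on every frame
    print(effective_fps(34.000))  # 20.0 -- 1/3 of the refresh rate
    print(effective_fps(51.000))  # 15.0 -- edge of the unplayable range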
So your game wants to hit 60 FPS most of the time, maybe 30 for low-action games. However, you also want your game to run on old hardware. You don't design your game to run on the $600 graphics cards, because most of your customer base is using two-year-old $150 graphics cards (if that). So you need your game to hit a consistent 60 FPS even in very complex scenes on very old hardware, which means you want it to run at very, very, very high frame rates on modern hardware, even though when the game is actually being played it will be locked at 60 FPS.
Game engine developers want those huge framerates for the additional reason of ensuring that their engine can handle unexpected load. If you've got a level and you only optimize up to the point of handling 60 FPS, you give yourself no room for when a new and more complex level is added, or when an unexpected number of enemies or bullets or other effects ends up on screen. You leave no room for when physics or AI suddenly starts taking 40% more CPU than your test levels required. Your engine is then limited to simpler scenes and game logic, because it can't sustain 60 FPS otherwise.
Benchmarks are run with sync-to-vblank off because otherwise all a benchmark would tell you is "this card can run this specific game at these settings without problems," which isn't very useful when you're trying to get an idea of which card is the better investment or the better technology. Sure, maybe both card A and card B run BioShock 2 at a sufficient 60 FPS at max settings, but if card A is capable of running it at 120 and card B can do it at 180, and both are $150 cards, you know card B is the better buy: it's more likely to handle next year's games at 60 FPS, while card A will probably need to be replaced to handle the newer games.
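To put rough numbers on the headroom argument (a sketch in the same vein; the cards and FPS figures are the hypothetical ones from above, not real benchmark results): uncapped FPS is really a frame-time measurement, and its ratio to the 60 FPS budget tells you how much extra load, whether a heavier scene or next year's game, the hardware can absorb before it starts missing vblanks.

    def headroom(uncapped_fps, target_fps=60.0):
        # How many times the per-frame workload could grow before the
        # target frame rate is missed (uncapped FPS = measured vsync off).
        return uncapped_fps / target_fps

    for card, fps in [("card A", 120.0), ("card B", 180.0)]:
        frame_ms = 1000.0 / fps
        print(f"{card}: {frame_ms:.1f} ms/frame, {headroom(fps):.1f}x headroom at 60 FPS")

    # card A: 8.3 ms/frame, 2.0x headroom at 60 FPS
    # card B: 5.6 ms/frame, 3.0x headroom at 60 FPS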
It's also a bit of a marketing stunt. The high-end video cards serve the same purpose as the high-end CPUs. Nobody with a brain actually buys the $1,000 Intel Extreme Edition CPUs; they're released as marketing gimmicks. Some idiot goes, "omg, Intel makes the fastest CPU!" and then goes to the store to buy a new computer. He can't afford one with a $1,000 CPU, but he can afford one with a $200 CPU, and he automatically buys the Intel one without even checking whether the AMD part at the $200 price point is better or not. With video cards, someone sees the behemoth $600 NVIDIA card that trounces the top-end AMD part and then assumes the $100 NVIDIA card must be better than the $100 ATI card, because he's too stupid to compare the parts he's actually considering buying.
Point is, no game you're actually playing should be running at higher than 60 FPS (or whatever your monitor's refresh rate is) unless the game developer is an idiot, the driver's sync-to-vblank is non-functional (Linux, I mean you), or the gamer has turned sync-to-vblank off.
As to why some gamers turn off sync-to-vblank, the answer is the same as why they buy cases with blue LEDs and cut-out sides and aesthetically designed video cards, and the same reason businessmen buy Ferraris: when you're pathetic, you find whatever you can to make yourself look better than other people by comparison. The business guy makes up for his 4" wang with a sports car, and the gamer makes up for it by having 100+ FPS in CryEngine 3-based games. Penis envy is a costly condition.
-
Hey, since I still want to prove a little something: I just tested H.264 playback on my Radeon HD 2600 Pro (which I have overclocked).
-
Originally posted by yotambien: Nothing that a bit of MOTION BLUR can't solve!
OK, it's a bit excessive, but the results are quite impressive given that the original footage was stop-motion animation.
-
Originally posted by yotambien: Bah, now you spoiled it. At least it would be different to the KDE vs Gnome ones we usually have.
At least you just gave me the perfect excuse to post this video about the LHC and the DESTRUCTION of EARTH :O
Originally posted by deanjo: Heh, watch any of the early sci-fi films in HD, such as the original Star Trek movies and Star Wars.
Seriously, I figured out how they did the scenes in space without ever watching any making-of :') You can just see the transparent overlays.
Just look at the Death Star, for crying out loud.
-
Originally posted by V!NCENT: Watch the Matrix on a DVD and then again on Blu-Ray. Guess which one is totally unrealistic?
With a lot of films that have CGI special effects, you can just see how deep the realism goes between the computer graphics and what's filmed in real life.