Open-Source ATI R600/700 Mesa 3D Performance


  • V!NCENT
    replied
    Originally posted by elanthis View Post
    Point is, no game you're actually playing should be running at higher than 60 FPS (or whatever your monitor refresh rate is) unless the game developer is an idiot
    No gamer will ever limit his/her framerate unless he/she is a total idiot. There is a reason people with kick-ass gaming PCs set the resolution to 800x600, turn off the props, and turn off vsync... Not only to get more FPS, but also to see the differences faster (keeping track of 800x600 pixels versus changes in full HD).

    Do you know how many people complained about Doom 3, saying it reports over 60 FPS but doesn't display over 60 FPS? Well, that is because it does render at higher than 60 FPS, but the physics engine is capped at 60 FPS. Conclusion: people do see the difference!

    And as to why vsync is for idiots who do not know what they are talking about: you get more up-to-date screen information when you write to the framebuffer during the rendering phase...

    "But all full HD commercial quality titles are shipping at 30-60fps!" Well that is because it's all about shittubes filled with awesomeness these days. Scirpting this, cinematic that. Most often the Extreme Difficulty is about normal difficulty found in older games. Commercial titles these days are for 99,99 procent of the offering; Total garbage.

    Kthnxbye.
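    The Doom 3 behaviour described above, rendering as fast as the hardware allows while the physics engine ticks at a fixed 60 Hz, is what a fixed-timestep game loop gives you. A minimal sketch; all the callbacks here (`poll_input`, `step_physics`, `render`, `running`) are hypothetical placeholders, not any real engine's API:

```python
import time

PHYSICS_DT = 1.0 / 60.0  # fixed physics tick: 60 Hz regardless of render rate

def run(poll_input, step_physics, render, running, clock=time.perf_counter):
    """Fixed-timestep loop: physics advances in whole 1/60 s steps while
    rendering runs once per iteration, as fast as the hardware allows."""
    accumulator = 0.0
    previous = clock()
    while running():
        now = clock()
        accumulator += now - previous
        previous = now
        poll_input()
        # Consume elapsed real time in whole physics steps; a fast GPU
        # renders many frames between two ticks, a slow one renders fewer.
        while accumulator >= PHYSICS_DT:
            step_physics(PHYSICS_DT)
            accumulator -= PHYSICS_DT
        render()  # uncapped: not tied to the physics rate
```

    The injectable `clock` parameter defaults to real time; it only exists so the loop can be driven by a fake time source when testing.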



  • sabriah
    replied
    Best news in years!!!



  • elanthis
    replied
    Originally posted by perpetualrabbit View Post
    Also, is there a way to fix the framerate in games to a certain amount (say 30Hz) and do benchmarking based on (lesser is better) CPU and GPU load instead?
    Sync to vblank and set your refresh rate to 30.

    I'm really puzzled by this...
    Almost every game wants to enable sync-to-vblank to avoid tearing. Tearing looks awful. Some consoles even require games to use sync-to-vblank to pass quality control for a release.

    However, if your game is only just barely able to do 60 FPS (I'm going to assume that as the monitor's refresh rate for the rest of this post), then the moment that any frame requires a little too much calculation, it misses the vblank. It then takes twice as long before that frame is displayed, as it has to wait until the next vblank. That's the part most people miss. 60 FPS means 16.666 ms per frame. If a scene is complex, it won't just be one frame that takes a little longer, but every frame. If frames are taking even 16.668 ms to render, you are missing the vblank once on every single frame, and so you're rendering at half of the refresh rate (e.g., 30 FPS). Dropping from 60 FPS to 30 is freakin' huge.

    30 FPS is acceptable for many games. Quite a few games are actually designed for 30 and not 60 FPS. However, very action-heavy games aim for 60 because a large number of gamers can "feel" the difference in frame rates up to around 40 (some are even more sensitive, some less, but 40 is around average). It's not just about vision, but also about the jerkiness of motion and controls. There's a lot of discussion on that already, and I'm not going to get involved, other than to say that 30 FPS simply isn't sufficient for many people in a game with a lot of motion, and your options are 30 or 60, nothing in between, when sync-to-vblank is enabled.

    Even if you aim for 30, if your game is only just barely doing 30 and then the scene gets complex or the PC slows down, you're now missing that 30 mark and you end up going from 1/2 refresh to 1/3 refresh, or 20 FPS... which is very noticeably low, and starts to feel very jittery even to casual gamers. If the game can't handle 20, you end up at 15 FPS, which is at the edge of the unplayable range.
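    The 60 → 30 → 20 → 15 ladder described above follows directly from the fact that a vsynced frame can only be shown on a vblank boundary, so the interval between displayed frames is always a whole multiple of the refresh period. A quick sanity check (the frame times are made-up numbers, and a 60 Hz display is assumed):

```python
import math

def vsynced_fps(render_ms, refresh_hz=60.0):
    """Effective displayed FPS with sync-to-vblank: a frame that misses
    a vblank waits for the next one, so each frame occupies a whole
    number of refresh intervals."""
    period_ms = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / period_ms)  # vblanks consumed per frame
    return refresh_hz / intervals

# A frame that takes even fractionally longer than ~16.667 ms drops
# straight from 60 FPS to 30; further misses land on 20, then 15.
for ms in (10.0, 16.7, 34.0, 51.0):
    print(f"{ms:5.1f} ms/frame -> {vsynced_fps(ms):.0f} FPS")
```

    Note there is no way to land between the rungs: the function can only ever return refresh_hz divided by a whole number.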

    So your game wants to hit 60 FPS most of the time, maybe 30 for low-action games. However, you want your game to run on old hardware. You don't design your game to run on the $600 graphics cards, because most of your customer base is using 2-year-old $150 graphics cards (if that). So you need your game to hit a consistent 60 FPS even in very complex scenes on very old hardware. So you want your game to run at very, very, very high frame rates on modern hardware, even though when the game is actually running it will be locked at 60 FPS.

    For game engine developers, they want those huge framerates for the additional reason of ensuring that their engine can handle unexpected load. If you've got a level and you only optimize up to the point of handling 60 FPS then you give yourself no room for when a new and more complex level is added, or when an unexpected number of enemies or bullets or other effects ends up on screen. You leave no room for when physics or AI suddenly starts taking 40% more CPU than what your test levels required. Your engine is limited to simpler scenes and game logic because it can't sustain 60 FPS otherwise.

    Benchmarks are run with sync-to-vblank off because otherwise all the benchmarks would say is "this card can run this specific game at these settings without problems," which isn't very useful when you're trying to get an idea of which card is a better investment or better technology. Sure, maybe both card A and card B run BioShock 2 at a sufficient 60 FPS at max settings, but if card A is capable of running it at 120 and card B can do it at 180, and both are $150 cards, you know that card B is the better buy because it's more likely to be able to handle next year's games at 60 FPS, while card A is probably going to need to be replaced to handle the newer games.

    It's also a bit of a marketing stunt. All of the high-end video cards serve the same purpose as the high-end CPUs. Nobody with a brain actually buys the $1,000 Intel CPUs. They're released as marketing gimmicks. Some idiot goes, "omg Intel makes the fastest CPU!" and then goes to the store to buy a new computer. He can't afford one with a $1,000 CPU, but he can afford one with a $200 CPU, and he just automatically buys the Intel one without even checking whether the AMD part at the $200 price point is better or not. With video cards, someone sees the behemoth $600 NVIDIA card that trounces the top-end AMD part and then assumes that the $100 NVIDIA card must be better than the $100 ATI card, because he's too stupid to compare the parts he's actually considering buying.

    Point is, no game you're actually playing should be running at higher than 60 FPS (or whatever your monitor refresh rate is) unless the game developer is an idiot, the driver's sync-to-vblank is non-functional (Linux, I mean you), or the gamer turns off sync-to-vblank.

    As to why some gamers turn off sync-to-vblank, the answer is the same as to why they buy cases with blue LEDs and cut-out sides and aesthetically-designed video cards, which is the same reason businessmen buy Ferraris: when you're pathetic, you find whatever you can to make yourself look better than other people by comparison. The business guy makes up for his 4" wang with a sports car, and the gamer makes up for it by having 100+ FPS in Crytek3-based games. Penis envy is a costly condition.



  • WhiteRabbit
    replied
    Hey, since I still want to prove a little something: I just tested H.264 playback on my Radeon HD 2600 Pro (which I have overclocked)

    http://global.phoronix-test-suite.co...757-24266-7179



  • deanjo
    replied
    Originally posted by yotambien View Post
    Nothing that a bit of MOTION BLUR can't solve!

    OK, it's a bit excessive, but the results are quite impressive given that the original footage was stop-motion animation.
    I'm not sure if motion blur can cure the cheesy and clearly hand-drawn backdrops found in the early Star Trek movies. I just watched Star Trek III/IV, and Vulcan, with the Bird of Prey on it, is easily seen to be a painting; in standard definition this was much harder to see.



  • yotambien
    replied
    Originally posted by deanjo View Post
    Heh, watch any of the early sci-fi films in HD such as the original Star Trek Movies and Star Wars.
    Nothing that a bit of MOTION BLUR can't solve!

    OK, it's a bit excessive, but the results are quite impressive given that the original footage was stop-motion animation.



  • V!NCENT
    replied
    Originally posted by yotambien View Post
    Bah, now you spoiled it. At least it would be different to the KDE vs Gnome ones we usually have.
    Heh, whatever... This thread has already taken on a life of its own, so let's just do it.

    At least you just gave me the perfect excuse to post this video about the LHC and the DESTRUCTION of EARTH :O
    Rofl xD Where did that come from, Fox News?

    Originally posted by deanjo View Post
    Heh, watch any of the early sci-fi films in HD such as the original Star Trek Movies and Star Wars.
    No need for that... The horror starts with the DVD remaster of Star Wars. Seriously, I figured out how they did the scenes in space without ever watching any making-of :') You can just see the transparent overlays. Just look at the Death Star, for crying out loud.



  • deanjo
    replied
    Originally posted by V!NCENT View Post
    Watch the Matrix on a DVD and then again on Blu-ray. Guess which one is totally unrealistic?

    With a lot of films that have CGI special effects, you can just see how deep the realism of the computer graphics goes compared with what's filmed in real life.
    Heh, watch any of the early sci-fi films in HD such as the original Star Trek Movies and Star Wars.



  • V!NCENT
    replied
    Originally posted by yotambien View Post
    Then I watched the third part on DVD on a good TV only to realize how bad and unrealistic it looks.
    Watch the Matrix on a DVD and then again on Blu-ray. Guess which one is totally unrealistic?

    With a lot of films that have CGI special effects, you can just see how deep the realism of the computer graphics goes compared with what's filmed in real life.



  • yotambien
    replied
    Originally posted by V!NCENT View Post
    But let's get back to the subject before this ends in a giant flamewar...
    Bah, now you spoiled it. At least it would have been different from the KDE vs. GNOME ones we usually have.

    At least you just gave me the perfect excuse to post this video about the LHC and the DESTRUCTION of EARTH :O

