
UT2K4: ATi Vs nVidia IQ.


  • #16
    Originally posted by mirv
    It would be interesting to see if opengl performance is down across the board with nvidia's latest drivers, or if it's just a ut2004 thing. I wonder if opengl 3.0 support would have anything to do with it as well (who knows what internal changes have been made with that).
    If anything, OpenGL 3.0 support would only mean that the necessary routines, extensions, etc. have been added to the nVidia ICD (libGL[core].so), and maybe they even ship it in a separate library; I have not checked. I'm not sure whether GL 3.0 support would break GL 1.5/2.0 support (which, AFAIK, is what UT2K4 actually requires); I thought GL 3.0 was backwards compatible with at least 1.5/2.0. (And if anyone knows better than me: wasn't GL 2.0 supposed to mainly add support for SM3.0 [among other things, of course!], while GL 1.5 had up to SM2.1, so that 2.0 was only a small improvement over 1.5?)
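    Since the question is whether a GL 3.0 driver still satisfies a game that only needs GL 1.5/2.0, here is a minimal C sketch of the usual check: parse the "major.minor" prefix of the GL_VERSION string and compare it against the minimum the game requires. The version string is a made-up sample; a real app would obtain it with glGetString(GL_VERSION).

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Parse the leading "major.minor" out of a GL_VERSION-style string,
     * e.g. "2.1.2 NVIDIA 177.82" -> major 2, minor 1. */
    static void parse_gl_version(const char *s, int *major, int *minor)
    {
        const char *dot;
        *major = atoi(s);
        dot = strchr(s, '.');
        *minor = dot ? atoi(dot + 1) : 0;
    }

    int main(void)
    {
        int major, minor;
        /* Hypothetical sample string for illustration. */
        parse_gl_version("2.1.2 NVIDIA 177.82", &major, &minor);

        /* UT2K4 only needs GL 1.5-era features, so any driver that still
         * reports at least 1.5 should satisfy it. */
        printf("GL %d.%d %s\n", major, minor,
               (major > 1 || (major == 1 && minor >= 5))
                   ? "meets UT2K4's needs" : "too old");
        return 0;
    }
    ```

    A backwards-compatible 3.0 driver would simply report a higher version here and pass the same check.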

    At any rate, it could be that this is present only with the new Beta drivers; still, it would indeed be interesting to see whether this "degradation" in performance is local to UT2K4 or affects other games/apps too, and whether previous (17x.x) drivers also exhibit the issue. I'm afraid this would have to be tested with a series 8 card, as my 9800GT was only recently officially supported, by the 177.82 driver (AFAIK). I will, however, test at least these two driver sets against UT2K4 using the in-game benchmark system, and the PTS for other games (Doom3, Quake4, Nexuiz, Unigine, Lighting... suggestions?)


    • #17
      Well, I may have found the root of the poor performance in UT2004: Lack of AGP support... That's right. In the log I get a Warning:

      Log: WARNING: Couldn't allocate AGP memory - turning off support for GL_NV_vertex_array_range
      Log: WARNING: This has a serious impact on performance.
      So it would seem that the UT binary expects to find an AGP kernel module (or some such); otherwise it does not use GL_NV_vertex_array_range and instead uses a really slow fallback path.
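      For anyone curious how that fallback decision works, here is a small C sketch of the usual extension check: scan the space-separated GL extensions string for GL_NV_vertex_array_range as a whole token (plain substring matching would false-match e.g. GL_NV_vertex_array_range2). The extensions string below is a made-up sample; a real renderer would obtain it with glGetString(GL_EXTENSIONS), and UT2K4's actual logic is of course inside its closed binary.

      ```c
      #include <stdio.h>
      #include <string.h>

      /* Return 1 if `name` appears as a whole space-separated token
       * in `extensions`, 0 otherwise. */
      static int has_extension(const char *extensions, const char *name)
      {
          size_t len = strlen(name);
          const char *p = extensions;
          while ((p = strstr(p, name)) != NULL) {
              int start_ok = (p == extensions) || (p[-1] == ' ');
              int end_ok   = (p[len] == ' ') || (p[len] == '\0');
              if (start_ok && end_ok)
                  return 1;
              p += len;  /* partial match, keep scanning */
          }
          return 0;
      }

      int main(void)
      {
          /* Hypothetical extensions string for illustration. */
          const char *ext = "GL_ARB_multitexture GL_NV_vertex_array_range2 "
                            "GL_NV_vertex_array_range GL_EXT_texture3D";

          if (has_extension(ext, "GL_NV_vertex_array_range"))
              printf("fast path: AGP vertex array range available\n");
          else
              printf("slow fallback: plain vertex arrays\n");
          return 0;
      }
      ```

      The warning in the log suggests the AGP memory allocation itself failed, so the engine disables the extension even though the driver advertises it, which lands it on the slow path either way.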

      I already have some numbers from the benchmarks I've done with the array of games I mostly play, comparing the 177.82 and 180.08 drivers; when I have them all I'll post (it would seem there is a marginal difference in performance in Q4 and D3).


      • #18
        Well, that would definitely account for the performance hit. Looking forward to the numbers - can't offer any more ideas though (any attempt at intelligent conversation when I'm really tired is generally a bad idea too).