NVIDIA's Release Happiness Continues Into April


  • deanjo
    replied
    Originally posted by BlackStar View Post
    Which indicates a large difference, no?
    Not at all. Look closely at Melcar's results: two different benchmarks were run. My results are from the Tropics bench, for which he got

    Ubuntu 8.10
    fps: 25.7
    score: 648

    Windows XP
    fps: 26.6
    score: 670
    Mine are


    openSUSE 11.1
    fps: 25.7
    score: 648

    Windows XP 64
    fps: 26.6
    score: 669
    Originally posted by BlackStar View Post
    You say that Doom3 is GPU limited on high resolutions, while Unigine is CPU limited? I really doubt that - Doom3 is primarily CPU bound (shadow volume extrusion on the CPU). Its shaders are simple, the polygon count is low, it really doesn't stress a modern GPU at all. Unigine tropics is *much* harder on the GPU.
    If Doom3 were CPU limited it would show zero gain in an SLI configuration. That is not the case: gains are still found when running an SLI rig at higher resolutions (1920x1200+) with full eye candy maxed out.
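    A quick way to make this resolution-scaling argument concrete: if the frame rate barely moves when the pixel count changes, the GPU is not the limiting factor. A minimal sketch in C, with an arbitrary 5% drop threshold (my assumption, not a figure from this thread):

    ```c
    #include <stdio.h>

    /* Classify the bottleneck from two runs of the same scene at different
     * resolutions. A near-constant frame rate across resolutions points at
     * the CPU/driver; a large drop points at the GPU. */
    static const char *bottleneck(double fps_low_res, double fps_high_res)
    {
        double drop = (fps_low_res - fps_high_res) / fps_low_res;
        /* 5% is an arbitrary noise threshold, not a standard figure. */
        return (drop < 0.05) ? "CPU/driver limited" : "GPU limited";
    }

    int main(void)
    {
        /* Placeholder numbers in the spirit of the thread, not measurements. */
        printf("%s\n", bottleneck(26.0, 25.7));
        return 0;
    }
    ```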
    Last edited by deanjo; 08 April 2009, 02:12 PM.

    Leave a comment:


  • BlackStar
    replied
    Originally posted by Melcar View Post
    That seems to be his Tropics score.
    Which indicates a large difference, no?

    Originally posted by deanjo View Post
    Not everything is GPU limited. Games such as ET:QW, Quake4 and even old Doom3 are not CPU limited when run at higher resolutions.
    Parse error. Care to rephrase? (These sentences seem to make sense on their own, but not when put together like this.)

    You say that Doom3 is GPU limited on high resolutions, while Unigine is CPU limited? I really doubt that - Doom3 is primarily CPU bound (shadow volume extrusion on the CPU). Its shaders are simple, the polygon count is low, it really doesn't stress a modern GPU at all. Unigine tropics is *much* harder on the GPU.

    Leave a comment:


  • deanjo
    replied
    Originally posted by Melcar View Post
    That seems to be his Tropics score. Either way, CPU limited or not, the GPU will still be working and under stress. The point I was making was that the driver does perform well in 3D. Besides, if I go by Deanjo's logic, everything 3D in Linux is CPU limited and nothing can indicate true GPU performance, which leads me to question why the hell everyone is bitching about poor 3D performance with fglrx (since, you know, nothing can show true 3D performance).
    Not everything is GPU limited. Games such as ET:QW, Quake4 and even old Doom3 are not CPU limited when run at higher resolutions.

    Leave a comment:


  • Melcar
    replied
    Originally posted by BlackStar View Post
    ...

    Edit: #deanjo:
    score: 669 vs score: 1191

    I wouldn't call that similar.

    That seems to be his Tropics score. Either way, CPU limited or not, the GPU will still be working and under stress. The point I was making was that the driver does perform well in 3D. Besides, if I go by Deanjo's logic, everything 3D in Linux is CPU limited and nothing can indicate true GPU performance, which leads me to question why the hell everyone is bitching about poor 3D performance with fglrx (since, you know, nothing can show true 3D performance).

    Leave a comment:


  • BlackStar
    replied
    [...] My switch to nvidia was triggered by Ubuntu 9.04's release. Nvidia's blob is already compatible with X-Server 1.6, but FGLRX? No f**king way they could accomplish that. OSS? The holy brand-new 3D stack is still going to take another year to mature.
    Ubuntu 9.04 comes with an fglrx release that supports XServer 1.6 (R600+ only, IIRC) and open source drivers that support XServer 1.6 on R100-R700.

    Just pointing this out.

    ATI's OpenGL stack is so broken that it even ignores whether a depth buffer is requested in the glutInitDisplayMode() call.
    I've been using ati hardware since 2003 (R300) and I have never ever seen this, even with their old and ugly OpenGL stack. There are problems when you start pushing the drivers (e.g. try an FBO blit with 24/32-bit depth buffers), but ati is not alone in that regard (last time I checked, nvidia cards could not attach a 24-bit depth buffer to an FBO if your context used a 16-bit depth buffer, and vice versa).
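    For what it's worth, the glutInitDisplayMode() complaint above is easy to test directly. A minimal sketch, assuming freeglut and a legacy (pre-3.x) context where the GL_DEPTH_BITS query is still valid; a result of 0 would mean the GLUT_DEPTH request was ignored:

    ```c
    #include <GL/glut.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        GLint depth_bits = 0;

        glutInit(&argc, argv);
        /* Explicitly request a depth buffer. */
        glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
        glutCreateWindow("depth-check");  /* context must exist before GL calls */

        glGetIntegerv(GL_DEPTH_BITS, &depth_bits);  /* legacy query */
        printf("depth bits: %d\n", depth_bits);     /* 0 => request ignored */
        return 0;
    }
    ```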

    And again, fglrx is inferior in 2d and video, but performs quite well in 3d. The open-source drivers are better in 2d and video, but lose in 3d. Nvidia drivers tend to be more balanced, but at least with ati you have a choice.

    The grass always looks greener on the other side, but you'll be disappointed if you assume that nvidia drivers are flawless. Check their forums for long-standing issues that have not been fixed.

    Edit: #deanjo:
    score: 669 vs score: 1191

    I wouldn't call that similar.

    Leave a comment:


  • deanjo
    replied
    Originally posted by Melcar View Post
    Unigine Tropics
    OpenGL
    1680x1050
    4x AA
    16x AF
    Shaders High
    Textures High
    Filter Trilinear
    Reflection
    Refraction
    Occlusion
    Volumetric

    Ubuntu 8.10
    fps: 25.7
    score: 648

    Windows XP
    fps: 26.6
    score: 670

    Unigine Sanctuary
    OpenGL
    1680x1050
    4x AA
    16x AF
    Shaders High
    Textures High
    Filter Trilinear
    Translucence
    Parallax Mapping
    Occlusion
    Reflection
    Refraction
    Scattering
    Volumetric
    HDR
    DOF

    Ubuntu 8.10
    fps: 28.1
    score: 1190

    Windows XP
    fps: 28.1
    score: 1191

    Now don't you find it odd that similar systems (PhenII, [email protected], 8800GT first single and then SLI), differing only in the GPU, get identical results?

    OpenGL
    1680x1050
    4x AA
    16x AF
    Shaders High
    Textures High
    Filter Trilinear
    Reflection
    Refraction
    Occlusion
    Volumetric

    openSUSE 11.1
    fps: 25.7
    score: 648

    Windows XP 64
    fps: 26.6
    score: 669
    Again, you are CPU limited. BTW, the same test using the DX renderer in Windows yielded an average of 27.1, but with variances so small they are easily attributed to normal run-to-run deviation from background processes, CPU clock drift, etc.
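    To put that "run-to-run deviation" point in concrete terms: a minimal sketch with made-up FPS samples (the posts above report single runs only) that computes the mean and sample standard deviation. If the OpenGL-vs-DX gap falls within a couple of standard deviations, the two renderers are indistinguishable at this sample size.

    ```c
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Hypothetical repeated-run scores, not measured data. */
        double fps[] = { 26.6, 27.1, 26.8, 26.4, 27.0 };
        int n = sizeof fps / sizeof fps[0];
        double sum = 0.0, sq = 0.0, mean, sd;

        for (int i = 0; i < n; i++) sum += fps[i];
        mean = sum / n;
        for (int i = 0; i < n; i++) sq += (fps[i] - mean) * (fps[i] - mean);
        sd = sqrt(sq / (n - 1));  /* sample standard deviation */

        printf("mean %.2f fps, stddev %.2f fps\n", mean, sd);
        return 0;
    }
    ```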

    Leave a comment:



  • FunkyRider
    replied
    The problem is not that ATI's drivers are 'a little bit inferior'; it's that they are well below expectations.

    I've waited for FGLRX to solve some very basic problems for about 3 years. My switch to nvidia was triggered by Ubuntu 9.04's release. Nvidia's blob is already compatible with X-Server 1.6, but FGLRX? No f**king way they could accomplish that. OSS? The holy brand-new 3D stack is still going to take another year to mature.

    IMO, until this promised 3D stack is proven useful, I'd better stick with nvidia. I'm a programmer with 15 years of experience, 10+ of them in 3D programming, currently employed as a software engineer at a respectable 3D CAD company (and well paid). I want to contribute to the OSS world with an easy-to-use Linux OpenGL shader development tool as my hobby project (ShaderStudioMAX was my one-man project circa 2003, and yes, it was Windows-exclusive because I didn't start using Linux until 2005). Guess which card I have to stick with just to have a working OpenGL platform? ATI's OpenGL stack is so broken that it even ignores whether a depth buffer is requested in the glutInitDisplayMode() call. Quite pathetic, and I know for sure that even if I file a bug report they will ignore it.

    What do you call a very sick man who refuses to accept any medical treatment? A DEAD MAN.

    Fortunately for ATI, they chose a death-and-rebirth approach to fix it.
    Last edited by FunkyRider; 08 April 2009, 06:14 AM.

    Leave a comment:


  • Kano
    replied
    The problem with ATI has always been that when you find a bug that seems specific to one app, and that app is not well known, you are basically ignored for a long time (if you call it normal that a bug found with gl2benchmark took over a year to fix, then you must be a huge fan). If the app is very widely used, the chances of a fix are higher, but in some cases ATI fails regardless. fglrx is even worse than the limited Xv support of the free driver: Xv with fglrx cannot provide flicker-free video and shows green bars at some resolutions that are not divisible by 16. Nvidia has absolutely no problem showing the same video with Xv or VDPAU, and neither does the free driver. Since this problem seems generic to all fglrx drivers, it's 100% sure nobody really tested video playback.
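    One plausible mechanism for those green bars, offered purely as an assumption rather than a confirmed root cause: Xv surfaces are commonly padded to 16-pixel-aligned pitches, and in YV12 an uninitialized chroma byte of 0 renders as green, so a driver that scans out padding without initializing it shows green at the edge. A minimal sketch of the defensive allocation (alloc_yv12 is a hypothetical helper, not fglrx code):

    ```c
    #include <stdlib.h>
    #include <string.h>

    /* Allocate a YV12 frame with the pitch rounded up to a multiple of 16,
     * initializing every byte so padding never shows up as green. */
    static unsigned char *alloc_yv12(int width, int height)
    {
        int pitch = (width + 15) & ~15;                      /* 16-px alignment */
        size_t y_size = (size_t)pitch * height;
        size_t c_size = (size_t)(pitch / 2) * (height / 2);  /* per chroma plane */
        unsigned char *buf = malloc(y_size + 2 * c_size);
        if (!buf) return NULL;
        memset(buf, 16, y_size);                /* Y  = video black level */
        memset(buf + y_size, 128, 2 * c_size);  /* U,V = neutral chroma   */
        return buf;
    }

    int main(void)
    {
        unsigned char *frame = alloc_yv12(1366, 768);  /* 1366 % 16 != 0 */
        free(frame);
        return 0;
    }
    ```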

    Leave a comment:


  • susikala
    replied
    So the argument boils down to this: the nvidia-using people present themselves as 'practical' ("yes, it's a blob, but who cares, it works better than anything ATI has to offer"), while the ati-using people stress the philosophical/ideological side and the advantages of using FOSS software.

    (I'm ignoring fglrx here since it's a pile of horseshit imho.)

    In essence I think both sides have good arguments. Though as an ATI user, my only problem with the nvidia argument is that performance has never been the _only_ key issue on Linux. I mean, the approach of "it should just work and work well" is very Windows-centric. So to everyone who preaches it here and elsewhere: why do you use Linux to begin with?

    I mean, Linux doesn't have many areas where its applications are superior in functionality to those on Windows; there are just as many and varied programs for Windows, so you can't be complaining about that either; and games are still largely a Windows domain, so that's not a reason by itself to use Linux either.

    So why do you even use Linux with this approach of "it should just work and I shouldn't be bothered with it"? Isn't that just using the wrong OS?

    Maybe people with that mentality should just stick to Windows instead of switching and bringing the wrong views about how software should be along with them. And that sentence cannot be accused of elitism: I was a Windows user myself for longer than I've used Linux; I only switched about five years ago.

    Leave a comment:
