
Thread: 24-Way AMD Radeon vs. NVIDIA GeForce Linux Graphics Card Comparison

  1. #11
    Join Date: Nov 2013
    Posts: 45


    Quote Originally Posted by Michael
    Read the article why it wasn't tested...
    That'll teach me to read the article rather than scanning the results.

  2. #12
    Join Date: Jan 2013
    Posts: 991


    Michael, excuse me if this was already answered, but why did you run the tests with different settings?

    What I mean is: virtually none of the results overlap with the earlier 25-way open-source comparison, even though both were done on nearly identical hardware.
    The closest match is Xonotic, but the open-source drivers were tested with "high" as the topmost setting, while the closed-source results start from "ultra".

    The only thing I noticed is that the open-source driver has a much more concentrated FPS distribution, whereas Catalyst jumps around everywhere between 40 and 500 FPS yet ends up with a below-middle average (for example 140 FPS). This comes from comparing the Xonotic high and ultra sections of the two articles.

    Thanks!
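
    To make "much more concentrated FPS distribution" concrete, one could compute the spread of the per-sample FPS values from a run. A minimal hypothetical sketch follows (in C; the numbers are made up, and neither article publishes per-sample data in this form):

    ```c
    /* fps_spread.c - hypothetical sketch: quantify how concentrated an FPS trace is.
     * Build: cc -O2 fps_spread.c -o fps_spread -lm
     */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Made-up per-sample FPS values standing in for a benchmark trace. */
        double fps[] = { 40, 70, 500, 130, 45, 480, 60, 150, 55, 420 };
        size_t n = sizeof fps / sizeof fps[0];

        double mean = 0.0;
        for (size_t i = 0; i < n; i++)
            mean += fps[i];
        mean /= n;

        double var = 0.0;
        for (size_t i = 0; i < n; i++)
            var += (fps[i] - mean) * (fps[i] - mean);
        var /= n;

        /* A low coefficient of variation means a concentrated distribution;
         * a high one means the frame rate jumps all over the average. */
        printf("mean %.1f fps, stddev %.1f, coefficient of variation %.2f\n",
               mean, sqrt(var), sqrt(var) / mean);
        return 0;
    }
    ```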

  3. #13
    Join Date: Mar 2007
    Location: West Australia
    Posts: 368


    I feel like things like the minimum frame rate, i.e. a frame-rate graph analysis, are also important. Average FPS doesn't show things like microstutter or other problems encountered with GPUs.

    For instance on the PC I get a solid 60 FPS in RAGE, but the game stutters like a bitch because it caches from the hard drive. Even the 64-bit version of the game does this (I have 8 GB of memory), yet people with SSDs don't have the problem. I remember the Fallout games having a similar issue with anti-aliasing enabled as well. I therefore think it's important to include a frame-rate graph showing the minimum, maximum and median frame rates.
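
    A minimal hypothetical sketch of the kind of analysis being asked for: given per-frame times in milliseconds (the numbers below are made up), derive the average, median and minimum FPS, which is enough to expose stutter that an average alone hides.

    ```c
    /* frame_stats.c - hypothetical sketch: FPS statistics from frame times.
     * Build: cc -O2 frame_stats.c -o frame_stats
     */
    #include <stdio.h>
    #include <stdlib.h>

    static int cmp_double(const void *a, const void *b)
    {
        double x = *(const double *)a, y = *(const double *)b;
        return (x > y) - (x < y);
    }

    int main(void)
    {
        /* Made-up frame times in milliseconds; two stutter spikes among 60 Hz frames. */
        double ms[] = { 16.7, 16.6, 16.8, 120.0, 16.7, 16.5, 90.0, 16.6, 16.7, 16.8 };
        size_t n = sizeof ms / sizeof ms[0];

        double total = 0.0, worst = ms[0];
        for (size_t i = 0; i < n; i++) {
            total += ms[i];
            if (ms[i] > worst)
                worst = ms[i];
        }

        qsort(ms, n, sizeof ms[0], cmp_double);
        double median_ms = (n % 2) ? ms[n / 2] : (ms[n / 2 - 1] + ms[n / 2]) / 2.0;

        printf("average FPS: %.1f\n", 1000.0 * n / total); /* hides the spikes   */
        printf("median  FPS: %.1f\n", 1000.0 / median_ms); /* typical frame      */
        printf("minimum FPS: %.1f\n", 1000.0 / worst);     /* worst single frame */
        return 0;
    }
    ```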

  4. #14
    Join Date: Jan 2012
    Posts: 151


    Quote Originally Posted by b15hop
    I feel like things like the minimum frame rate, i.e. a frame-rate graph analysis, are also important. Average FPS doesn't show things like microstutter or other problems encountered with GPUs.

    For instance on the PC I get a solid 60 FPS in RAGE, but the game stutters like a bitch because it caches from the hard drive. Even the 64-bit version of the game does this (I have 8 GB of memory), yet people with SSDs don't have the problem. I remember the Fallout games having a similar issue with anti-aliasing enabled as well. I therefore think it's important to include a frame-rate graph showing the minimum, maximum and median frame rates.
    box plots ftw!!!

  5. #15
    Join Date: Feb 2008
    Location: Linuxland
    Posts: 5,130


    Someone needs to write the PTS code, hint hint nudge nudge: http://phoronix.com/forums/showthrea...in-all-GL-apps

    Otherwise the only frame times available are from engines that report them themselves, i.e. OA/Quake derivatives.
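
    One common approach for getting frame times out of engines that never report them is an LD_PRELOAD shim around glXSwapBuffers. The sketch below is hypothetical (it is not the linked PTS work, and the file name is made up); it logs one frame time per line to stderr for any GLX application.

    ```c
    /* swaplog.c - hypothetical LD_PRELOAD shim that logs per-frame times.
     * Build: cc -O2 -fPIC -shared swaplog.c -o swaplog.so -ldl
     * Run:   LD_PRELOAD=./swaplog.so ./some_gl_game 2> frametimes.log
     */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdio.h>
    #include <time.h>
    #include <GL/glx.h>

    void glXSwapBuffers(Display *dpy, GLXDrawable drawable)
    {
        static void (*real_swap)(Display *, GLXDrawable);
        static struct timespec last;
        static int have_last;

        if (!real_swap)
            real_swap = (void (*)(Display *, GLXDrawable))
                            dlsym(RTLD_NEXT, "glXSwapBuffers");

        struct timespec now;
        clock_gettime(CLOCK_MONOTONIC, &now);
        if (have_last) {
            double ms = (now.tv_sec - last.tv_sec) * 1000.0 +
                        (now.tv_nsec - last.tv_nsec) / 1.0e6;
            fprintf(stderr, "%.3f\n", ms); /* one frame time per line */
        }
        last = now;
        have_last = 1;

        real_swap(dpy, drawable);
    }
    ```

    One caveat: applications that load libGL with dlopen() or fetch the entry point through glXGetProcAddress() can bypass a preloaded symbol, so this only covers the common directly-linked case.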

  6. #16


    Quote Originally Posted by curaga
    Someone needs to write the PTS code, hint hint nudge nudge: http://phoronix.com/forums/showthrea...in-all-GL-apps

    Otherwise the only frame times available are from engines that report them themselves, i.e. OA/Quake derivatives.
    Should be easy for anyone to do considering PTS already supports generating line graphs out of frame times (for engines that expose it), etc.

    In terms of box plots, PTS also supports generating them. IIRC, the reason I don't use them by default in the UI is that I couldn't make them look really nice / graphically pleasing.
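
    For reference, a box plot is drawn from a five-number summary, which is straightforward to derive from a sorted trace. A hypothetical sketch (unrelated to the actual PTS implementation; the numbers are made up):

    ```c
    /* boxplot_stats.c - hypothetical sketch: five-number summary for a box plot
     * (minimum, first quartile, median, third quartile, maximum).
     * Build: cc -O2 boxplot_stats.c -o boxplot_stats
     */
    #include <stdio.h>
    #include <stdlib.h>

    static int cmp_double(const void *a, const void *b)
    {
        double x = *(const double *)a, y = *(const double *)b;
        return (x > y) - (x < y);
    }

    /* Quantile by linear interpolation over the sorted samples. */
    static double quantile(const double *sorted, size_t n, double q)
    {
        double pos = q * (n - 1);
        size_t lo = (size_t)pos;
        double frac = pos - lo;
        return lo + 1 < n ? sorted[lo] * (1.0 - frac) + sorted[lo + 1] * frac
                          : sorted[lo];
    }

    int main(void)
    {
        /* Made-up per-sample FPS values. */
        double fps[] = { 40, 70, 500, 130, 45, 480, 60, 150, 55, 420 };
        size_t n = sizeof fps / sizeof fps[0];

        qsort(fps, n, sizeof fps[0], cmp_double);
        printf("min %.0f  Q1 %.0f  median %.0f  Q3 %.0f  max %.0f\n",
               fps[0], quantile(fps, n, 0.25), quantile(fps, n, 0.50),
               quantile(fps, n, 0.75), fps[n - 1]);
        return 0;
    }
    ```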
