
Thread: NVIDIA Performance: Windows 7 vs. Ubuntu Linux 12.10

  1. #31
    Join Date
    Oct 2007
    Posts
    23

    Default

That's what I figure; I just thought it was a bit unclear.

  2. #32
    Join Date
    Feb 2012
    Posts
    68

    Default Unredirect Fullscreen Windows in CompizConfig

Michael, next time you test Unity performance you should enable Unredirect Fullscreen Windows in CompizConfig Settings Manager and also disable Sync to VBlank. The first option should be a must-have for anyone who wants to run games under Unity.
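For reference, on Ubuntu 12.10 both options can also be toggled from the command line via gsettings, assuming the default "unity" Compiz profile. The schema paths below follow the GSettings backend Compiz 0.9 uses on Ubuntu, but they may differ between Compiz versions, so treat them as an assumption to verify on your system:

```shell
# Unredirect fullscreen windows (composite plugin)
gsettings set org.compiz.composite:/org/compiz/profiles/unity/plugins/composite/ \
    unredirect-fullscreen-windows true

# Disable Sync to VBlank (opengl plugin)
gsettings set org.compiz.opengl:/org/compiz/profiles/unity/plugins/opengl/ \
    sync-to-vblank false
```

The same keys are exposed in CCSM under the Composite and OpenGL plugins.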

  3. #33
    Join Date
    Sep 2007
    Posts
    997

    Default

Why wasn't GNOME included in the test? Even if it's ****, who else but Ubuntu uses Unity? Canonical doesn't support KDE (Kubuntu) anymore, so it shouldn't matter, right?

  4. #34

    Default

Benchmarks for scene complexity and jitter while maintaining 72 Hz would be more interesting. One thing not showing up in these benchmarks, of course, is Windows' poorer jitter performance (dropped frames).
I also tried the latest Ubuntu now. The standard kernel is still not configured for low jitter, so benchmarks on it are completely uninteresting for games; it cannot even display 30 FPS video without jitter. My low-latency-configured kernel can, though the browser video player should of course be driven by a screen refresh of 2-3x the video's frame rate, to avoid all jitter.
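For context, a "low-jitter" kernel along the lines described here usually means Ubuntu's -lowlatency flavour or a self-built kernel with full preemption and a 1000 Hz timer tick. A minimal sketch of the relevant .config options (option names from mainline kernels of that era; the poster's exact configuration is not given, so this is only an assumption of what "lowlatency configged" involves):

```
CONFIG_PREEMPT=y      # full kernel preemption: lower scheduling latency
CONFIG_HZ_1000=y      # 1000 Hz timer tick instead of the server default 250 Hz
CONFIG_HZ=1000
```

These trade a little throughput for more consistent frame delivery, which is the jitter argument being made above.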

    Read also my post on LKML.

    Peace Be With You.

  5. #35
    Join Date
    Oct 2006
    Location
    Israel
    Posts
    556

    Default

    Quote Originally Posted by jrch2k8 View Post
... Doom3 is not a relevant test for an OS vs. OS comparison; if you actually put 3 neurons to work together you will see the reasons, like:

* it is closed source, so you can't know for sure whether it uses the same render path in both OSes, or whether there is any other typical dirty hack to gain more FPS that is not present on the other OS
* neither Xonotic nor especially Doom3 will bring any modern GPU to its knees
...
if you wanna reply, do it technically; I won't bother answering stupid replies
    Either you've been living under a rock for the past couple of years (...) or you are running for next year's Darwin award.
1. Doom 3 *is* GPL'ed. It was open-sourced in **November** last year. [1]
2. Doom 3 *is* 8 (!) years old. Even ancient (relatively speaking) GPUs with open-source drivers can run it at close to 100 FPS [2] - let alone binary-driver users (such as, err, this Win7 vs. Linux benchmark). Heck, it even runs on a smartphone [3].

On a personal note, had you taken five minutes to do some *basic* fact-checking before pressing the "send" button, you could have saved yourself a major embarrassment.

    - Gilboa
    [1] http://en.wikipedia.org/wiki/Id_Tech_4
    [2] http://www.phoronix.com/scan.php?pag...nux_mesa&num=4
    [3] http://www.phoronix.com/scan.php?pag...tem&px=MTE1ODg
    DEV: Intel S2600C0, 2xE52658V2, 32GB, 4x2TB, GTX680, F20/x86_64, Dell U2711.
SRV: Intel S5520SC, 2xX5680, 36GB, 4x2TB, GTX550, F20/x86_64, Dell U2412.
    BACK: Tyan Tempest i5400XT, 2xE5335, 8GB, 3x1.5TB, 9800GTX, F20/x86-64.
    LAP: ASUS N56VJ, i7-3630QM, 16GB, 1TB, 635M, F20/x86_64.

  6. #36
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    4,994

    Default

    The PTS profile runs the binary version, no? There is no way to know whether that was built from the GPL'd code.

    The point about binary drivers having hacks for specific executables is also valid.

  7. #37
    Join Date
    Oct 2006
    Location
    Israel
    Posts
    556

    Default

    Quote Originally Posted by curaga View Post
    The PTS profile runs the binary version, no? There is no way to know whether that was built from the GPL'd code.

    The point about binary drivers having hacks for specific executables is also valid.
Please note that I only quoted a couple of grossly erroneous comments.
BTW, why do you assume that nVidia's *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterparts? (Given that they share the same code base.)
... And whether nVidia employs binary-name hacks or not, why should I, as an end user, care?

    - Gilboa

  8. #38

Default Doesn't matter

It doesn't matter whether it is closed or open source. What we want to see is how it performs under the best conditions. On a low-latency kernel, HL2 in Wine performs close to an optimized Windows XP install. Read also about optimizing WinXP here.

Also, one needs to know whether there are variables that improve performance in the games themselves, like Doom 3's 60 FPS limit.

I am also working on optimizing Ubuntu 12.04, and will make a post on my blog, and probably here, later.

Then we can actually compare the whole OS + code + compiler + whatever.
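As an example of such a variable: Doom 3's game logic runs at a fixed 60 Hz tic, and the com_fixedTic cvar is the commonly cited way to lift the resulting 60 FPS render cap for benchmarking, typically via autoexec.cfg. The exact value that uncaps it has varied between builds, so verify against your version before trusting the numbers:

```
seta com_fixedTic "1"   // allow rendering faster than the 60 Hz game tic
```

Note that id's timedemo mode is generally unthrottled regardless, which is what most published benchmarks use.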

    Peace Be With You.

  9. #39
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    4,994

    Default

You're right about those erroneous parts; I just wanted to point out that parts of it are valid.

BTW, why do you assume that nVidia's *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterparts? (Given that they share the same code base.)
Popularity. Even if the code for Doom 3 is there in the Linux blob, it's likely only hooked up to act on doom3.exe, not doom3.bin or whatever it's named on Linux.

One counter-example is Q3: both Linux blobs have (had?) confirmed hacks for it.

    And whether nVidia employs binary name hacks or not, why should I, as an end user, care?
    Because it doesn't represent the card's performance. It represents a hack.

Just like their "optimizing" for 3DMark. If they, for example, replace shaders on the fly with simpler ones that look nearly the same (note: nearly!), it's cheating, and it gives invalid data about the card's performance.

I.e., you expect it to show the GTX 460's performance for light effect X. What it actually shows is the performance for a grossly simplified but similar effect, unfairly skewing the results and also giving a different picture than you would have gotten with the original shaders.

If you only run that one program, you may like the result, provided the picture is not terribly worse. But chances are you will run N different apps, M of which the blob does not hack. Even if they use the exact same shader, you will see worse performance from the less popular apps.

  10. #40
    Join Date
    Oct 2006
    Location
    Israel
    Posts
    556

    Default

    Quote Originally Posted by curaga View Post
    Because it doesn't represent the card's performance. It represents a hack.
I *completely* disagree.
Newer nVidia releases detect Flash (under Linux) and presumably fix the blue-tint bug.
Is it a hack? Or is it a fix? How can you tell the difference?

Just like their "optimizing" for 3DMark. If they, for example, replace shaders on the fly with simpler ones that look nearly the same (note: nearly!), it's cheating, and gives invalid data about the card's performance.
If the visuals are the same, I couldn't care less.
If the visuals are not the same, I very much care.
As before (Flash), everything is in the details.

In the end, this argument is rather meaningless. Linux seems to perform just as well as Windows 7 - or better - with or without hacks.
There were no reports of different visuals, so per-application hacks weren't apparent. (Plus, I doubt that nVidia still has Doom 3-targeted hacks, given the age of the game.)

    - Gilboa
