NVIDIA Performance: Windows 7 vs. Ubuntu Linux 12.10


  • #31
    That's what I figure; I just thought it was a bit unclear.



    • #32
      Unredirect Fullscreen Windows in CompizConfig

      Michael, next time you test Unity performance you should enable Unredirect Fullscreen Windows in CompizConfig Settings Manager and also disable Sync to VBlank. The first option should be a must-have for anyone who wants to run games under Unity.
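
      For anyone who prefers the command line, the same two options can (I believe) also be flipped through Compiz's gsettings backend instead of the CCSM GUI. A minimal sketch, assuming the usual Unity-profile schema paths and key names; double-check them with gsettings list-recursively before relying on this:

      Code:
      import subprocess

      # Assumed schema:path/key combinations for the Unity Compiz profile.
      settings = [
          ("org.compiz.composite:/org/compiz/profiles/unity/plugins/composite/",
           "unredirect-fullscreen-windows", "true"),   # let fullscreen games bypass the compositor
          ("org.compiz.opengl:/org/compiz/profiles/unity/plugins/opengl/",
           "sync-to-vblank", "false"),                 # turn off Sync to VBlank
      ]

      for schema_path, key, value in settings:
          subprocess.check_call(["gsettings", "set", schema_path, key, value])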



      • #33
        Why wasn't GNOME included in the test? Even if it's ****, who else but Ubuntu uses Unity? Canonical doesn't support KDE (Kubuntu) anymore, so it shouldn't matter, right?



        • #34
          Benchmarks for scene complexity and jitter while maintaining 72 Hz would be more interesting. One thing not showing up in these benchmarks is, of course, Windows' poorer jitter performance (dropped frames).
          I also tried the latest Ubuntu. The standard kernel is still not configured for low jitter, so benchmarks on it are completely uninteresting for games; it cannot even display 30 fps video without jitter. My low-latency-configured kernel can, though the browser video player should of course be driven by a screen refresh of 2-3x the video rate, to avoid all jitter.
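
          For what it's worth, a jitter-oriented benchmark could simply log per-frame timestamps and report the spread of the frame intervals plus the number of late frames. A rough sketch; the 72 Hz target and the 1.5x "late frame" threshold are just my assumptions for illustration:

          Code:
          import statistics

          TARGET_HZ = 72                  # assumed benchmark target refresh
          TARGET_MS = 1000.0 / TARGET_HZ  # ideal frame interval, ~13.9 ms

          def jitter_report(timestamps_ms):
              """Summarize frame-time jitter from per-frame timestamps in milliseconds."""
              intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
              late = sum(1 for dt in intervals if dt > 1.5 * TARGET_MS)  # counted as dropped/late frames
              return {
                  "mean_ms": statistics.mean(intervals),
                  "stdev_ms": statistics.stdev(intervals),  # the jitter: spread around the mean
                  "worst_ms": max(intervals),
                  "late_frames": late,
              }

          # Synthetic example: one frame arrives roughly two intervals late.
          print(jitter_report([0.0, 13.9, 27.8, 41.7, 70.5, 84.4, 98.3]))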

          Read also my post on LKML.

          Peace Be With You.



          • #35
            Originally posted by jrch2k8 View Post
            ... Doom3 is not a relevant test for an OS vs. OS comparison; if you actually put 3 neurons to work together you will see the reasons, like:

            * it is closed source, so you can't know for sure whether it uses the same render path in both OSes, or whether there is some other typical dirty hack to gain more FPS that is not present on the other OS
            * neither Xonotic nor (especially) Doom3 will bring any modern GPU to its knees
            ...
            if you want to reply, do it technically; I won't bother answering stupid replies
            Either you've been living under a rock for the past couple of years (...) or you are running for next year's Darwin award.
            1. Doom3 *is* GPL'ed. It was open sourced in **November** last year. [1]
            2. Doom3 *is* 8 (!!!!!) years old. Even ancient (relatively speaking) GPUs with open source drivers can run it close to 100 FPS [2] - let alone binary driver users (such as, err, this Win7 vs. Linux benchmark). Heck, it even runs on a smartphone [3].

            On a personal note, had you taken 5 minutes to do some *basic* fact checking before pressing the "send" button, you could have saved yourself a major embarrassment.

            - Gilboa
            [1] http://en.wikipedia.org/wiki/Id_Tech_4
            [2] http://www.phoronix.com/scan.php?pag...nux_mesa&num=4
            [3] http://www.phoronix.com/scan.php?pag...tem&px=MTE1ODg
            oVirt-HV1: Intel S2600C0, 2xE5-2658V2, 128GB, 8x2TB, 4x480GB SSD, GTX1080 (to-VM), Dell U3219Q, U2415, U2412M.
            oVirt-HV2: Intel S2400GP2, 2xE5-2448L, 120GB, 8x2TB, 4x480GB SSD, GTX730 (to-VM).
            oVirt-HV3: Gigabyte B85M-HD3, E3-1245V3, 32GB, 4x1TB, 2x480GB SSD, GTX980 (to-VM).
            Devel-2: Asus H110M-K, i5-6500, 16GB, 3x1TB + 128GB-SSD, F33.



            • #36
              The PTS profile runs the binary version, no? There is no way to know whether that was built from the GPL'd code.

              The point about binary drivers having hacks for specific executables is also valid.



              • #37
                Originally posted by curaga View Post
                The PTS profile runs the binary version, no? There is no way to know whether that was built from the GPL'd code.

                The point about binary drivers having hacks for specific executables is also valid.
                Please note that I only quoted a couple of grossly erroneous comments.
                BTW, why do you assume that nVidia's *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterpart? (Given the fact that they share the same code base.)
                ... And whether nVidia employs binary name hacks or not, why should I, as an end user, care?

                - Gilboa
                oVirt-HV1: Intel S2600C0, 2xE5-2658V2, 128GB, 8x2TB, 4x480GB SSD, GTX1080 (to-VM), Dell U3219Q, U2415, U2412M.
                oVirt-HV2: Intel S2400GP2, 2xE5-2448L, 120GB, 8x2TB, 4x480GB SSD, GTX730 (to-VM).
                oVirt-HV3: Gigabyte B85M-HD3, E3-1245V3, 32GB, 4x1TB, 2x480GB SSD, GTX980 (to-VM).
                Devel-2: Asus H110M-K, i5-6500, 16GB, 3x1TB + 128GB-SSD, F33.



                • #38
                  Doesn't matter

                  It doesn't matter whether it is closed or open source. What we want to see is how it performs under the best conditions. On a low-latency kernel, HL2 in Wine performs close to an optimized Windows XP install. Read also about optimizing WinXP here.
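
                  As a quick sanity check of whether the running kernel is such a low-latency build, one can inspect the shipped kernel config. A small sketch, assuming the usual Ubuntu /boot/config-<version> location and treating CONFIG_PREEMPT and CONFIG_HZ_1000 as the markers of interest:

                  Code:
                  import platform

                  # Typical Ubuntu location of the running kernel's build configuration.
                  config_path = "/boot/config-" + platform.release()

                  # Options commonly associated with the -lowlatency builds (my assumption).
                  wanted = ("CONFIG_PREEMPT=y", "CONFIG_HZ_1000=y")

                  with open(config_path) as cfg:
                      enabled = {line.strip() for line in cfg if not line.startswith("#")}

                  for option in wanted:
                      print(option, "->", "set" if option in enabled else "not set")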

                  Also, one needs to know whether there are variables that improve performance in the games themselves, like Doom 3's 60 fps limit.

                  I am also working on optimizing Ubuntu 12.04, and will make a post on my blog, and probably here later.

                  Then we actually compare the whole OS+code+compiler+whatever.

                  Peace Be With You.



                  • #39
                    You're right about those erroneous parts; I just wanted to point out the validity of other parts of it.

                    BTW, why do you assume that nVidia's *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterpart? (Given the fact that they share the same code base.)
                    Popularity. Even if the code for Doom3 is there in the Linux blob, it's likely only hooked up to act on doom3.exe, not doom3.bin or whatever it's named on Linux.

                    One counter-example is Q3; both of the Linux blobs have (had?) confirmed hacks for that.
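
                    Purely to illustrate what keying a hack on the executable name would mean, here is a conceptual sketch; the binary names and the /proc/self/exe lookup are my assumptions, and a real driver is native code that may identify applications very differently:

                    Code:
                    import os

                    # Hypothetical table of per-application "optimizations", keyed on executable name.
                    APP_HOOKS = {
                        "doom3.exe": "Windows Doom3 tweaks",   # Windows binary name
                        "doom3.bin": "Linux Doom3 tweaks",     # guessed Linux binary name
                    }

                    def active_hook():
                        """Return the tweak profile matching the current process, if any."""
                        exe = os.path.basename(os.readlink("/proc/self/exe"))  # name of the running binary
                        return APP_HOOKS.get(exe)  # None unless this exact name is in the table

                    print(active_hook())

                    If only the Windows name were in such a table, the Linux binary would simply never match, which is the point above.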

                    And whether nVidia employs binary name hacks or not, why should I, as an end user, care?
                    Because it doesn't represent the card's performance. It represents a hack.

                    Just like their "optimizing" for the 3DMarks. If they, for example, replace shaders on the fly with simpler ones that look nearly the same (note: nearly!), it's cheating, and it gives invalid data about the card's performance.

                    I.e., you expect it to show the GTX 460's performance for light effect X. What it shows is the performance for a grossly simplified but similar effect, unfairly skewing results and also giving a different picture than you would have gotten with the original shaders.


                    If you only run that one program, then you may like the result, if the picture is not terribly worse. But chances are you will run N different apps, M of which the blob does not hack. Even if they use the exact same shader, you will see worse performance from the less popular apps.



                    • #40
                      Originally posted by curaga View Post
                      Because it doesn't represent the card's performance. It represents a hack.
                      I *completely* disagree.
                      Newer nVidia releases detect Flash (under Linux) and presumably fix the blue-tint bug.
                      Is it a hack, or is it a fix? How can you tell the difference?

                      Just like their "optimizing" for the 3DMarks. If they, for example, replace shaders on the fly with simpler ones that look nearly the same (note: nearly!), it's cheating, and it gives invalid data about the card's performance.
                      If the visuals are the same, I couldn't care less.
                      If the visuals are not the same, I very much care.
                      As before (Flash), everything is in the details.

                      In the end, this argument is rather meaningless. Linux seems to perform just as well as Windows 7 - or better - with or without hacks.
                      There were no reports of different visuals, so per-application hacks weren't apparent. (Plus, I doubt that nVidia still has Doom3-targeted hacks, given the age of the game.)

                      - Gilboa
                      oVirt-HV1: Intel S2600C0, 2xE5-2658V2, 128GB, 8x2TB, 4x480GB SSD, GTX1080 (to-VM), Dell U3219Q, U2415, U2412M.
                      oVirt-HV2: Intel S2400GP2, 2xE5-2448L, 120GB, 8x2TB, 4x480GB SSD, GTX730 (to-VM).
                      oVirt-HV3: Gigabyte B85M-HD3, E3-1245V3, 32GB, 4x1TB, 2x480GB SSD, GTX980 (to-VM).
                      Devel-2: Asus H110M-K, i5-6500, 16GB, 3x1TB + 128GB-SSD, F33.

