
We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux


  • #61
    Originally posted by yogi_berra View Post
    Doesn't it stand to reason that, as you are just learning an API, you might not know as much about it as people who have used it, and that while AMD may support that fancy new extension you are learning about, NVIDIA or Intel do not, and vice versa?
    I mean the APIs themselves, not the adjacent extensions vendors put in anyway. The claim was that the GL API is horribly deprecated and full of cruft, to which I said that is not nearly the case. Want an API with hundreds, maybe thousands, of deprecated classes/methods? Look at Java. Want slow, weird stuff people used (and probably still use)? Look at X's old-fashioned means of drawing things. Those are relatively creaky, old-fashioned piles of cruft; compared to them, OpenGL is lean and mean and very fast (as mentioned, as fast as DX) and nowhere near as crappy as some paranoid folks imply.

    The real issue is of course the OpenGL bugs issue, which in recent years has improved a lot and is already in reasonably good shape with, e.g., NVIDIA 260.19.44. The GL bugs issue exists mostly because from the late '90s until about 2005 there was no real contender to Windows' dominance, so NVIDIA/ATI didn't bother much about cross-platform support. Now that the Mac keeps rising (e.g. Mac OS X 10.7 will feature at least GL 3.2), and now that Linux is king on mobile and even starting to shine on the desktop, both NVIDIA and AMD are clearly starting to pay more attention to GL. It will take a few years until we get close to ideal OpenGL drivers, but as I said, the NVIDIA drivers in particular are good enough already. The multi-threading story is not as important as some folks imply, since the calls get serialized anyway, so it's much more about design than about speed, and even the GL design is good enough (I actually find core GL 3.3 quite clean) to create sophisticated, cool stuff.


    We should persist with GL like we did with Linux. Had Linus said, "I quit developing/maintaining Linux because there's the much more mature BSD," Linux wouldn't be the king of mobile, (partially) of servers, and of supercomputers. Linux won only because people persisted, and only then did IBM and the like kick in. The same goes for OpenGL: we need to work around (temporary) bugs and keep creating quality software to move the Linux market share forward, which in turn will _force_ AMD/Intel/NVIDIA to make their drivers even better. Expecting a quick, easy victory is a bad attitude. Microsoft fights Linux and others by all dirty means; we must accept reality and the temporary but clearly improving state of GL graphics, which IMO will improve even faster in the near future.



    • #62
      Originally posted by Qaridarium View Post
      You are fully wrong. If you have a card with massive microstuttering, meaning the fps counter jumps every 10 seconds from 1 fps up to 100,000 fps and then back down to 1 fps, you can't play, even though your average fps is maybe 60 fps.

      This may not be called stuttering, but it is microstuttering.

      And this microstuttering is no fun.

      Windows with 8 min fps has MORE microstuttering than Linux with 12 min fps.

      That means Linux is faster!
      Sorry, but you are failing to realize what Min/Max means. A Min/Max is an extreme aberration. If I take 10 seconds of frame counts:

      1 fps
      4 fps
      6 fps
      5 fps
      5 fps
      7 fps
      3 fps
      5 fps
      6 fps
      10 fps
      ---------
      52 frames over 10 seconds Minimum being 1 maximum being 10
      Average frames per second = 52/10 = 5.2 Fps

      Now we go with the next 10 seconds:

      2 fps
      2 fps
      2 fps
      2 fps
      14 fps
      5 fps
      4 fps
      14 fps
      5 fps
      2 fps
      ------------
      52 frames over 10 seconds, minimum being 2, maximum being 14
      Average frames per second = 52 / 10 = 5.2 fps

      Now this is a very small sample (using an extended benchmark such as Unigine Heaven is going to lessen the Min/Max aberrations, as their impact lessens with the number of samples), and by your reasoning the better run is the second one, because its minimum is higher even though its slowdown lasts much longer. Reporting one sample of the Min/Max means absolutely nothing when benchmarking.

      It is basic statistical analysis 101.
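      As an editorial illustration of the arithmetic above, here is a minimal Python sketch (not part of any benchmark tool) that computes min, max and average for the two runs; both come out to exactly 5.2 fps average even though their minima and maxima differ.

      # Per-second frame counts copied from the two runs above.
      run_a = [1, 4, 6, 5, 5, 7, 3, 5, 6, 10]
      run_b = [2, 2, 2, 2, 14, 5, 4, 14, 5, 2]

      for name, run in (("run A", run_a), ("run B", run_b)):
          avg = sum(run) / len(run)  # total frames / total seconds
          print(name, "min:", min(run), "max:", max(run), "avg:", avg)

      # Both runs print avg 5.2; the min/max values alone say nothing
      # about how long the slow stretch lasted.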



      • #63
        Originally posted by deanjo View Post
        Sorry, but you are failing to realize what Min/Max means. A Min/Max is an extreme aberration. If I take 10 seconds of frame counts:

        1 fps
        4 fps
        6 fps
        5 fps
        5 fps
        7 fps
        3 fps
        5 fps
        6 fps
        10 fps
        ---------
        52 frames over 10 seconds Minimum being 1 maximum being 10
        Average frames per second = 52/10 = 5.2 Fps

        Now we go with the next 10 seconds:

        2 fps
        2 fps
        2 fps
        2 fps
        14 fps
        5 fps
        4 fps
        14 fps
        5 fps
        2 fps
        ------------
        52 frames over 10 seconds, minimum being 2, maximum being 14
        Average frames per second = 52 / 10 = 5.2 fps

        Now this is a very small sample (using an extended benchmark such as Unigine Heaven is going to lessen the Min/Max aberrations, as their impact lessens with the number of samples), and by your reasoning the better run is the second one, because its minimum is higher even though its slowdown lasts much longer. Reporting one sample of the Min/Max means absolutely nothing when benchmarking.

        It is basic statistical analysis 101.
        Hey, nice try, but you're doing it wrong, because in your benchmark the max fps is not 10, it's 120, and other benchmarks in PTS, like the Q3 benchmarks, hit 350+ fps.

        A REAL example of what I mean:

        1 fps
        1 fps
        1 fps
        1 fps
        1 fps
        1 fps
        1 fps
        1 fps
        1 fps
        1000 fps
        ------------
        1009 frames over 10 seconds, minimum being 1, maximum being 1000
        Average frames per second = 1009 / 10 = 100.9 fps

        That means in reality your average frames-per-second count is bullshit!
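        To make this example concrete, here is a minimal editorial Python sketch; the 30 fps "playable" cutoff is an assumption chosen purely for illustration.

        # Nine seconds at 1 fps followed by one second at 1000 fps,
        # as in the example above.
        run = [1] * 9 + [1000]

        avg = sum(run) / len(run)                 # 1009 / 10 = 100.9 fps
        slow = sum(1 for fps in run if fps < 30)  # assumed playability cutoff

        print("average:", avg, "fps")
        print("seconds below 30 fps:", slow, "of", len(run))
        # The average reads 100.9 fps, yet 9 of the 10 seconds are unplayable.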



        • #64
          Originally posted by deanjo View Post
          In fact, here are some updated results using Heaven 2.5.
          Total deviation between the 4 tests (Win 7 DX11, Win 7 OpenGL, Linux 32-bit and Linux 64-bit) is 0.5%, which falls well within negligible difference, since no two results will ever be identical because of things like background apps/services, spread-spectrum deviation, clock drift, etc.
          Wow, it seems like NVIDIA fixed some serious performance issues in their OpenGL driver in a newer Windows driver. On my rig, in OpenGL mode I get less than half of the DirectX 11 performance.
          My hardware spec differs, with an Intel quad-core CPU (c2q [email protected]).



          • #65
            Originally posted by Qaridarium View Post
            Hey, nice try, but you're doing it wrong, because in your benchmark the max fps is not 10, it's 120, and other benchmarks in PTS, like the Q3 benchmarks, hit 350+ fps.

            A REAL example of what I mean:

            1 fps
            1 fps
            1 fps
            1 fps
            1 fps
            1 fps
            1 fps
            1 fps
            1 fps
            1000 fps
            ------------
            1009 frames over 10 seconds, minimum being 1, maximum being 1000
            Average frames per second = 1009 / 10 = 100.9 fps

            That means in reality your average frames-per-second count is bullshit!
            Thank you for proving that Min/Max doesn't mean shit, as that could be just one frame at 1 fps and the minimum would be the same.

            1 fps
            10 fps
            10 fps
            10 fps
            10 fps
            10 fps
            10 fps
            10 fps
            10 fps
            1000 fps
            ----------
            1081 frames / 10 seconds = 108.1 fps average, which is still better performance no matter how you cut it, and the average shows that it is a better experience. (Not saying that the average is an accurate measure of performance, but it at least reflects a change; min/max doesn't.)
            Last edited by deanjo; 04-10-2011, 06:30 PM.



            • #66
              Originally posted by blacknova View Post
              Wow, it seems like NVIDIA fixed some serious performance issues in their OpenGL driver in a newer Windows driver. On my rig, in OpenGL mode I get less than half of the DirectX 11 performance.
              My hardware spec differs, with an Intel quad-core CPU (c2q [email protected]).
              There may have been a regression with one driver, but my first link shows that identical performance was already to be had with the 262.99 driver way back in November.



              • #67
                Originally posted by deanjo View Post
                Thank you for proving that Min/Max doesn't mean shit, as that could be just one frame at 1 fps and the minimum would be the same.
                1 fps
                10 fps
                10 fps
                10 fps
                10 fps
                10 fps
                10 fps
                10 fps
                10 fps
                1000 fps
                ----------
                1081 frames / 10 seconds = 108.1 fps average, which is still better performance no matter how you cut it, and the average shows that it is a better experience. (Not saying that the average is an accurate measure of performance, but it at least reflects a change; min/max doesn't.)
                In your example the average of 108.1 fps makes it look like the game runs well in the real world... but in the real world the game runs like shit!

                So now let's do a really good min-fps example; the min fps should be 60 fps.

                60 fps
                100 fps
                100 fps
                100 fps
                100 fps
                100 fps
                100 fps
                100 fps
                100 fps
                140 fps
                ----------
                1000 frames / 10 seconds Average 100 fps

                The average fps is lower than in your example, but because of the min fps the game runs very, very well in all situations.

                That means your average fps is a lie and only min fps matters.



                • #68
                  Originally posted by Qaridarium View Post
                  In your example the average of 108.1 fps makes it look like the game runs well in the real world... but in the real world the game runs like shit!

                  So now let's do a really good min-fps example; the min fps should be 60 fps.

                  60 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  140 fps
                  ----------
                  1000 frames / 10 seconds Average 100 fps

                  The average fps is lower than in your example, but because of the min fps the game runs very, very well in all situations.

                  That means your average fps is a lie and only min fps matters.
                  lol, you really have no clue


                  60 fps
                  60 fps
                  60 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  100 fps
                  140 fps
                  ----------
                  920 frames / 10 seconds = 92 fps average. The min and max are the same as in your example. What number reflects that your example is giving better performance?
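                  As a short editorial Python sketch of this comparison: both runs share the same min and max, and only the average distinguishes them.

                  # Per-second frame counts from the two examples above.
                  run_q = [60] + [100] * 8 + [140]      # 1000 frames, average 100 fps
                  run_d = [60] * 3 + [100] * 6 + [140]  # 920 frames, average 92 fps

                  for name, run in (("100-avg run", run_q), ("92-avg run", run_d)):
                      print(name, "min:", min(run), "max:", max(run),
                            "avg:", sum(run) / len(run))

                  # Both report min 60 / max 140; only the average shows which
                  # run actually delivered more frames.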



                  • #69
                    Originally posted by deanjo View Post
                    lol, you really have no clue


                    60 fps
                    60 fps
                    60 fps
                    100 fps
                    100 fps
                    100 fps
                    100 fps
                    100 fps
                    100 fps
                    140 fps
                    ----------
                    920 frames / 10 seconds = 92 fps average. The min and max are the same as in your example. What number reflects that your example is giving better performance?
                    The only true point is that the min-fps rate is the only thing that matters.

                    If you have a min-fps rate of 60, all is fine, no matter what the average or max fps rate is.

                    And this fact makes OpenGL/Linux a winner.

                    But yes, dumb people can also buy cards based on useless max-fps numbers or pointless average numbers.



                    • #70
                      Originally posted by Qaridarium View Post
                      The only true point is that the min-fps rate is the only thing that matters.
                      Lol, again, it isn't. Do you know if that minimum was a level loading? Do you know if that minimum was caused by an unrelated background app? Maybe it was an I/O bottleneck on another subsystem at that point in time.

                      Min/Max means nothing of value, as there is absolutely no indication of how long it lasts or what was happening on the rest of the system.


                      And this fact makes OpenGL/Linux a winner.
                      The only fact you can take from this is that NVIDIA's OpenGL implementation is just as good as their DX one.
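                      One way to make the duration point above concrete: a minimal editorial Python sketch that measures the longest consecutive stretch below a threshold, so a single hitch and a sustained stall no longer look identical (the 30 fps threshold and the sample traces are assumptions for illustration only).

                      def longest_slow_stretch(per_second_fps, threshold=30):
                          """Longest run of consecutive seconds below the threshold."""
                          longest = current = 0
                          for fps in per_second_fps:
                              current = current + 1 if fps < threshold else 0
                              longest = max(longest, current)
                          return longest

                      # Both traces report min = 1 fps, but only one is a sustained stall.
                      print(longest_slow_stretch([1, 60, 60, 60, 60, 60, 60, 60, 60, 60]))  # -> 1
                      print(longest_slow_stretch([1, 1, 1, 1, 1, 1, 1, 1, 1, 60]))          # -> 9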

