We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux


  • #51
    Hm, weren't they supposed to be optimizing it? Yet the new version is 7 fps slower on a monster system.



    • #52
      Well, elanthis said OpenGL gets like 60% of the performance of DX, but when other folks provided benchmarks proving otherwise, others now claim those benchmarks can't be trusted. So I ask those people to provide benchmarks which can be trusted and which clearly show that OpenGL is at like 60% of DX. Please do.

      Also, elanthis needs to lay off the steroids and either get his head examined or get a life. Even the Unigine engine benchmarks show it's neither as slow nor as unstable as the catastrophic picture he's painting; if he were a girl, I'd think it was that time of the month. And his claim that OpenGL stopped evolving a decade ago (when just in the last 2-3 years there were GL 3.0, 3.1, 3.2, 3.3, 4.0 and 4.1) is just as weird, built on paranoia and self-defined meanings of words to prove his point that OpenGL doesn't evolve.

      I'm learning OpenGL 3.3, and coming from Java I can tell you that OpenGL's API is well thought out, and there isn't a ton of deprecated stuff in it the way there is in Java, so it's not as hopeless as he implies. I managed to create cool stuff in Java that my clients simply *LOVE*, so since OpenGL's API is even cleaner and meaner in this respect, with a lot less cruft, I can use it with C++ to create amazing stuff. The API itself is good enough and allows for creating stuff that runs pretty much as fast as DX.
      But elanthis keeps painting a catastrophic picture of OpenGL whenever anyone disagrees with his paranoid worldview (contradicted by the benchmarks and elsewhere).



      • #53
        Originally posted by cl333r View Post
        I'm learning OpenGL 3.3, and coming from Java I can tell you that OpenGL's API is well thought out, and there isn't a ton of deprecated stuff in it the way there is in Java, so it's not as hopeless as he implies.
        Doesn't it stand to reason that, as you are just learning an API, you might not know as much about it as people who have been using it for years? And while AMD may support that fancy new extension you are learning about, nVidia or Intel may not, and vice versa?



        • #54
          Originally posted by deanjo View Post
          And FYI, there is virtually no difference with Unigine Heaven running the 32-bit blob or the 64-bit blob or running it in a 64-bit or 32-bit environment.

          All in all, Q, your comments are meaningless and full of bullshit as well.
          No one cares about 120 fps on the max-fps side.

          But everyone cares about the min-fps rate, and the 64-bit build is faster on the min-fps side. Only the min fps counts, because any fps above 60 is invalid from a real-world, useful point of view.

          The 32-bit version only gains fps at invalid and useless points, meaning above 60 fps.

          7.7 vs 8.3 min fps:

          (8.3 - 7.7) / 7.7 ≈ 7.8%, which means the 64-bit version is 7.8% faster on the one point that matters.

          But you write it this way: "And FYI, there is virtually no difference with Unigine Heaven running the 32-bit blob or the 64-bit blob or running it in a 64-bit or 32-bit environment."

          If you are dumb, then the max fps at 100000 fps counts; if you are smart, then only the min fps counts, and only results up to 60 fps count, because that is the only range in an app/game where it starts stuttering and you can't play.

          A real-world example: if one person gets 24 fps and you get 7.8% less, you get 22.1 fps. The first person has cinema-smooth fps and you only get stuttering.
          Phantom circuit Sequence Reducer Dyslexia



          • #55
            Originally posted by deanjo View Post
            In fact here are some updated results using Heaven 2.5
            Total deviation between 4 tests (Win 7 DX11, Win 7 OpenGL, Linux 32 and Linux 64) is .5%, which falls well within negligible difference, since no 2 results will ever be identical because of items like background apps/services, spread-spectrum deviation, clock drift, etc., etc.
            And again, on Linux the 64-bit version wins on the min-fps side, and only the min fps matters; no one cares about 10000000000 fps on the max-fps side of a game.

            On Windows, 64-bit loses because of the 32-bit app.
            Phantom circuit Sequence Reducer Dyslexia



            • #56
              Pseudo-edit: I retract this part of my post: "On Windows, 64-bit loses because of the 32-bit app." I simply misread again...

              This Unigine 32-bit shit is driving me crazy.
              Phantom circuit Sequence Reducer Dyslexia



              • #57
                Originally posted by deanjo View Post
                In fact here are some updated results using Heaven 2.5
                Total deviation between 4 tests (Win 7 DX11, Win 7 OpenGL, Linux 32 and Linux 64) is .5%, which falls well within negligible difference, since no 2 results will ever be identical because of items like background apps/services, spread-spectrum deviation, clock drift, etc., etc.
                In general, it's nice of you to share your benchmarks here.
                And hey, I read your benchmarks differently.

                You say something like: the difference doesn't matter.

                But if you get 8 min fps on DirectX and 12 min fps on Linux, that truly matters.

                That means: YES, Linux 64-bit IS really faster than Windows + DirectX.
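
                To be clear about what that quoted ".5% total deviation" presumably means: a relative spread, (max - min) / mean, across the four scores. A minimal C++ sketch with made-up numbers (the actual scores aren't in this thread):

                #include <algorithm>
                #include <iostream>
                #include <numeric>
                #include <vector>

                int main() {
                    // Hypothetical average-fps scores: Win 7 DX11, Win 7 GL, Linux 32, Linux 64.
                    std::vector<double> scores = {60.1, 60.0, 59.9, 60.2};
                    double mean = std::accumulate(scores.begin(), scores.end(), 0.0) / scores.size();
                    auto [lo, hi] = std::minmax_element(scores.begin(), scores.end());
                    // Relative spread between best and worst run, as a percentage of the mean.
                    std::cout << 100.0 * (*hi - *lo) / mean << "%\n";  // ~0.5% for these numbers
                }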
                Phantom circuit Sequence Reducer Dyslexia



                • #58
                  Originally posted by Qaridarium View Post
                  But if you get 8 min fps on DirectX and 12 min fps on Linux, that truly matters.

                  No it doesn't, and here is why. A minimum is exactly that, a minimum, and does not reflect the overall play; that is what the average is for. If your minimum is 8 fps on one system and lasts for one second, it's no big deal; if your minimum is 12 fps but lasts 10 seconds, that is a huge deal, and the average FPS will reflect that flaw. This is why averages are used for meaningful benchmarks, not mins/maxes.
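
                  To illustrate that reasoning, a minimal C++ sketch with invented traces (not from any actual benchmark run): a one-second dip to 8 fps barely moves the average, while a ten-second dip to 12 fps drags it down hard.

                  #include <algorithm>
                  #include <iostream>
                  #include <numeric>
                  #include <vector>

                  // Print minimum and average of a per-second fps trace.
                  static void report(const char* name, const std::vector<double>& fps) {
                      double avg = std::accumulate(fps.begin(), fps.end(), 0.0) / fps.size();
                      double min = *std::min_element(fps.begin(), fps.end());
                      std::cout << name << ": min " << min << " fps, avg " << avg << " fps\n";
                  }

                  int main() {
                      // 60 seconds at a steady 60 fps, except for the dips.
                      std::vector<double> shortDip(60, 60.0), longDip(60, 60.0);
                      shortDip[30] = 8.0;                                // one second at 8 fps
                      for (int s = 25; s < 35; ++s) longDip[s] = 12.0;   // ten seconds at 12 fps
                      report("short dip to 8 fps", shortDip);            // min 8,  avg ~59.1
                      report("long dip to 12 fps", longDip);             // min 12, avg 52.0
                  }

                  The trace with the lower minimum ends up with the clearly better average, which is the point about averages reflecting how long a slowdown lasts.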



                  • #59
                    Originally posted by deanjo View Post
                    No it doesn't, and here is why. A minimum is exactly that, a minimum, and does not reflect the overall play; that is what the average is for. If your minimum is 8 fps on one system and lasts for one second, it's no big deal; if your minimum is 12 fps but lasts 10 seconds, that is a huge deal, and the average FPS will reflect that flaw. This is why averages are used for meaningful benchmarks, not mins/maxes.
                    You are completely wrong. If you have a card with massive microstuttering, meaning the fps counter jumps every 10 seconds from 1 fps up to 100000 fps and back down to 1 fps, you can't play, yet your average fps may still be around 60.

                    That may not be called stuttering, but it is microstuttering.

                    And this microstuttering is no fun.

                    Windows at 8 min fps has MORE microstuttering than Linux at 12 min fps.

                    Which means Linux is faster!
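
                    For what it's worth, average fps really can hide that. A minimal C++ sketch (my own made-up traces) where a smooth run and a spiky run average the same fps, but a percentile of the frame times exposes the stutter:

                    #include <algorithm>
                    #include <iostream>
                    #include <numeric>
                    #include <vector>

                    // Print average fps and the 99th-percentile frame time of a trace of
                    // per-frame durations in milliseconds.
                    static void report(const char* name, std::vector<double> frameMs) {
                        double totalMs = std::accumulate(frameMs.begin(), frameMs.end(), 0.0);
                        std::cout << name << ": avg " << 1000.0 * frameMs.size() / totalMs << " fps";
                        std::sort(frameMs.begin(), frameMs.end());
                        std::cout << ", 99th-pct frame " << frameMs[frameMs.size() * 99 / 100] << " ms\n";
                    }

                    int main() {
                        // Smooth: every frame takes ~16.7 ms (a steady 60 fps).
                        std::vector<double> smooth(600, 16.7);
                        // Stuttery: fast frames alternating with 30 ms spikes, same total time.
                        std::vector<double> stutter;
                        for (int i = 0; i < 300; ++i) { stutter.push_back(3.4); stutter.push_back(30.0); }
                        report("smooth ", smooth);    // avg ~60 fps, 99th-pct ~16.7 ms
                        report("stutter", stutter);   // avg ~60 fps, 99th-pct 30 ms
                    }

                    Both traces "average 60 fps", but only one is playable; that is why min fps or frame-time percentiles matter alongside the average.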
                    Phantom circuit Sequence Reducer Dyslexia



                    • #60
                      Originally posted by yogi_berra View Post
                      Doesn't it stand to reason that, as you are just learning an API, you might not know as much about it as people who have been using it for years? And while AMD may support that fancy new extension you are learning about, nVidia or Intel may not, and vice versa?
                      Interesting you mention that: vendor extensions are actually one of the greatest strengths and curses of OpenGL. A strength in that they're good for R&D and make it easy to add new stuff without waiting for the spec; a curse when you want to use one but have to write fallbacks because someone else's driver doesn't support it.
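
                      As a sketch of what that fallback dance looks like in core-profile GL 3.x (assuming an extension loader such as GLEW has been initialized and a context is current; the extension named here is just an arbitrary example):

                      #include <cstring>
                      #include <GL/glew.h>

                      // True if the current context advertises the named extension
                      // (the GL 3.x core-profile way: glGetStringi per extension).
                      bool hasExtension(const char* name) {
                          GLint count = 0;
                          glGetIntegerv(GL_NUM_EXTENSIONS, &count);
                          for (GLint i = 0; i < count; ++i) {
                              const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
                              if (ext && std::strcmp(ext, name) == 0) return true;
                          }
                          return false;
                      }

                      void chooseTexturePath() {
                          if (hasExtension("GL_ARB_texture_storage")) {
                              // fancy path: use the extension
                          } else {
                              // fallback path: classic glTexImage2D, because another
                              // vendor's driver may not expose the extension
                          }
                      }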

