We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux

  • #76
    Originally posted by deanjo View Post
    Lol again it isn't.
    There is no "LOL", and yes, it is - for sure!

    Originally posted by deanjo View Post
    Do you know if that minimum was during level loading? Do you know if that minimum was caused by an unrelated background app? Maybe it was an I/O bottleneck on another subsystem at that point in time.
    None of that matters. If level loading makes the system stutter, your hard drive/SSD/ramdisk is bad; if a background app makes the system stutter, the system is more than bad for gaming; and if there is an I/O bottleneck, the system is simply bad for gaming. All of that says one thing: only the minimum FPS matters.
    If a system has the best of the best hardware, the minimum FPS is high; if the system is crap, the minimum FPS is low; and if the hardware is the same, then only the software makes the difference.

    Which means Windows is bad.

    And again: only the minimum FPS matters. Everything else is bullshit!


    Originally posted by deanjo View Post
    The only fact you can take from this is Nvidia's openGL implementation is just as good as their DX.
    No, the OpenGL implementation is better, because it scores higher on the only thing that matters: the minimum FPS.

    There is nothing more to discuss.



    • #77
      Originally posted by Qaridarium View Post
      There is no "LOL", and yes, it is - for sure!

      None of that matters. If level loading makes the system stutter, your hard drive/SSD/ramdisk is bad; if a background app makes the system stutter, the system is more than bad for gaming; and if there is an I/O bottleneck, the system is simply bad for gaming. All of that says one thing: only the minimum FPS matters.
      If a system has the best of the best hardware, the minimum FPS is high; if the system is crap, the minimum FPS is low; and if the hardware is the same, then only the software makes the difference.

      Which means Windows is bad.

      And again: only the minimum FPS matters. Everything else is bullshit!

      No, the OpenGL implementation is better, because it scores higher on the only thing that matters: the minimum FPS.

      There is nothing more to discuss.
      No, it doesn't mean anything of the sort. It means that on one run there was some reason for a low. The mere fact that runs on the same system, on the same OS, produce different, non-identical results is a testament to that. It is so sad that you can't understand the obvious.



      • #78
        Here are two more runs Q. Same system, same drivers, same version of app.
        Does this now mean that Windows 7 / DX11 / 32-bit blobs are superior?

        Because according to your criteria, that would be the case. And if that is the case, how did you ever conclude that Linux 64 / OpenGL were superior?






        • #79
          Take a chill pill guys and/or girls.
          Minimum framerate does matter - it is important. That's just plain old engineering, statistics be damned. If it's getting low, they're going to want to know where and why, because there's no point working on a section of the driver that is already stable and speedy.

          @artvision:
          Generally, some of the more high-end developers might like lower-level, close-to-the-metal access, but they definitely don't want to program separate code paths for separate devices, so you'll still need some kind of API layer. If you're going by the words of a certain AMD employee, he clarified his position later on.



          • #80
            Originally posted by mirv View Post
            Take a chill pill guys and/or girls.
            Minimum framerate does matter - it is important. That's just plain old engineering, statistics be damned. If it's getting low, they're going to want to know where and why, because there's no point working on a section of the driver that is already stable and speedy.
            Only if it is replicable on a fairly consistent basis and noticeably affects gameplay. That is something that reporting a single low extreme does not tell you. You have to look at the frequency at which those low frame rates occur, and that will usually be reflected in the overall FPS.
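            A minimal sketch of the frequency view described here, assuming per-frame times are available; all numbers (the 16.7 ms baseline, the single 40 ms spike, the 30 FPS threshold) are invented for illustration:

```python
# Invented per-frame times in milliseconds from one benchmark run; the
# single 40 ms frame stands in for the kind of one-off dip under debate.
frame_times_ms = [16.7] * 995 + [40.0, 18.0, 17.0, 16.9, 16.8]

fps = [1000.0 / t for t in frame_times_ms]
min_fps = min(fps)
avg_fps = sum(fps) / len(fps)

# The frequency view: count how many frames actually fell below a
# playability threshold, rather than reporting only the single worst frame.
threshold = 30.0
slow_frames = sum(1 for f in fps if f < threshold)

print(f"min fps: {min_fps:.1f}")  # 25.0 - dragged down by one outlier frame
print(f"avg fps: {avg_fps:.1f}")
print(f"frames below {threshold:.0f} fps: {slow_frames} of {len(fps)}")
```

            One frame out of a thousand produces a dramatic-looking minimum while leaving the average essentially untouched, which is why the two numbers argue past each other.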



            • #81
              Originally posted by deanjo View Post
              Only if it is replicable on a fairly consistent basis and noticeably affects gameplay. That is something that reporting a single low extreme does not tell you. You have to look at the frequency at which those low frame rates occur, and that will usually be reflected in the overall FPS.
              Repeated occurrence at a particular point, gameplay or not, is important. What affects one game slightly may affect another rather more - or, worse for these companies, it could be something affecting their workstation software targets.
              I'd also say that low variation in framerate matters more for gameplay than anything else, but that's getting too far off onto a tangent (much like the whole thread, I suppose).
              So to bring things back a little: what if that minimum framerate only occurred because they couldn't use floating-point buffers, or S3TC, or something similar?



              • #82
                Originally posted by mirv View Post
                So to bring things back a little: what if that minimum framerate only occurred because they couldn't use floating-point buffers, or S3TC, or something similar?
                Then the benchmark would ideally return a fairly consistent result. The fact that this does not happen suggests that the slowdowns are induced by an outside influence, making the min FPS result useless.
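                A rough sketch of the reproducibility test implied here: if the dip came from the workload itself, the minimum should recur run after run. The run figures below are invented for illustration:

```python
# Hypothetical min/avg fps figures from three runs of the same benchmark
# on the same machine (invented numbers, not real measurements).
runs = [
    {"min": 24.1, "avg": 60.2},
    {"min": 41.5, "avg": 60.0},
    {"min": 39.8, "avg": 59.7},
]

def spread(values):
    """Max-minus-min spread of a list of numbers."""
    return max(values) - min(values)

min_spread = spread([r["min"] for r in runs])
avg_spread = spread([r["avg"] for r in runs])

# A large spread in the minimum with a tiny spread in the average is the
# pattern being argued: the low outlier does not reproduce, so it likely
# came from outside the benchmark workload.
print(f"min-fps spread: {min_spread:.1f}, avg-fps spread: {avg_spread:.1f}")
```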



                • #83
                  Originally posted by deanjo View Post
                  Then the benchmark would ideally return a fairly consistent result. The fact that this does not happen suggests that the slowdowns are induced by an outside influence, making the min FPS result useless.
                  Fairly consistent only if you're aware of what's happening at the time of the slowdown; otherwise it might merely seem inconsistent. Case in point (and bringing this back on topic, no matter how people try to diverge!): I once had sub-image replacement corruption that to all intents and purposes appeared random, and would have stayed that way had anyone else looked at it. The problem was that the S3TC texture blocks had to be replaced in a specific order; with certain data, the result was otherwise corrupted. A driver bug, actually, from several years back and long since fixed, and not something I would have hit except that I was using the drivers to convert images to S3TC format.
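                  One concrete constraint behind bugs like this, shown as a sketch rather than the actual driver code: S3TC/DXT data is stored in 4x4-pixel blocks, so a compressed sub-image update has to land on the block grid. The helper below is hypothetical and simplified (real GL relaxes the size rule at the right/bottom edge of a texture):

```python
# S3TC/DXT textures are compressed in fixed 4x4 pixel blocks, so a
# compressed sub-image upload (e.g. via glCompressedTexSubImage2D) must
# use block-aligned offsets and sizes, or it is invalid and can corrupt
# neighbouring blocks. Simplified: real GL permits non-multiple-of-4
# sizes only at the right/bottom edge of the texture.
BLOCK = 4

def subimage_update_ok(xoffset, yoffset, width, height, tex_w, tex_h):
    aligned = all(v % BLOCK == 0 for v in (xoffset, yoffset, width, height))
    in_bounds = xoffset + width <= tex_w and yoffset + height <= tex_h
    return aligned and in_bounds

print(subimage_update_ok(0, 0, 64, 64, 256, 256))  # True: on the block grid
print(subimage_update_ok(2, 0, 64, 64, 256, 256))  # False: misaligned offset
```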



                  • #84
                    Originally posted by mirv View Post
                    Fairly consistent only if you're aware of what's happening at the time of the slowdown; otherwise it might merely seem inconsistent. Case in point (and bringing this back on topic, no matter how people try to diverge!): I once had sub-image replacement corruption that to all intents and purposes appeared random, and would have stayed that way had anyone else looked at it. The problem was that the S3TC texture blocks had to be replaced in a specific order; with certain data, the result was otherwise corrupted. A driver bug, actually, from several years back and long since fixed, and not something I would have hit except that I was using the drivers to convert images to S3TC format.
                    You still need to see something like a histogram before you can draw any kind of educated conclusion. Otherwise it is like hearing that Team A scored a goal and then deciding whether they won without knowing the other team's score. With your issue, if the data it was processing on the same run was identical, then the results should have been essentially identical. The Heaven test does the same calculations in the same order; the items it has to render are identical each and every run.
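                    A minimal sketch of the histogram idea, again with invented frame times; bucketing frames by instantaneous FPS shows where a run actually spent its time instead of collapsing it to a single minimum:

```python
from collections import Counter

# Invented per-frame times (ms); replace with a real frame-time capture.
frame_times_ms = [16.7] * 120 + [25.0] * 8 + [40.0]

# Bucket each frame by its instantaneous fps into 10-fps-wide bins.
histogram = Counter((int(1000.0 / t) // 10) * 10 for t in frame_times_ms)

for low in sorted(histogram):
    print(f"{low:3d}-{low + 9:3d} fps: {histogram[low]} frames")
```

                    Here the lone 40 ms frame shows up as a single count in the 20-29 FPS bin, which is very different information from "min FPS: 25".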



                    • #85
                      Originally posted by deanjo View Post
                      You still need to see something like a histogram before you can draw any kind of educated conclusion. Otherwise it is like hearing that Team A scored a goal and then deciding whether they won without knowing the other team's score. With your issue, if the data it was processing on the same run was identical, then the results should have been essentially identical. The Heaven test does the same calculations in the same order; the items it has to render are identical each and every run.
                      My point is that you can't simply say that the min FPS is useless - not from that benchmark, nor from any other - and I pointed out why. If you'd done any serious performance programming, you would know this.



                      • #86
                        Originally posted by mirv View Post
                        My point is that you can't simply say that the min FPS is useless - not from that benchmark, nor from any other - and I pointed out why. If you'd done any serious performance programming, you would know this.
                        That number quoted up there is useless, as it gives no real indication of when, for how long, or why it happened. It doesn't even isolate it as an application issue. The fact that it cannot be replicated on a steady basis while running a "canned app" suggests that the dip came from an outside influence. A single low-point FPS does not tell you anything meaningful. For all you know, that dip occurred while the timer had started but the level was still loading.


                        PS: I would say "realtime financial analysis software development" qualifies as "serious performance programming".



                        • #87
                          Originally posted by deanjo View Post
                          That number quoted up there is useless, as it gives no real indication of when, for how long, or why it happened. It doesn't even isolate it as an application issue. The fact that it cannot be replicated on a steady basis while running a "canned app" suggests that the dip came from an outside influence. A single low-point FPS does not tell you anything meaningful. For all you know, that dip occurred while the timer had started but the level was still loading.


                          PS: I would say "realtime financial analysis software development" qualifies as "serious performance programming".
                          Then you've just told me that they should wait a couple of seconds before starting the benchmark timer, to get better numbers. Either way, the min FPS just gave you information.



                          • #88
                            Originally posted by deanjo View Post
                            Here are two more runs Q. Same system, same drivers, same version of app.
                            Does this now mean that Windows 7 / DX11 / 32-bit blobs are superior?
                            Because according to your criteria, that would be the case. And if that is the case, how did you ever conclude that Linux 64 / OpenGL were superior?

                            On that run, OpenGL clearly loses.

                            But yes, that's exactly what I said in my first post: you are making a statement based on your own (possibly faked) benchmark.

                            You need other people to reproduce your results.



                            • #89
                              Originally posted by deanjo View Post
                              PS I would say "realtime financial analysis software development" would qualify as "any serious performance programming".
                              No, not really. Low-level driver programming is simply a different thing from your high-level business software programming.

                              And no, I'm not putting you down; it's just a different kind of skill.

                              Sorry.



                              • #90
                                Originally posted by Qaridarium View Post
                                No, not really. Low-level driver programming is simply a different thing from your high-level business software programming.

                                And no, I'm not putting you down; it's just a different kind of skill.

                                Sorry.
                                Lol, I programmed drivers for the better part of a decade.

