
We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux


  • We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux

    I've heard all sorts of discussion about open drivers being blocked by patents on parts of OpenGL. One thing I haven't heard about is any effort to get patent exemptions for open source drivers on open source operating systems. We should make it clear to patent holders that we will never be able to make money from this "niche," and that we will never buy or pay for the rights to use it, so there is no gain for them in holding us back. Maybe we could also articulate some benefits to them. Any thoughts?

  • deanjo
    replied
    Originally posted by Qaridarium View Post
    On that run OpenGL clearly loses.

    But yes, that's what I was saying in my first post: you're making a statement based on your own (faked) benchmark.
    Nothing faked about any of the benchmarks at all; just pure fact.

    You need other people to reproduce your results.
    No problem, there are a few other GTX 580s here that can easily replicate the results.
    Last edited by deanjo; 04-11-2011, 06:58 PM.



  • deanjo
    replied
    Originally posted by Qaridarium View Post
    No, not really. Low-level driver programming is just a different kind of work from your high-level business-software programming.

    And no, I'm not putting you down; it's just a different kind of skill.

    Sorry.
    Lol, I programmed drivers for the better part of a decade.



  • Qaridarium
    replied
    Originally posted by deanjo View Post
    PS I would say "realtime financial analysis software development" would qualify as "any serious performance programming".
    No, not really. Low-level driver programming is just a different kind of work from your high-level business-software programming.

    And no, I'm not putting you down; it's just a different kind of skill.

    Sorry.



  • Qaridarium
    replied
    Originally posted by deanjo View Post
    Here are two more runs, Q. Same system, same drivers, same version of the app. Does this now mean that Windows 7 / DX11 / 32-bit blobs are superior? Because according to your criteria that would be the case. If so, how did you ever conclude that Linux 64 / OpenGL were superior?

    On that run OpenGL clearly loses.

    But yes, that's what I was saying in my first post: you're making a statement based on your own (faked) benchmark.

    You need other people to reproduce your results.



  • mirv
    replied
    Originally posted by deanjo View Post
    That number quoted up there is useless, as it gives no real indication of when, for how long, or why it happened. It doesn't even isolate it to being an application issue. The fact that it cannot be replicated on a steady basis running a "canned app" suggests that the dip came from an outside influence. A single low-point fps does not tell you anything meaningful. For all you know, that dip occurred while the timer had started and the level was still loading.


    PS I would say "realtime financial analysis software development" would qualify as "any serious performance programming".
    Then you've just told me that they should wait a couple of seconds before starting the benchmark to get better numbers. Either way, the min fps just gave you information.
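
    The warm-up point above can be sketched in a few lines (hypothetical frame times, not from the runs discussed in this thread): a single loading hitch at the start of a run dominates the reported minimum, and discarding a few warm-up frames changes the number entirely.

    ```python
    # Hypothetical frame times in ms: the first three frames are slow
    # because the level is still loading, the rest are steady with one dip.
    frame_times_ms = [120.0, 95.0, 80.0] + [16.7] * 300 + [22.0, 18.5]

    def min_fps(times_ms, warmup_frames=0):
        """Minimum instantaneous fps, optionally skipping warm-up frames."""
        steady = times_ms[warmup_frames:]
        return 1000.0 / max(steady)  # slowest frame -> lowest fps

    raw = min_fps(frame_times_ms)        # loading hitch included: ~8.3 fps
    trimmed = min_fps(frame_times_ms, 3) # first 3 frames skipped: ~45.5 fps
    print(round(raw, 1), round(trimmed, 1))
    ```

    The same benchmark run yields two very different "min fps" numbers depending on whether the loading phase is counted, which is exactly why the single minimum needs context.
    
    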



  • deanjo
    replied
    Originally posted by mirv View Post
    My point is that you can't simply say that the min fps is useless - not from that benchmark, nor from any other, and I pointed out why. If you'd done any serious performance programming, you would know this.
    That number quoted up there is useless, as it gives no real indication of when, for how long, or why it happened. It doesn't even isolate it to being an application issue. The fact that it cannot be replicated on a steady basis running a "canned app" suggests that the dip came from an outside influence. A single low-point fps does not tell you anything meaningful. For all you know, that dip occurred while the timer had started and the level was still loading.


    PS I would say "realtime financial analysis software development" would qualify as "any serious performance programming".



  • mirv
    replied
    Originally posted by deanjo View Post
    You still need to see something like a histogram before you can draw any kind of educated conclusion. Otherwise it is like hearing that Team A scored a goal and then deciding whether they won without knowing the other team's score. With your issue, if the data being processed was identical on the same run, then the results should be relatively identical. The Heaven test does the same calculations in the same order; the items it has to render are identical each and every run.
    My point is that you can't simply say that the min fps is useless - not from that benchmark, nor from any other, and I pointed out why. If you'd done any serious performance programming, you would know this.



  • deanjo
    replied
    Originally posted by mirv View Post
    Fairly consistent only if you're aware of what's happening at the time of the slowdown; otherwise it might just seem inconsistent. Case in point (and bringing this back on topic, no matter how people try to diverge!): I had sub-image replacement corruption that for all intents and purposes appeared random, and would have seemed so to anyone else who looked at it. The problem was that the s3tc texture blocks had to be replaced in a specific order, or with certain data the result was corrupted. A driver bug, actually, from several years back and long since fixed, and not something I would have hit except that I was using the drivers to convert images to s3tc format.
    You still need to see something like a histogram before you can draw any kind of educated conclusion. Otherwise it is like hearing that Team A scored a goal and then deciding whether they won without knowing the other team's score. With your issue, if the data being processed was identical on the same run, then the results should be relatively identical. The Heaven test does the same calculations in the same order; the items it has to render are identical each and every run.
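
    The histogram point can be sketched as follows (hypothetical frame times, not measurements from this thread): a single outlier frame fixes the min fps at one value, while a coarse histogram or a low percentile shows whether slow frames were a one-off hitch or a recurring pattern.

    ```python
    # Hypothetical run: 500 steady frames at 16.7 ms plus one 100 ms hitch.
    frame_times_ms = [16.7] * 500 + [100.0]

    def fps_stats(times_ms):
        """Summarize a run: absolute min fps and the fps of the slowest ~1% frame."""
        fps = sorted(1000.0 / t for t in times_ms)       # ascending: worst first
        p1_index = max(0, int(0.01 * len(fps)) - 1)      # boundary of slowest ~1%
        return {"min_fps": fps[0], "p1_fps": fps[p1_index]}

    def histogram(times_ms, bucket_ms=5.0):
        """Coarse frame-time histogram: bucket start in ms -> frame count."""
        counts = {}
        for t in times_ms:
            bucket = int(t // bucket_ms) * bucket_ms
            counts[bucket] = counts.get(bucket, 0) + 1
        return dict(sorted(counts.items()))

    print(fps_stats(frame_times_ms))   # min fps is 10, but the 1% fps is ~59.9
    print(histogram(frame_times_ms))   # 500 frames in one bucket, 1 outlier
    ```

    The min fps alone reads as "10 fps", while the histogram shows the run was steady except for a single frame, which is the distinction being argued over here.
    
    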



  • mirv
    replied
    Originally posted by deanjo View Post
    Then the benchmark ideally would return a fairly consistent result; the fact that this does not happen suggests that the slowdowns are induced by an outside influence, making the min fps result useless.
    Fairly consistent only if you're aware of what's happening at the time of the slowdown; otherwise it might just seem inconsistent. Case in point (and bringing this back on topic, no matter how people try to diverge!): I had sub-image replacement corruption that for all intents and purposes appeared random, and would have seemed so to anyone else who looked at it. The problem was that the s3tc texture blocks had to be replaced in a specific order, or with certain data the result was corrupted. A driver bug, actually, from several years back and long since fixed, and not something I would have hit except that I was using the drivers to convert images to s3tc format.

