Announcement

Collapse
No announcement yet.

OpenGL 3.2 Specification Officially Released

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

  • #21
    OpenGL does not have to follow Java's brain damaged philosophies

    Comment


    • #22
      Originally posted by RealNC View Post
      OpenGL does not have to follow Java's brain damaged philosophies
      Why is it brain-dead to throw an exception instead of crashing? You can recover from an exception.

      If OpenGL were "sanitary" then it could be used in sandboxes for web applets and such. In its current state you have to wrap it up or else it is just a big security hole.

      Comment


      • #23
        Originally posted by RealNC View Post
        OpenGL does not have to follow Java's brain damaged philosophies
        Why is it brain-dead to throw an exception instead of crashing? You can recover from an exception.

        If OpenGL were "sanitary" then it could be used in sandboxes for web applets and such. In its current state you have to wrap it up or else it is just a big security hole.

        It's not even just java. It is the same situation for Ruby or Perl or Scheme or python or any other interpreted language.

        Comment


        • #24
          Have you ever profiled a 3-D app? Even with a fancy high-end card, your program spends the vast majority of its cycles inside of OpenGL calls. This means that you can write your app in a slow interpreted language and it will really not slow things down much at all.

          3-D apps are all about look and it is very subjective, so you do a lot of fussing with the properties of objects to get them to look "right". This means lots of recompiling if you write in C or C++. If you work with an interpreted language you can quickly fix up the look and the action to the way you want. And THEN you can port it to C or C++ if you really want the speed.

          "When writing a program, you should plan to throw the first version away, because you inevitably will, whether you planned to or not" - Gerald Sussman, inventor of Scheme and otherwise really smart guy.

          If one follows his advice then one should always prototype in a nice, easy to work with language. There are enough headaches with development of new code, why give yourlself the extra burden of worrying about pointers and memory allocation when you should be focused on how your game is going to play or how your visualization is going to show the tumor cells?

          The problem with OpenGL is, even your prototype app will dump core if you mess up, and you will still find yourself in gdb looking at stack traces even though you made an effort to keep your head out of the bits and bytes.
          Last edited by frantaylor; 04 August 2009, 11:58 PM.

          Comment


          • #25
            OpenGL is very low level for performance reasons - it doesn't suit well to make it work directly with a higher level language, and that's why wrappers / bindings exist for it. OpenGL will return error values however, and these can be checked easily enough. If the error values are not properly reported, it's not the fault of OpenGL, but rather the implementation of it.
            Interpreted languages also do an awful lot of error checking internally, so wrappers could do that quite easily as well.

            Comment


            • #26
              Originally posted by Kano View Post
              Why is it a mistake to use the max out of existing hardware? ATI is free to add the same extensions.
              It's not as long as you're mentally prepared to do things the Right Way (tm) and do distinct code paths for all vendors - read: Intel, nVidia, ATi, etc. (Like Microsoft's DirectX afaik does; Wine's devs seem to adamantly believe that they can do some uniform solution which is obviously wrong - from what I've heard they seem to believe it should be enough they just do the nVidia codepath and it's on the responsibility of driver vendors to port their drivers. This is very likely to give you reduced performance with non-nVidia hardware even if the driver implementation is as good as with nVidia. Of course, who wants to do three - or five if we count open ATi and nVidia drivers - times as much work?)
              Last edited by nanonyme; 05 August 2009, 07:54 AM.

              Comment


              • #27
                I am sure that will be done in time for fglrx with opengl 3.2 support. As NV provides now already those test drivers wine could adopt that new codepath and fglrx users will be happy when ATI manages to provide that too. Currently it does not matter in which way the functions are used as they run only on NV hardware.

                Comment

                Working...
                X