We need a Campaign for OpenGL Patent Exemptions for OSS, namely Mesa, and Linux


  • #31
    Originally posted by elanthis View Post
    And movies are not rendered using OpenGL.
    No, but their content is generated in apps powered by OpenGL, and those apps require a great deal of performance since, at the stage of conception, the meshes are MASSES of raw polygons in order to allow for as flexible a workflow as possible.

    Originally posted by elanthis View Post
    So why are you throwing a royal hissy fit because someone dares to claim (along with thousands of other developers) that OpenGL doesn't work just peachy-keen for a use case other than your own pet one?
    LOL, 'doesn't work just peachy-keen'? You are claiming that it's unusable crap.

    Originally posted by elanthis View Post
    Write some code, Khronos! Write samples! Write tools! Write a test suite!
    Khronos is, as far as I know, nothing but a consortium of industry players (including NVidia and ATI) who submit API suggestions for consideration/voting. As for an 'official' test suite, that would be Mesa afaik.

    Originally posted by elanthis View Post
    I get that you don't understand software engineering, but get this: there is absolutely no damn excuse for Khronos' incompetence and negligence with how it's handling OpenGL, or the ARB's incompetence and negligence before it.
    Oh, I've worked as a software engineer (or programmer, as it is usually called) for 8+ years, mainly C and C++ but also x86 assembly, scripting languages and some Java (uurk!). Although I must admit that I haven't worked in games or 3D for that matter (unless some patches to the Allegro game library ages ago count), I'd say I'm pretty well versed in software development.

    And you speak of Khronos again as if it were a single 'entity'; it is not. It's a group of separate members agreeing on an API specification, which is then implemented separately in drivers.

    Originally posted by elanthis View Post
    OpenGL has been mishandled and left to rot by its caretakers over and over again throughout its history. This is why it's in such an awful state today, and why so many developers absolutely freaking hate using the API.
    Again, OS X, Linux, and mobile devices (apart from Microsoft's offerings) rely on OpenGL; if it were as bad as you describe, we would have seen a cross-platform replacement by now.

    Originally posted by elanthis View Post
    NVIDIA's drivers are the best. By a long shot. Everybody knows that. And they still have a ton of bugs that their Direct3D drivers don't have.
    All these sweeping statements with nothing to back them up. Please point me to some objective comparisons regarding bugs in NVidia's OpenGL vs Direct3D drivers. Again, I can only speak for NVidia's OpenGL drivers, since the programs I use are OpenGL (although I test many of them on both Linux and Windows and I've seen no difference in stability or general performance).

    Originally posted by elanthis View Post
    But your modeler app works, so screw everyone else, right?
    Oh please, the ONLY things working in OpenGL are the 3D applications that I've used... man, I must be lucky. Everything else running OpenGL out there must be using software rendering or some fairy dust or something, because there's just no f****** way OpenGL could be working. I mean, all drivers except NVidia's (again, lucky me) are just big piles of bugs, and the API is so terrible that no programmer could actually create anything with it anyway. I mean, it's just so f****** bad.

    Comment


    • #32
      First of all, developers don't like APIs like DX11 or OGL4. They want to get rid of them. They want direct access to the hardware with a C-like language. If they get that, then their games will have much better graphics. For example, a 1-teraflop card with direct access will give performance like a 4-teraflop card with an API! If that language is the famous OpenCL, then they gain universal access to all hardware, as long as that hardware has an OpenCL driver. OpenCL programming does not need compilation for a specific CPU.

      Second, OpenGL now has the OpenRL library (open ray-tracing library): http://www.caustic.com/products.php | http://3dradar.techradar.com/3d-tech...les-15-12-2010 So OpenGL is now better than DirectX. A ray-traced 3D image may be 5-10 times better than a raster-only 3D image at the same flops, or equal at only 1/5 of the flops, for example. See NPG with the next PowerVR GPU and ray-tracing.

      Comment


      • #33
        I'll throw in some more comments here:
        The core OpenGL functionality of the big players (nvidia and ati/amd) is stable. It has to be. Most of the bugs that occur with desktop use are with handling outside the core (typically some X integration). By core of course, I mean defined spec. Both companies have had outlying issues on occasion, but they're about as rare as D3D issues now.

        From an API perspective, workstation graphics care a bit more about the geometry processing than shaders, if I recall correctly, and there are a few tweaks aimed at that.

        Driver quality aside, opengl and d3d are supposedly quite similar in terms of functionality these days (I'll repeat my disclaimer: I have no intention of doing anything that can't run on windows/linux/mac, so it's OpenGL or software rendering for me).

        I'm also going to note something else: neither one has hardware accelerated multi-threaded rendering. D3D does not have it - it's done in software. OpenGL doesn't define it, which is perhaps not good for gaming, but power users (programmers) do have greater control if they want to create their own. Always a tradeoff.
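
        For what it's worth, here's a rough sketch of what "create their own" can look like (mine, nothing official, and every name in it is invented): worker threads record commands into plain buffers, and only the thread that owns the GL context replays them.

        Code:
        #include <cstdio>
        #include <functional>
        #include <mutex>
        #include <queue>
        #include <thread>
        #include <vector>

        // One recorded command; in a real renderer each would wrap GL calls.
        using Command = std::function<void()>;
        using CommandBuffer = std::vector<Command>;

        std::mutex gQueueMutex;
        std::queue<CommandBuffer> gSubmitted;

        // Any worker thread may call this; no GL context is touched here.
        void submit(CommandBuffer buffer) {
            std::lock_guard<std::mutex> lock(gQueueMutex);
            gSubmitted.push(std::move(buffer));
        }

        // Only the thread that owns the GL context calls this; the actual
        // GL work happens serially, right here.
        void drainAndExecute() {
            for (;;) {
                CommandBuffer buffer;
                {
                    std::lock_guard<std::mutex> lock(gQueueMutex);
                    if (gSubmitted.empty()) return;
                    buffer = std::move(gSubmitted.front());
                    gSubmitted.pop();
                }
                for (auto& cmd : buffer) cmd();
            }
        }

        int main() {
            std::thread worker([] {
                CommandBuffer b;
                b.push_back([] { std::puts("glBindBuffer(...) goes here"); });
                b.push_back([] { std::puts("glDrawElements(...) goes here"); });
                submit(std::move(b));
            });
            worker.join();     // real code would synchronize per frame
            drainAndExecute(); // the "GL thread" replays everything in order
        }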

        I can't comment on OpenGL ES, btw, as I've nothing to play around with for that (not properly, anyway).

        All this is pointless banter anyway - OpenGL is the only viable option on Linux, unless you want software rendering. D3D is not going to happen, and unless you can start doing wonderful things with Fusion systems (quite the possibility, but again I haven't looked into programming with that architecture, though I'd like to one day) software rendering can't compete with hardware accelerated performance just yet.

        Comment


        • #34
          The whole mess with OpenGL 2 and 3 has to do with one company in Khronos (at the time) that tried to keep everything as it was and turned down every suggestion.
          That company was Microsoft. In 2006, or somewhere around that year, Microsoft left Khronos. Then work began on OpenGL 3.1 and so on, which went much quicker without Microsoft's interference.

          Comment


          • #35
            Originally posted by XorEaxEax View Post
            Again, OS X, Linux, and mobile devices (apart from Microsoft's offerings) rely on OpenGL; if it were as bad as you describe, we would have seen a cross-platform replacement by now.
            No we wouldn't.
            Ten years ago we also didn't have a function for saying we don't want our application using the graphics card. Only last year did they introduce a decent function that doesn't lock the whole card! And that's on Windows.

            As for Mesa's test suite: don't forget that Mesa is actually a third party in OpenGL. There should be an official test suite. Making one is quite a problem, because C and C++, the most widely used languages with OpenGL, don't have a standardized way of doing tests. The language D would be better for this, if only it had as much tooling as exists for C and C++ IDEs.

            Comment


            • #36
              Originally posted by plonoma View Post
              The whole mess with OpenGL 2 and 3 has to do with one company in Khronos (at the time) that tried to keep everything as it was and turned down every suggestion.
              That company was Microsoft. In 2006, or somewhere around that year, Microsoft left Khronos. Then work began on OpenGL 3.1 and so on, which went much quicker without Microsoft's interference.
              Somehow I'm not surprised.

              When I quit using Microsoft software, my productivity got a boost as well.

              Comment


              • #37
                Why can't the Khronos people be replaced with people willing to break ABI and really improve the spec to make it meet or beat DirectX? Shareholders do it to executives all the time in the business world. Why not here? Even if that did happen, it still wouldn't change the fact that software patents are there to block open source implementations.

                Comment


                • #38
                  Originally posted by Prescience500 View Post
                  Why can't the Khronos people be replaced with people willing to break ABI and really improve the spec to make it meet or beat DirectX? Shareholders do it to executives all the time in the business world. Why not here? Even if that did happen, it still wouldn't change the fact that software patents are there to block open source implementations.
                  Because Khronos is made up of the people (companies/groups) who make the hardware and implement the spec.

                  Comment


                  • #39
                    Originally posted by XorEaxEax View Post
                    Khronos is, as far as I know, nothing but a consortium of industry players (including NVidia and ATI) who submit API suggestions for consideration/voting. As for an 'official' test suite, that would be Mesa afaik.
                    So? Why does that mean they can't write code and test suites?

                    Again, OS X, Linux, and mobile devices (apart from Microsoft's offerings) rely on OpenGL; if it were as bad as you describe, we would have seen a cross-platform replacement by now.
                    Bullshit. It's taken Linux years to get a usable OpenGL implementation working. OS X has an even worse one.

                    Writing a new API takes effort and requires a level of knowledge and direct vendor access that some random hodgepodge of Open Source coders simply do not have.

                    And yes, Linux is suffering under OpenGL. You can't even get Compiz or Firefox to reliably use it, even on the proprietary drivers.

                    You're acting like everything is working just fine when this very site has posted countless articles about how every single fucking app that's tried to use OpenGL has run into numerous problems doing so, to the point of just disabling it half the damn time!

                    All these sweeping statements with nothing to back them up. Please point me to some objective comparisons regarding bugs in NVidia's OpenGL vs Direct3D drivers.
                    Why? You've ignored every link and example I've already given. Go do your own research and stop wasting my time.

                    Originally posted by mirv
                    The core OpenGL functionality of the big players (nvidia and ati/amd) is stable. It has to be. Most of the bugs that occur with desktop use are with handling outside the core (typically some X integration). By core of course, I mean defined spec. Both companies have had outlying issues on occasion, but they're about as rare as D3D issues now.
                    Sorry, this simply isn't true.

                    NVIDIA's own test programs can trigger bugs that cause mis-rendering with features as core (and essential to modern graphics) as FBOs. It's that bad.

                    No, despite Xor's idiotic binary-logic arguments, that doesn't imply that it's impossible to use FBOs at all, period. It just means that you can and often will run into totally weird bugs that sap away hours or days of your time while you try to sort out whether the platform is actually behaving sanely or if it's just a bug in your app, and then you spend even more time trying to figure out workarounds that maintain acceptable performance and don't trigger the bugs.
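
                    To illustrate, here's a minimal sketch of the sort of paranoid sanity checking this forces on you (my own sketch, not from any particular app; it assumes a current GL 3.x context and an already-initialized loader such as GLEW):

                    Code:
                    #include <GL/glew.h>
                    #include <cstdio>

                    // Create an FBO with one RGBA8 color attachment and sanity-check it.
                    GLuint createColorFbo(GLsizei w, GLsizei h, GLuint* outTex) {
                        GLuint fbo = 0, tex = 0;
                        glGenTextures(1, &tex);
                        glBindTexture(GL_TEXTURE_2D, tex);
                        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

                        glGenFramebuffers(1, &fbo);
                        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
                        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                               GL_TEXTURE_2D, tex, 0);

                        // The spec-defined way to ask whether the driver supports
                        // this combination at all.
                        GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
                        if (status != GL_FRAMEBUFFER_COMPLETE)
                            std::fprintf(stderr, "FBO incomplete: 0x%04x\n", status);

                        // But a "complete" FBO can still mis-render on a buggy
                        // driver, which is exactly the failure mode described above.
                        for (GLenum err; (err = glGetError()) != GL_NO_ERROR; )
                            std::fprintf(stderr, "GL error: 0x%04x\n", err);

                        *outTex = tex;
                        return fbo;
                    }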

                    And that's stupid. And that's why many developers don't even bother with OpenGL support anymore, because it's simply easier to use D3D and get 95% of the market for 5% of the engineering cost.

                    I'm also going to note something else: neither one has hardware accelerated multi-threaded rendering. D3D does not have it - it's done in software.
                    This isn't true, at least for D3D 11. Although possibly you're just stating something different from what you mean.

                    With D3D, what "multi-threaded rendering" means is that you can create and manage buffers on other threads, and that you can compose rendering commands to be submitted to the GPU in those threads. You can let your rendering code build up independent batches in each thread efficiently and then those can be submitted to the GPU (serially) by the main thread. OpenGL doesn't allow this because every OpenGL call uses a hidden magic context and because of the limitations that imposes on buffer mapping.
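
                    Roughly, the D3D11 side of that looks like the following (a hedged sketch; error handling is omitted, and the device plus the actual draw calls are assumed to exist elsewhere):

                    Code:
                    #include <d3d11.h>

                    // Worker thread: record commands on a deferred context,
                    // never touching the immediate context.
                    ID3D11CommandList* recordBatch(ID3D11Device* device) {
                        ID3D11DeviceContext* deferred = nullptr;
                        device->CreateDeferredContext(0, &deferred);

                        // ... set state and issue draw calls on `deferred` here ...

                        ID3D11CommandList* list = nullptr;
                        deferred->FinishCommandList(FALSE, &list); // bake the batch
                        deferred->Release();
                        return list;
                    }

                    // Main thread: submit everything to the GPU, serially and in order.
                    void submitBatches(ID3D11DeviceContext* immediate,
                                       ID3D11CommandList** lists, int count) {
                        for (int i = 0; i < count; ++i) {
                            immediate->ExecuteCommandList(lists[i], FALSE);
                            lists[i]->Release();
                        }
                    }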

                    To use a metaphor: D3D is kinda like Linux today, and OpenGL is like Linux 1.3 when the Big Kernel Lock was introduced. You simply can't do multi-threaded rendering in OpenGL without locking every GL call with a single global mutex, while D3D allows you to do a lot of the work completely independently.

                    The actual draw calls are pretty minor. Submitting those independently on each thread is unimportant, because the hardware serializes them anyway. The vast majority of the work in a modern renderer is filling up buffers with data, which is very time-consuming in a complex renderer. You have to completely serialize that in OpenGL.

                    Originally posted by artvision
                    First of all, developers don't like APIs like DX11 or OGL4. They want to get rid of them. They want direct access to the hardware with a C-like language. If they get that, then their games will have much better graphics.
                    Academics who sit around fantasizing about what hardware could someday be like may wish for what you describe. Those of us who actually write real code today want APIs that reflect what actual, real hardware already in consumers' hands can do.

                    GPUs still have a lot of "fixed function" hardware built in. The polygon rasterizer (as just one example) is entirely fixed function, and there's no possible way to write an equivalent in OpenCL that can perform anywhere near as fast. You have to write a vertex shader to feed vertices to the polygon rasterizer and a separate fragment program to process the individual fragments; trying to implement that middle step yourself just results in a massive drop in performance for absolutely no gain. Then there are things like the fixed-function hierarchical Z buffer, fixed-function alpha test, fixed-function scissor test, etc. No hardware implements those in a programmable fashion, and it's unlikely that any hardware is going to stop doing those fixed-function any time soon.
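
                    For example, the two programmable stages wrapped around the fixed-function rasterizer can be as small as this (a sketch only: GLSL 3.30 sources as C++ string literals, compile/link boilerplate omitted, and the mvp uniform is just an assumed example):

                    Code:
                    // Programmable stage 1: transform vertices, then hand them
                    // to the fixed-function rasterizer via gl_Position.
                    const char* vertexSrc = R"(
                        #version 330 core
                        layout(location = 0) in vec3 position;
                        uniform mat4 mvp;
                        void main() {
                            gl_Position = mvp * vec4(position, 1.0);
                        }
                    )";

                    // Programmable stage 2: runs once per fragment the rasterizer
                    // produced; the triangle setup and interpolation in between
                    // are fixed function.
                    const char* fragmentSrc = R"(
                        #version 330 core
                        out vec4 color;
                        void main() {
                            color = vec4(1.0, 0.5, 0.2, 1.0);
                        }
                    )";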

                    As far as the API for D3D11, or what OpenGL _should_ be, goes: there really isn't much of an API anymore.

                    Your ideal API basically consists of:

                    (1) Allocate buffers in GPU memory
                    (2) Upload compiled programs to GPU memory
                    (3) Create input/output stream configurations for GPU programs
                    (4) Run GPU programs with a particular stream configuration

                    There are a number of finer details of course (particularly around textures, which are a bit more complex than other kinds of memory buffers due to mipmapping, tiling, etc.), but that's pretty much it; a rough sketch of that interface is below.
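
                    Every name here is invented; it's just the four steps above written down as a C-style interface (declarations only, since the implementations would live in the driver):

                    Code:
                    #include <cstddef>
                    #include <cstdint>

                    struct GpuBuffer;    // opaque: a region of GPU memory
                    struct GpuProgram;   // opaque: a compiled GPU program
                    struct StreamConfig; // opaque: buffer-to-stream bindings

                    // (1) Allocate buffers in GPU memory.
                    GpuBuffer* gpuAlloc(std::size_t bytes);

                    // (2) Upload compiled programs to GPU memory.
                    GpuProgram* gpuLoadProgram(const std::uint8_t* binary,
                                               std::size_t size);

                    // (3) Create input/output stream configurations for GPU programs.
                    StreamConfig* gpuConfigureStreams(GpuProgram* program,
                                                      GpuBuffer* const* inputs,
                                                      int numInputs,
                                                      GpuBuffer* const* outputs,
                                                      int numOutputs);

                    // (4) Run GPU programs with a particular stream configuration.
                    void gpuDispatch(GpuProgram* program, StreamConfig* config,
                                     int workItems);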

                    Comment


                    • #40
                      The multi-threaded nature of D3D is entirely software based. It's hidden behind the implementation, sure, but it's still software based. Interaction with the video card is serial in nature, whichever API you're using.
                      That's not an argument for or against anything; I just wanted to make sure the point was understood.
                      Hmm, this thread has gone quite off-topic. I suppose that happens whenever OpenGL is mentioned (or <insert desktop environment>, too).

                      Comment
