The Most Important Project Since Mesa 1.0?


  • #46
    Originally posted by Luke_Wolf:
    got a link to that letter to the court? This is the first I've heard of it.
    http://cdn.arstechnica.net/wp-conten...acle-Brief.pdf



    • #47
      Originally posted by IanS:
      Part of the reason D3D is as popular as it is comes down to a lot of game programmers drinking the "C++ is the only serious language for game development" Kool-Aid, and D3D is in C++, which makes sense to them, while OpenGL is written in C, which tends to take them out of their comfort zone. There are other reasons too, including D3D having better documentation in general and more plentiful, up-to-date learning material. Microsoft are better at marketing and FUD-spreading than those in the OpenGL camp, which is also a huge factor. As to performance, though, Valve and others have shown that OpenGL definitely has an edge over D3D.
      If you were arguing for managed languages and things like .NET and Python I might let this go, but you're arguing for C, and going completely off the deep end. Even ignoring the project-management benefits of having Object Oriented design patterns built into the language, C has no concept of generics or templates. That means every time you want a basic data structure like a stack, unless you plan on doing a lot of preprocessor magic, you have to reimplement it from scratch for each element type. The result is more lines of code and more reimplementations of the same things, which means more room for errors.

      As well, C has no concept of strings; instead it deals in character arrays, which are painful at best and outright dangerous at worst, because any time you work directly with arrays you set up the potential for a bad memory access. With C++ you never have to work with raw arrays if you don't want to; in fact the recommended practice is to use vectors even where you need array-like performance. And this is where managed languages such as Java or C# do have an advantage for the programmer: they remove the need to call delete because they're garbage collected (note that you can strap a garbage collector onto your C++ programs), which eliminates referencing memory after it has been freed and thus prevents that class of invalid memory access; this property is part of what is called "type safety".

      That said, the garbage collector can carry a performance hit, although generational garbage collectors minimize it, and an IL/bytecode interpreter might not optimize as aggressively as it could when JITing, costing some performance. The reverse can also be true for binary-distributed software, since the bytecode can then be tuned to the user's specific hardware by the JITer (I will grant, however, that I've never seen this done in the wild). Obviously, if you have the source available, you can have the compiler optimize native code exactly for your hardware. Another benefit of managed languages is that they compile quickly, so the developer spends less time waiting on the compiler.

      Now, on the other paw, outside of cases where your compiler's implementation of libc++ is crap, there is no performance advantage to going with C over C++; and even where the implementation is crap, the difference is basically negligible, because not losing performance relative to C was one of Bjarne Stroustrup's core design goals. As far as the OpenGL vs DirectX speed difference goes, one data point proves nothing, and we can just look at WoW to see results that are quite the opposite. Besides, I wouldn't consider Valve a neutral party at this point, given their interest in Linux.



      • #48
        Originally posted by Luke_Wolf:
        // Going off on a tangent...
        Lol, having trouble with your reading comprehension? Where in my post did I say that I advocate using C? I stated the fact that D3D is written in C++ and OpenGL in C. Most game programmers who decided to jump straight into C++ because they heard it's what the pros use never bothered to learn C, and since, as you have pointed out, C++ and C are totally different beasts, most of those typical C++ game programmers will have issues moving to, learning, or using OpenGL directly if they don't know C. This naturally leads to a bias against OpenGL and to the comments about D3D just making more sense and being easier to pick up (though this is also influenced by the better documentation and the prolific availability of decent learning material).

        I do agree that C++ is a useful language and it has some definite advantages, particularly if you are using the latest standard, but it is an ugly language and I tend to hate to read code written in it.



        • #49
          Originally posted by IanS:
          Part of the reason D3D is as popular as it is comes down to a lot of game programmers drinking the "C++ is the only serious language for game development" Kool-Aid, and D3D is in C++, which makes sense to them, while OpenGL is written in C, which tends to take them out of their comfort zone.
          You have never written a large game in your life, have you? Or talked to a real game developer?

          Your presumption that C is what makes OpenGL unpalatable is also silly, even ignoring your C++ presumption. Even as pure C APIs go, _OpenGL is a bad one_. No sane C programmer (even one who hates C++ with a passion) would design an API that looks anything remotely like OpenGL. Magic hidden globals? Integer IDs for distinct opaque handles? That bind-and-latch system? A shading language that from 3.0 until 4.3 didn't even reflect how the hardware and drivers actually work? No proper error handling until 4.3? You'd get (or should get) fired for designing crap like that.

          Microsoft are better at marketing and FUD spreading than those in the OpenGL camp which is also a huge factor.
          Saying "FUD" is the new FUD. Using it as a way to discredit the other side of an argument doesn't make a solid point; it just makes you look idiotic. Just a friendly note.

          As to performance, Valve and others have shown that OpenGL definitely has an edge over D3D though.
          "Others" is something you're possibly making up here; who knows, you're not presenting any evidence. Plenty of other engines, including Phoronix's beloved Unigine, have D3D coming out on top by rather wide margins in many benchmarks. The following link (see, presenting facts is fun!) was written by one of the most prolific OpenGL developers and evangelists around, and even he has to admit the facts: http://www.g-truc.net/post-0547.html "Direct3D 11 reflects better how today's hardware works" is a quote from the article, and plain damn fact if you even remotely know how hardware or these graphics APIs function. Also check out the Gallium3D design docs and their repeated references to D3D10. I've never seen any source besides the Valve claims that shows GL coming out as a clear winner in a modern graphics engine; please link some if you have them, though.

          D3D drivers _must_ use WDDM (a driver indirection layer, basically a micro-kernel-ish system just for graphics drivers in Windows), while GL drivers are only forced to use it for basic device initialization. As an example of WDDM's usefulness: there is a bug in Borderlands 2's latest DLC that causes some AMD cards to crash a lot (likely in the driver). With OpenGL, at best the app would be killed with a GPF (segmentation fault), because half the driver is loaded into the app's process space, and quite possibly the whole OS would go down with a BSOD (kernel oops) from the substantial portion loaded into the kernel. Instead, between WDDM and Unreal's proper handling of the device-removed signals, the worst that happened was the screen freezing for a few seconds, a flash from a monitor reset, and then the game kept on playing like nothing had happened, with a desktop notification waiting for me later letting me know my display driver had crashed. The hitches from the driver crashes were annoying, but my game kept playing, no lost play time, and even network connections stayed up in co-op.

          Do you have any idea how damn cool that is? Compared to how the Linux FOSS drivers managed to kernel oops my machine at least once a week (a few years ago, assuming it's less buggy now) just by running GNOME2 and Firefox? I'll take the WDDM safety _any_ day over some more FPS in one engine, especially for a game in that engine that already runs at 270+ FPS on mediocre hardware.

          Microsoft designed WDDM specifically because video drivers are very large, very complex, and were shown to be a _huge_ source of BSODs, the vast majority of crash reports they received. Linux "taints" kernels with proprietary drivers because of their bugginess; that makes the kernel developers' lives easier, but as usual in the FOSS world, completely dismisses all the needs of the user. Microsoft's solution protected the _user_ from the crashes. Guess which one the average user/gamer would prefer?

          Which kinda gets back to an old point of mine about the new Linux graphics stack: they're catching up to WinXP with compositing tossed in. Yay, I guess. It's hard work and the developers are doing a hell of a job, don't get me wrong, but they're under-staffed and trying to pull off a miracle just by aiming for that "low" goal. The kernel and Mesa developers should consider a more fool-proof driver design if they can swing it with their resources.



          • #50
            Originally posted by IanS:
            Lol, having trouble with your reading comprehension? Where in my post did I say that I advocate using C? I stated the fact that D3D is written in C++ and OpenGL in C. Most game programmers who decided to jump straight into C++ because they heard it's what the pros use never bothered to learn C, and since, as you have pointed out, C++ and C are totally different beasts, most of those typical C++ game programmers will have issues moving to, learning, or using OpenGL directly if they don't know C. This naturally leads to a bias against OpenGL and to the comments about D3D just making more sense and being easier to pick up (though this is also influenced by the better documentation and the prolific availability of decent learning material).

            I do agree that C++ is a useful language and it has some definite advantages, particularly if you are using the latest standard, but it is an ugly language and I tend to hate to read code written in it.
            My reading comprehension is just fine, thank you. While you didn't explicitly state that C was what they should be using, you very clearly implied it by declaring that C++ shouldn't be the end-all be-all and then proceeding along the lines of "Well, DirectX is written in C++ and OpenGL is implemented in C, and Valve says that OpenGL is faster". If you were actually arguing what you're claiming to argue now, you wouldn't have ended on that note.

            Furthermore, your notions about moving between C++ and C are nonsense. The C++ programmer just has to get more adept with the preprocessor in order to achieve an even semi-manageable design (and to do what he could do natively in C++), learn the standard C library functions, and deal with insanity such as character arrays. Otherwise everything else is the same, because C++ is essentially a superset of C. It's just that with C++ you have a lot more tools available, and the code can be a lot cleaner and nicer.

            For instance, want reusable containers? Simple: use templates. Want that in C? Good luck with the ugly preprocessor hacking, because otherwise you're going to write situation-specific variants of code that would otherwise be completely generic.

            Now, what is true is that C++ programmers are typically not trained in procedural techniques. But let's be perfectly blunt: while procedural designs make trivial things quick, they're not as flexible as an Object Oriented variant that takes far more lines of code and design work to create (which is fine for some tasks, sure, but that doesn't make it good design), and for non-trivial tasks a procedural design rather quickly becomes unmanageable without an excessive amount of manpower, due to its inability to create components without turning them into their own small programs or resorting to ugly preprocessor hacks. Note, however, that on non-trivial tasks the Object Oriented code ends up smaller, cleaner, and better tested, because the model is built from reusable components, and it takes less design time since everything is encapsulated and you don't have to get out of your own way.
            Last edited by Luke_Wolf; 07-20-2013, 05:13 AM.



            • #51
              Originally posted by Luke_Wolf:
              If that's the case, why hasn't a certain WINE producer been sued yet? Also, the whole "they're going to sue us" FUD is silly at best. Of all the things they might sue over, their development libraries and languages are the least likely. Why? Because Microsoft wants people using their technologies, and if they sue over them, care to guess what people will do? That's right: stop or lessen their use of them for fear of getting sued themselves.
              They will sue if they feel threatened by it, that's all. Wine is not threatening them, and this D3D reimplementation is the same.
              You conveniently forgot to quote my last sentence:
              and they can do much more to hinder the implementation and usefulness of the DX API on Linux



              • #52
                Forgive me if this has already been asked, since I only had time to skim the thread, but doesn't this mean support could be passed through to programs like Virtualbox or QEMU to get better performance than Wine's d3d? Last time I checked, Wine's lib is how Direct3D works in Virtualbox at the moment, although they have an experimental WDDM driver architecture working.

                Even still, I do like the idea of Gallium3D taking advantage of its portable nature someday and supporting a much wider base of operating systems.



                • #53
                  Originally posted by Luke_Wolf:
                  got a link to that letter to the court? This is the first I've heard of it.
                  http://arstechnica.com/tech-policy/2...result-stands/



                  • #54
                    Originally posted by scionicspectre:
                    Forgive me if this has already been asked, since I only had time to skim the thread, but doesn't this mean support could be passed through to programs like Virtualbox or QEMU to get better performance than Wine's d3d? Last time I checked, Wine's lib is how Direct3D works in Virtualbox at the moment, although they have an experimental WDDM driver architecture working.

                    Even still, I do like the idea of Gallium3D taking advantage of its portable nature someday and supporting a much wider base of operating systems.
                    I don't know how their stuff works, but I think what they want is a WDDM DDI driver with a fake GPU underneath. The way this is implemented would replace the D3D stack (d3d9.dll) which is arguably not very virtualization friendly. I suppose though, if they just want a full pass-through, they could totally do that. It wouldn't work on proprietary drivers unless someone writes pipe_gl or they keep wined3d as a backup (like we do), but it's perfectly possible.

                    If someone wants to, the Windows DDI for GPU drivers could be implemented and it would be a small matter of some kernel integration to get it running with the official D3D stack. However, I just don't give a damn about Windows. If someone wants to, go ahead, but it would require a near full rewrite of nine.



                    • #55
                      The way I see it, this mostly benefits older software, since most new games use DirectX 10/11 and/or OpenGL anyway. DirectX 9 is dying at best, so it shouldn't pose any threat in future DirectX vs OpenGL smack talk.



                      • #56
                        Considering that new games are still being developed for Direct3D 9, such as StarCraft II: Legacy of the Void, and that the lifespan of such a game is at least 5 years, I don't expect Direct3D 9 to become irrelevant until 2018.

                        It's not important how many games support Direct3D 10/11. The only thing that matters is how many games supporting Direct3D 10/11 do not support Direct3D 9. And that number should be pretty big for us to even care about Direct3D 10/11.

                        The likely scenario is that Mesa will eventually support acceleration of all popular Direct3D APIs.



                        • #57
                          So, how can I install the Direct3D 9 state tracker?

                          Also, I would like to see benchmarks of this patch enabled against default Wine.



                          • #58
                            Regardless of anything else, the #1 reason no major vendor is going to use an open source driver on Windows is that doing so will mean computers with their GPUs will be blocked from being able to play back protected content by Microsoft.



                            • #59
                              Originally posted by jonwil:
                              Regardless of anything else, the #1 reason no major vendor is going to use an open source driver on Windows is that doing so will mean computers with their GPUs will be blocked from being able to play back protected content by Microsoft.
                              There are a number of reasons why an open source driver would never work on Windows. What you're referring to is Silverlight with Netflix, and even then, Netflix will eventually switch over to HTML5.



                              • #60
                                Originally posted by Dukenukemx:
                                There are a number of reasons why an open source driver would never work on Windows. What you're referring to is Silverlight with Netflix, and even then, Netflix will eventually switch over to HTML5.
                                No, I am talking about ALL content that uses the "Protected Media Path" functionality in Windows. (where the player that is trying to play the content will refuse to play unless all the components in the playback chain have been signed and verified by Microsoft)

