Steam, Source Engine Get First-Rate Love On Mac OS X


  • #71
    Originally posted by deanjo View Post
    Yup, it sure does: it fixes its bugs in a timely manner, something SDL is very slow to do.
    I'd hesitate to say "timely". In comparison to SDL, perhaps, but "timely" isn't quite the word I'd use for DirectX...



    • #72
      SDL isn't like DirectX. DirectX is a complete package with 3D rendering, audio, and so on, while SDL depends on third-party software such as graphics drivers, ALSA/OpenAL, and the like; in that sense SDL is somewhat "inferior" to DirectX. So I'd say SDL itself is more stable than DirectX and doesn't require many updates. Sure, the third-party software like ALSA, OpenGL (included with the graphics drivers), and X.org is updated much more often than DirectX, so to speak... DX has a slow update cycle if you count all the software cited previously.
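
      For example, here is a minimal SDL 1.2 sketch of that layering (the window setup is made up, nothing from a real game). The same few calls run anywhere; SDL just hands them off to whatever the platform provides (X.org plus the GL driver for video, ALSA/OpenAL for audio, and so on):

      Code:
      #include <stdio.h>
      #include <SDL/SDL.h>

      int main(int argc, char *argv[])
      {
          /* SDL is only a thin layer: this one call hooks up whatever the
             platform offers underneath (X11 + GL driver, ALSA, ...). */
          if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) != 0) {
              fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
              return 1;
          }

          /* Ask for an OpenGL-capable window; the actual GL implementation
             comes from the graphics driver, not from SDL itself. */
          if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
              fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
              SDL_Quit();
              return 1;
          }

          SDL_Delay(2000); /* keep the window up for a moment */
          SDL_Quit();
          return 0;
      }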



      • #73
        Originally posted by Setlec View Post
        SDL isn't like DirectX. DirectX is a complete package with 3D rendering, audio, and so on, while SDL depends on third-party software such as graphics drivers, ALSA/OpenAL, and the like; in that sense SDL is somewhat "inferior" to DirectX. So I'd say SDL itself is more stable than DirectX and doesn't require many updates. Sure, the third-party software like ALSA, OpenGL (included with the graphics drivers), and X.org is updated much more often than DirectX, so to speak... DX has a slow update cycle if you count all the software cited previously.
        I wouldn't say SDL is more stable at all. A slow reaction time in resolving outstanding issues isn't a sign of stability; it's just a sign of when icculus has time to look at them. The reason many of those subsystems have to be updated is to catch up to what Windows has had for years with its DirectX foundation. You use X.org as an example of frequent updates, but it still has a long, long way to go to catch up with Windows on capabilities, features, and stability. The same goes for audio, driver support, feature support, input device support, etc. If you want to see SDL strut its stuff, your best experience with it is actually on a Windows system, where it can use DirectX for its subsystems, and even there it does not use DirectX to its fullest capabilities.
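
        You can actually watch that delegation happen: SDL 1.2 picks a backend per platform, and on Windows you can explicitly request the DirectX-based drivers before initializing. A small sketch ("directx" and "dsound" are SDL 1.2's Windows backend names; the rest is made up):

        Code:
        #include <stdio.h>
        #include <SDL/SDL.h>

        int main(int argc, char *argv[])
        {
            /* On Windows, SDL 1.2 can sit on top of DirectX: "directx"
               for video/input, "dsound" for audio. (On a non-Windows
               box SDL_Init will simply fail with these set.) */
            SDL_putenv("SDL_VIDEODRIVER=directx");
            SDL_putenv("SDL_AUDIODRIVER=dsound");

            if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) != 0) {
                fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
                return 1;
            }

            char name[32];
            if (SDL_VideoDriverName(name, sizeof name))
                printf("video backend in use: %s\n", name);

            SDL_Quit();
            return 0;
        }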



        • #74
          Hmm, I see your point. Maybe someone could take over SDL from icculus. (Honestly, I've lost faith in icculus since the fake UT3 port!)

          The reason many of those subsystems have to be updated is to catch up to what Windows has had for years with its DirectX foundation.
          If we take OpenGL, it has many features that are misused or ignored, like tessellation. The Khronos Group needs to look to the future instead of to the past (to keep up with its competition).
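
          Tessellation is a good example of the problem: an engine can't just assume it, it has to probe for the extension at runtime first. A rough sketch of that probe, reusing SDL for the GL context (GL_ARB_tessellation_shader is the real extension name; everything else here is made up, and the plain strstr() check is deliberately crude):

          Code:
          #include <stdio.h>
          #include <string.h>
          #include <SDL/SDL.h>
          #include <GL/gl.h>

          int main(int argc, char *argv[])
          {
              if (SDL_Init(SDL_INIT_VIDEO) != 0)
                  return 1;

              /* glGetString() needs a current GL context first. */
              if (SDL_SetVideoMode(64, 64, 0, SDL_OPENGL) == NULL) {
                  SDL_Quit();
                  return 1;
              }

              /* Crude substring test; real code would tokenize the list. */
              const char *ext = (const char *)glGetString(GL_EXTENSIONS);
              printf("tessellation shaders: %s\n",
                     ext && strstr(ext, "GL_ARB_tessellation_shader")
                         ? "exposed by this driver" : "not exposed");

              SDL_Quit();
              return 0;
          }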



          • #75
            Originally posted by V!NCENT View Post
            And Quake 4?

            Probably, but the CryEngine is a huge hack. It can't even run that well on Windows. It taps into every single clock cycle it can get its hands on, and that resulted in a lot of people not being able to play Crysis even though they had capable PCs.
            CryEngine is definitely NOT a huge hack. It is a ruthlessly optimized real-time rendering engine. It is SUPPOSED to suck up all the cycles, and frankly, given the quality of the rendering, it is VERY efficient.

            I have no idea whether Q4 uses SDL threading or its own model. If you want to use SDL to set up a GUI window, capture keys, and eject the CD, fine. Beyond that, you are on your own.
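
            To put that scope in code, here is an SDL 1.2 sketch of exactly that trio, a window, key capture, and a CD eject, and nothing more (the details are made up):

            Code:
            #include <SDL/SDL.h>

            int main(int argc, char *argv[])
            {
                if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_CDROM) != 0)
                    return 1;

                /* 1. set up a window */
                SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);

                /* 2. capture keys until Escape (or the window closes) */
                SDL_Event ev;
                while (SDL_WaitEvent(&ev)) {
                    if (ev.type == SDL_QUIT ||
                        (ev.type == SDL_KEYDOWN &&
                         ev.key.keysym.sym == SDLK_ESCAPE))
                        break;
                }

                /* 3. ...and eject the CD */
                SDL_CD *cd = SDL_CDOpen(0);
                if (cd) {
                    SDL_CDEject(cd);
                    SDL_CDClose(cd);
                }

                SDL_Quit();
                return 0;
            }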

            Originally posted by Setlec
            @CNCFarraday: So if SDL is for toy projects, why does CryEngine use DirectX? AFAIK SDL is a multimedia system for gaming: it brings an input layer, an audio layer, a video layer (with OpenGL), and a networking layer. Does DirectX do anything more than that? Maybe it opens your DVD drive as a coffee holder.
            Well, I hate M$ as much as the next guy, but M$ invested a lot into a comprehensive, full-featured framework for high-performance gaming. I don't want to get into a flame war right now, but for "professional" game developers there is NOTHING in the Linux/Mac world that can match DX from top to bottom and sideways.

            Please stop with this idiotic crap. I work on advanced engineering and scientific visualization projects developed for Linux/UNIX with OpenGL. When it comes to "serious" 3D, OpenGL is unmatched. However, if you want to get the most out of modern consumer graphics cards in a real-time rendering engine for a game, you go DirectX, because it is superior in this narrow field. It is optimized specifically for the singular task of making games.

            SDL has nothing whatsoever over DX. If you really want, I can get down and dirty into a technical comparison of the various APIs and frameworks. In the real world we do code reviews for the tools we use in industrial/scientific-strength projects. I was part of a team that did a code review on SDL, on VTK (which can be built on top of SDL), and a few others. SDL sucks. It is very good for homebrew and Quake 4, but for people who intend to make shitloads of money from selling a game and don't give a flying f**k about anything else, it just doesn't cut it. I wouldn't use SDL in a 'real' project because A: it sucks, it has bugs, it has open bugs that stay that way for ages, it is slow, and it lacks key features (I'm not going to reinvent the wheel for my project); and B: upper management will never allow it, even if I would and my CTO would, because it brings unknown variables and costs into the project.

            Until we see something on the scale of World of Warcraft, made with SDL and the whole homebrew stack, deployed and used by 100k+ gamers online, the big players won't take it seriously.

            SDL isn't like DirectX. DirectX is a complete package with 3D rendering, audio, and so on, while SDL depends on third-party software such as graphics drivers, ALSA/OpenAL, and the like; in that sense SDL is somewhat "inferior" to DirectX. So I'd say SDL itself is more stable than DirectX and doesn't require many updates. Sure, the third-party software like ALSA, OpenGL (included with the graphics drivers), and X.org is updated much more often than DirectX, so to speak... DX has a slow update cycle if you count all the software cited previously.
            Seriously? What exactly is this based on? From the point of view of 3D game development at the level of Crysis / FarCry2 / whatever, DirectX is king. We had to write "by hand" most of the "advanced" features that DirectX offers for free, fully optimized. I mean, I don't complain, I enjoyed it, and we did it for solid technical reasons, but then again, we simulate reactors hooked up to real hardware probes and mainframe simulators that run a model of the real thing. Nobody in their right mind will approve a project that has to pay top money for senior programmers with advanced computational-geometry knowledge to write code in OpenGL that DirectX does for free. Maybe with OpenGL 4 that will change, but that's another thing.

            And don't even get me started on audio.

            The bottom line is that the people paying the money needed to make something like CryEngine don't give a s**t about using open source, "standards", "cross-platform", and such.

            I've been a Linux/UNIX developer for a decade+, and I wouldn't work on Windows platforms for anything. However, Linux video/audio/3D is BROKEN. I hate to borrow this meme from Slashdot, but it is "broken by design". Microsoft got it right, unfortunately, and put their big pile of cash behind DX, which was a piece of s**t until DX 8. But FOR GAMING, DX 8+ and the Windoze graphics model are simply better, technically speaking. Now, if everybody could agree on one thing and put all their butts behind something like DRI + OpenGL 4 + OpenAL + OpenCL, maybe we'd have a solid multimedia foundation on *nixes.

            If we're talking about homebrew / indie gaming then, by all means, SDL all the way.



            • #76
              Originally posted by CNCFarraday View Post
              CryEngine is definitely NOT a huge hack. It is a ruthlessly optimized real-time rendering engine. It is SUPPOSED to suck up all the cycles, and frankly, given the quality of the rendering, it is VERY efficient.
              Literally taken from a Crytek research paper[1] about virtual textures:
              "On 64 bit and enough main memory or when using half resolution textures the texture streaming is not necessary and performance is more stable."

              The fact that Crysis requires pure voodoo to run is common knowledge.[2]

              [1]: http://ati.amd.com/developer/SIGGRAP...ure_Topics.pdf
              [2]: http://www.xtremesystems.org/forums/...=173021&page=2
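
              For the curious, the "virtual texture" scheme in that paper boils down to a page-table indirection, with streaming filling pages in on demand. A toy CPU-side sketch of just the lookup step (this is NOT Crytek's code; every name and size below is invented):

              Code:
              #include <stdio.h>

              #define VPAGES       256     /* virtual pages per axis (made up) */
              #define NOT_RESIDENT 0xFFFF

              /* Maps each virtual page to a resident physical page, or to
                 NOT_RESIDENT while the real page is still streaming in. */
              static unsigned short page_table[VPAGES][VPAGES];

              static unsigned short lookup_page(float u, float v)
              {
                  int px = (int)(u * VPAGES);
                  int py = (int)(v * VPAGES);
                  unsigned short phys = page_table[py][px];

                  /* With "64 bit and enough main memory" (as quoted above),
                     every page can simply stay resident, this fallback never
                     fires, and performance is more stable. */
                  if (phys == NOT_RESIDENT)
                      return 0; /* page 0 = low-res fallback while streaming */
                  return phys;
              }

              int main(void)
              {
                  for (int y = 0; y < VPAGES; y++)
                      for (int x = 0; x < VPAGES; x++)
                          page_table[y][x] = NOT_RESIDENT;

                  page_table[3][5] = 42; /* pretend one page is resident */

                  printf("%u %u\n",
                         lookup_page(5.5f / VPAGES, 3.5f / VPAGES),
                         lookup_page(0.9f, 0.9f));
                  return 0;
              }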



              • #77
                Originally posted by V!NCENT View Post
                Literally taken from a Crytek research paper[1] about virtual textures:
                "On 64 bit and enough main memory or when using half resolution textures the texture streaming is not necessary and performance is more stable."

                The fact that Crysis requires pure voodoo to run is common knowledge.[2]

                [1]: http://ati.amd.com/developer/SIGGRAP...ure_Topics.pdf
                [2]: http://www.xtremesystems.org/forums/...=173021&page=2
                Yes, but compared to what? For the quality of its render output, CryEngine is still very efficient.

                Note that efficiency doesn't mean it scales back well; that is an entirely different question. CryEngine is designed to scale upwards, not downwards.

                Also note that CryEngine is the paid whore-child of Intel and NVidia, which poured massive amounts of money into its development. The fact that it runs poorly on AMD/ATI isn't exactly proof of the engine's poorness. See Adobe's CS suite for a similar situation, where there isn't a single AMD machine in any development/testing team...

                As for the virtual texture abstract, that's not a *hack*. They were pushing the boundaries of real-time rendering, and the results are impressive. The engine still looks and runs better, for the given amount of detail and rendering quality, than any other engine.

                Now, I'm not glorifying CryEngine/Crytek. I'd rather have a scalable engine than an uber-engine that needs Madoff-size hardware budgets. From my PoV, Valve's Source is better than Crytek's CryEngine...
                It's just that, technically, it is very impressive for a real-time interactive rendering engine.



                • #78
                  Originally posted by CNCFarraday View Post
                  Yes, but compared to what? For the quality of its render output, CryEngine is still very efficient.

                  Note that efficiency doesn't mean it scales back well; that is an entirely different question. CryEngine is designed to scale upwards, not downwards.
                  Everything scales better upwards than downwards -> Duh.
                  It is efficient -> Duh.

                  Nothing said, moving on...

                  Also note that CryEngine is the paid whore-child of Intel and NVidia, which poured massive amounts of money into its development. The fact that it runs poorly on AMD/ATI isn't exactly proof of the engine's poorness. See Adobe's CS suite for a similar situation, where there isn't a single AMD machine in any development/testing team...
                  If CryEngine was funded so heavily by Intel and nVidia, then why didn't it work properly with SLI while it did work in Crossfire?
                  Why do all the people complaining have something along the lines of an Intel Q6600 and a GeForce 8800 GT?
                  And ATI is primarily used in studios because it has DirectX 11, and whoever is first to deliver the latest DirectX supplies the de facto standard graphics cards for that generation.

                  As for the virtual texture abstract, that's not a *hack*. They were pushing the boundaries of real-time rendering, and the results are impressive. The engine still looks and runs better, for the given amount of detail and rendering quality, than any other engine.
                  You might want to read that paper (because I have), and in my book it is a hack, because it does things in ways that graphics cards and processors weren't designed for but can be made to do.

                  That said, the only impressive thing about CryEngine is that it can handle a lot of dynamic lighting all at once. That's it. Voxels were already done. Virtual texturing/clip mapping was already done by the time Quake 4 came out. HDR, shaders, post-processing: all done before CryEngine. Nothing new except the speed at which it could deliver all that lighting in the forests.



                  • #79
                    I don't want to get into an argument with you, but not everything scales better upwards, or we don't have the same definition of scaling. Valve's Source engine, the variant from HL2:EP2, doesn't look better, with more detail, on a better machine that wasn't available when it first came out, while CryEngine does.

                    There's no "if". Crytek got a few million dollars from both NVidia and Intel. It is optimized for Intel's Core architecture and, presumably, Nehalem. That's a kind of software prostitution if you ask me, but that's how it went.

                    Dunno why "people" complain about this and that. Maybe they bought a shitty "chaintek" motherboard. Maybe they have a pirated windoze. Who knows? The fact is that in every benchmark, CryEngine favours the Core architecture on the CPU side and the GeForce architecture on the GPU side, more than a little.

                    Also, at launch, Crysis totally failed on ATI cards and Crossfire. They patched the shit out of it to make it work.

                    DX11 is a marketing thing. CryEngine with DX9 looks better than any DX11-enabled engine. It is too soon to pass judgment on this. There's nothing in DX11 that can't be done "by hand" in OpenGL.

                    Well, speed is key, isn't it? True, nothing was new; 3D Studio MAX did all of that and much more for, what, 15 years? But doing it in REAL TIME and INTERACTIVELY is the really tricky part. Neither 3DS, Maya, Blender, nor anything else can output that quality in real time with user interactivity...



                    • #80
                      Originally posted by CNCFarraday View Post
                      I don't want to get into an argument with you, but not everything scales better upwards, or we don't have the same definition of scaling. Valve's Source engine, the variant from HL2:EP2, doesn't look better, with more detail, on a better machine that wasn't available when it first came out, while CryEngine does.
                      By scaling I meant increased shader quality, texture detail, and polygon counts relative to performance. What is your definition of scaling?

                      Well, speed is key, isn't it? True, nothing was new; 3D Studio MAX did all of that and much more for, what, 15 years? But doing it in REAL TIME and INTERACTIVELY is the really tricky part.
                      Well, ET:QW is kinda playable in an interactive way with clip mapping... Also, voxel terrain is nothing new in games, and shaders have been in games for years.

                      Crysis is so beautiful because of the artwork, really. But what made it shine is the open terrain combined with the lighting. And yes, that they could combine all of that into one engine is an awesome achievement. Crytek had to do a lot of research, as seen in the ending credits.

