Steam, Source Engine Get First-Rate Love On Mac OS X


  • #76
    Originally posted by CNCFarraday View Post
    CryEngine is definitely NOT a huge hack. It is a cut-throat optimized real-time rendering engine. It is SUPPOSED to suck up all cycles and, frankly, given the quality of the rendering, it is VERY efficient.
    Literally taken from a Crytek research paper[1] about virtual textures:
    "On 64 bit and enough main memory or when using half resolution textures the texture streaming is not necessary and performance is more stable."

    The fact that Crysis requires pure voodoo to run is common knowledge.[2]

    [1]: http://ati.amd.com/developer/SIGGRAP...ure_Topics.pdf
    [2]: http://www.xtremesystems.org/forums/...=173021&page=2



    • #77
      Originally posted by V!NCENT View Post
      Literally taken from a Crytek research paper[1] about virtual textures:
      "On 64 bit and enough main memory or when using half resolution textures the texture streaming is not necessary and performance is more stable."

      The fact that Crysis requires pure voodoo to run is common knowledge.[2]

      [1]: http://ati.amd.com/developer/SIGGRAP...ure_Topics.pdf
      [2]: http://www.xtremesystems.org/forums/...=173021&page=2
      Yes, but compared to what? For the quality of the render output, CryEngine is still very efficient.

      Note that efficiency doesn't mean it scales back well. That is an entirely different question. CryEngine is designed to scale upwards, not downwards.

      Also note that CryEngine is the paid whore-child of Intel and NVidia, which poured massive amounts of funding into its development. The fact that it runs poorly on AMD/ATI isn't exactly proof of the engine's poor quality. See Adobe's CS suite for a similar situation, where there isn't a single AMD machine in any development/testing team...

      As for the Virtual Texture abstract, that's not a *hack*. They were pushing the boundaries of real-time rendering and the results are impressive. The engine still looks and runs better, for the given amount of detail, rendering quality and such, than any other engine.

      Now, I'm not glorifying CryEngine/CryTek. I'd rather have a scalable engine than an uber-engine that needs Madoff-sized hardware budgets. From my PoV, Valve's Source is better than CryTek's CryEngine...
      It's just that, technically, it is very impressive for a real-time interactive rendering engine.



      • #78
        Originally posted by CNCFarraday View Post
        Yes, but compared to what? For the quality of the render output, CryEngine is still very efficient.

        Note that efficiency doesn't mean it scales back well. That is an entirely different question. CryEngine is designed to scale upwards, not downwards.
        Everything scales better upwards than downwards -> Duh.
        It is efficient -> Duh.

        Nothing said, moving on...

        Also note that CryEngine is the paid whore-child of Intel and NVidia, which poured massive amounts of funding into its development. The fact that it runs poorly on AMD/ATI isn't exactly proof of the engine's poor quality. See Adobe's CS suite for a similar situation, where there isn't a single AMD machine in any development/testing team...
        If CryEngine was funded so heavily by Intel and nVidia, then why didn't it work properly with SLI while it does work in Crossfire?
        Why do all the people who are complaining have something along the lines of an Intel Q6600 and a GeForce 8800GT?
        And ATI is primarily used in studios because it has DirectX 11, and whoever is first to deliver the latest DirectX will supply the de facto standard in graphics cards for that series.

        As for the Virtual Texture abstract, that's not a *hack*. They were pushing the boundaries of real-time rendering and the results are impressive. The engine still looks and runs better, for the given amount of detail, rendering quality and such, than any other engine.
        You might want to read that paper (because I have), and in my book it is a hack, because it does things in ways that graphics cards and processors aren't designed for but can be used for.

        That said, the only impressive thing about the CryEngine is that it is capable of having a lot of dynamic lighting all at once. That's it. Voxels were already done. Virtual texturing/clip mapping was already done by the time Quake 4 came out. HDR, shaders, post-processing: all done before CryEngine. Nothing new except the speed at which it could deliver all this lighting in the forests.
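        For what it's worth, the "virtual texture" trick being argued about boils down to a page-table indirection: only a small cache of texture tiles is ever resident on the card, and a feedback pass tells the streamer which tiles to fetch next. Here's a toy CPU-side sketch of just the residency bookkeeping (all names are made up for illustration; it assumes a fixed-size tile cache with LRU eviction):

```python
from collections import OrderedDict

class VirtualTextureCache:
    """Toy model of virtual-texture residency: a fixed number of
    physical tile slots managed with an LRU eviction policy."""

    def __init__(self, num_slots):
        self.num_slots = num_slots
        self.resident = OrderedDict()        # (mip, x, y) -> physical slot
        self.free_slots = list(range(num_slots))

    def request(self, tile):
        """Called for each (mip, x, y) the feedback pass reported visible.
        Returns the physical slot for the tile, 'streaming' it in if needed."""
        if tile in self.resident:
            self.resident.move_to_end(tile)  # mark as recently used
            return self.resident[tile]
        if self.free_slots:
            slot = self.free_slots.pop()
        else:
            _, slot = self.resident.popitem(last=False)  # evict LRU tile
        self.resident[tile] = slot           # "upload" tile into the slot
        return slot

cache = VirtualTextureCache(num_slots=2)
a = cache.request((0, 0, 0))
b = cache.request((0, 1, 0))
cache.request((0, 0, 0))      # touch tile A so it becomes most recent
c = cache.request((0, 2, 0))  # cache full: evicts tile B, reuses its slot
print(c == b, (0, 1, 0) in cache.resident)  # → True False
```

        Roughly speaking, the "hack" half of the argument is that the indirection lookup has to run in the pixel shader; the tile cache itself is just this kind of bookkeeping.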



        • #79
          I don't want to get into an argument with you, but not everything scales better upwards, or we don't have the same definition of scaling. Valve's Source engine, in the HL2:EP2 variant, doesn't look better, with more detail, on a better machine than was available when it first came out, while CryEngine does.

          There's no "if". CryTek got a few million $ from both NVidia and Intel. It is optimized for Intel's Core architecture and, presumably, Nehalem. That's a kind of software prostitution if you ask me, but that's how it went.

          Dunno why "people" complain about this and that. Maybe they bought a shitty "chaintek" motherboard. Maybe they have a pirated windoze. Who knows? The fact is that in every benchmark, CryEngine favours the Core architecture on the CPU side and the GeForce architecture on the GPU side by more than a little.

          Also, at launch, Crysis totally failed on ATI cards and Crossfire. They patched the shit out of it to make it work.

          DX11 is a marketing thing. CryEngine with DX9 looks better than any DX11-enabled engine. It is too soon to start calling names on this. There's nothing in DX11 that can't be done "by hand" in OpenGL.

          Well, speed is key, isn't it? True, nothing was new; 3D Studio MAX did all those and much more for what, 15 years? But doing it in REAL TIME and INTERACTIVELY is the really tricky part. Neither 3DS, Maya, Blender nor whatever can output that quality in real time with user interactivity...



          • #80
            Originally posted by CNCFarraday View Post
            I don't want to get into an argument with you, but not everything scales better upwards, or we don't have the same definition of scaling. Valve's Source engine, in the HL2:EP2 variant, doesn't look better, with more detail, on a better machine than was available when it first came out, while CryEngine does.
            By scaling I meant increased shader quality, texture detail and the number of polygons per unit of performance. What is your definition of scaling?

            Well, speed is key, isn't it? True, nothing was new; 3D Studio MAX did all those and much more for what, 15 years? But doing it in REAL TIME and INTERACTIVELY is the really tricky part.
            Well, ET:QW is kinda playable in an interactive way with clip mapping... Also, voxel terrain is nothing new in games. Shaders have been in games for years.

            Crysis is so beautiful because of the artwork, really. But what made it shine is the open terrain combined with the lighting. And yes, that they could combine all of that into one engine is an awesome achievement. CryTek had to do a lot of research, as seen in the ending credits.



            • #81
              It's true that Crysis favours nvidia, and this thread confirms it:
              http://www.ngohq.com/skds-corner/145...in-crysis.html
              There's even a patch in the thread to show how it sucks on nvidia cards.
              I'm no expert, but when a single 8800gts runs Crysis better than my 2x4870, then something is fishy. And for the record, Crysis doesn't scale well. Or did everyone forget that you couldn't run Crysis on 64-bit machines until the 1.3 patch or something, and that it still crashes? And how the hell can even the next generation of cards struggle with this game while it calls itself good scaling? Name at least one review where a normal PC setup runs it at 60fps, I dare you.



              • #82
                Originally posted by kUrb1a View Post
                It's true that Crysis favours nvidia, and this thread confirms it:
                http://www.ngohq.com/skds-corner/145...in-crysis.html
                There's even a patch in the thread to show how it sucks on nvidia cards.
                I'm no expert, but when a single 8800gts runs Crysis better than my 2x4870, then something is fishy. And for the record, Crysis doesn't scale well. Or did everyone forget that you couldn't run Crysis on 64-bit machines until the 1.3 patch or something, and that it still crashes? And how the hell can even the next generation of cards struggle with this game while it calls itself good scaling? Name at least one review where a normal PC setup runs it at 60fps, I dare you.
                Well... At high settings I can defo get 60fps at full res. Setting it to 'enthusiast' (Crysis Warhead on Windows XP) at full res with everything else on high, it runs at 30fps or so, I think. Please note that Warhead's 'enthusiast' setting is what was supposed to be DirectX 10 on Vista.

                Monitor: 1680x1050
                Radeon 5770 with 1GB GDDR5
                8 GB system RAM
                Phenom X4 9950

                It'd cost you next to nothing if you bought my setup now...



                • #83
                  How the hell did you manage to get 60fps in Crysis with that setup? An ATI 5770 is something like a 4870 in Crossfire with a 4850, or less. In this link they used an Intel X58, a Core i7 and 3x2GB DDR3 RAM, and that card barely managed 20+ FPS. I'm just curious what voodoo you use.
                  http://www.pcgameshardware.com/aid,6...eviews/?page=4



                  • #84
                    Originally posted by kUrb1a View Post
                    How the hell did you manage to get 60fps in Crysis with that setup? An ATI 5770 is something like a 4870 in Crossfire with a 4850, or less. In this link they used an Intel X58, a Core i7 and 3x2GB DDR3 RAM, and that card barely managed 20+ FPS. I'm just curious what voodoo you use.
                    Not to mention the full 8x AA and 8x anisotropic filtering.

                    My Voodoo:
                    Latest Catalyst.
                    Windows XP user account (so no admin/root), which loads an uninfected base XP install and nothing else but drivers.
                    Loaded systray shizzle: Steam, OpenOffice.org quick starter, Realtek driver stuff, Logitech webcam app, Bluetooth, and Logitech wireless controller game software.

                    No overclocking. 32-bit, so effectively 3.79 GB or so of RAM. Service Pack 3 and latest updates. No virus scanner. A 2-year-old install of XP. Warhead unpatched.

                    What's funny is that the onboard HD 3300 IGP with 128MB of RAM onboard (yes, onboard!) on my motherboard is probably doing something Crossfire-ish.

                    I had a 4870x2 before I got this card, but it died, and yes, I didn't have the same performance as I have now (it was lower!).

                    So there you have it...



                    • #85
                      Originally posted by V!NCENT View Post
                      What's funny is that the onboard HD 3300 IGP with 128MB of RAM onboard (yes, onboard!) on my motherboard is probably doing something Crossfire-ish...
                      I don't think that would help (and I doubt the drivers would let you enable that combination, although I'm not 100% sure).

                      The 5770 has maybe 10x the shader power of the 3300 (and that's being generous to the 3300), so chances are good that the overhead of splitting up the work would outweigh the added performance you'd get from the extra GPU.
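                      That trade-off is easy to put rough numbers on. A back-of-the-envelope model (all figures made up for illustration), where the frame is split in proportion to each GPU's throughput and the split/sync machinery costs a fixed amount per frame:

```python
def split_frame_time(work, perf_fast, perf_slow, overhead):
    """Best-case frame time when `work` is split across two GPUs in
    proportion to their throughput, plus a fixed per-frame sync cost.
    All numbers are in arbitrary consistent units (e.g. ms of GPU work)."""
    total_perf = perf_fast + perf_slow
    # Optimal proportional split: both GPUs finish at the same moment.
    balanced_time = work / total_perf
    return balanced_time + overhead

# Hypothetical numbers: the discrete GPU alone renders the frame in 20 ms.
work = 20.0   # frame cost, normalized so the fast GPU's throughput is 1
alone = split_frame_time(work, perf_fast=1.0, perf_slow=0.0, overhead=0.0)
# IGP with ~1/10 the shader power, 3 ms of per-frame splitting/sync cost:
paired = split_frame_time(work, perf_fast=1.0, perf_slow=0.1, overhead=3.0)
print(alone, round(paired, 2))  # → 20.0 21.18: the pairing is slower
```

                      With a ~10:1 power ratio, the ideal split only buys about 9% of the frame time, so even a few milliseconds of per-frame coordination cost makes the pair slower than the fast GPU alone.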



                      • #86
                        Originally posted by bridgman View Post
                        I don't think that would help (and I doubt the drivers would let you enable that combination, although I'm not 100% sure).

                        The 5770 has maybe 10x the shader power of the 3300 (and that's being generous to the 3300), so chances are good that the overhead of splitting up the work would outweigh the added performance you'd get from the extra GPU.
                        Hmz... Maybe it's the fact that it is Warhead, which was released later than Crysis.

                        The CryEngine would probably have been tweaked a lot more (it is of course in constant development), and combine that with the driver speed that has gone up since the release of Crysis... Maybe that has something to do with it?

                        Also, the fact that Crysis (not Warhead) got nerfed on XP might have something to do with it. For example, no multi-core CPU usage and no ultra-high setting (supposedly DirectX 10). Maybe it got de-nerfed in Warhead, because I don't have to hack around with the config file to get the uber-high settings in XP. There might as well be multi-core CPU usage now?

                        With Crysis, plus config-file tweaks to get the 'DirectX 10 effects', I had some serious frame dropping on my 4870x2, with and without Crossfire, and I can't remember Warhead running OK with that card on high settings.

                        I could film it for proof, but I only have a webcam and a Samsung 8MP phone, so that wouldn't prove much, I guess...

                        I was really amazed, though. I saw benchmarks that showed the 4870x2 should have been way faster...

                        Weird stuff...

