Quake 2 Gets Real-Time Path Tracing Powered By NVIDIA RTX / VK_NV_ray_tracing


  • #31
    Originally posted by birdie View Post
    Overpriced? NVIDIA is not a charity and they set the exact prices the market can bear.
    Then that's not supply and demand, it's supply and whatever they can extract from you.
    It's not NVIDIA's fault AMD cannot really compete.
    Technically they can compete, just not in the high-end market. The RX 590 is superior to the GTX 1060, and at a better price. Cards above $250 aren't what the majority of people buy, at least according to Steam: something like 10% of people own $350+ graphics cards, and that is the segment Nvidia dominates. The other 90% is a segment AMD serves very well. We seem to judge Chevrolet by its fastest car, the Corvette, rather than by the bulk of its sales, the Malibu and Cruze. Yes, I'm aware that Nvidia is destroying AMD in sales everywhere, but that's not AMD's fault so much as consumer mindshare. It doesn't help that people claim $350 for an X060 card is a good deal.
    Also, do you really buy a new GPU every generation? I don't and I'm content with their pricing while replacing my GPU every 3 years. I happily run the GTX 1060 6GB at the moment and I bought it for freaking $330 right after its introduction. Pricing for new GPUs/CPUs in this country is a little bit insane.
    I'm sure the crypto-mining craze had nothing to do with the 6GB 1060 (MSRP $250) being sold for $330. Right now a used 1060 can be found on eBay for $150. Prices have dropped, and retailers are asking Nvidia to lower the price of the RTX 2060 because they don't see it selling.
    Those versions of Quake Wars and Wolfenstein? Is it some kind of sick joke or what? The quality was shit, the resolution was shit, the graphics were shit, the performance was shit, and no one played them. Find some valid arguments next time, not some BS.
    I hear Battlefield V sales are amazing though.
    Last edited by Dukenukemx; 18 January 2019, 12:09 PM.



    • #32
      It has been 20 years and I still can't run Quake 2 with full details...



      • #33
        You people are insane. Nvidia does ray tracing? Real-time ray tracing needs at least a petaflop to work. So for now it's only secondary lighting, only a few effects, and double the price.



        • #34
          Originally posted by birdie View Post

          I don't understand your obsession with > 60fps. If you're into competitive games you will make sure your games run at the very least at 120fps and in this case RTX is a strict no-go. And if you're in for exceptional graphics, you'll enable RTX and enjoy lower framerate.

          At which framerate do AMD GPUs run raytraced BF5? At 0? So why is NVIDIA bad for implementing something which was deemed impossible just a few months ago? Your nitpicking is just pathetic. "If it's not >60fps, then RTX just sucks". Wow.
          I can give you all the data in the world, but if you can't be objective there's no point. At the speed you're replying, you are not giving yourself the opportunity to think objectively. Please review and simplify the arguments I made; for example, take AMD out of the equation and compare RTX directly to rasterisation, regardless of game-engine support.

          Originally posted by birdie View Post
          Also, you judge RTX by the only game which implements it. So much data, so many conclusions.
          Let's look at something more abstract. As others in this thread have shown, the foundation of RTX is not new technology; it was developed in the 1970s, which makes it older than rasterisation. If you understand how hybrid-rendered games using RTX or DXR work, you will see they are built on 3 basic hardware components:

          1) Shader cores for general-purpose floating point, used in games for rasterisation and in some cases physics/audio
          2) RT cores for spatial search, used in games for reverse ray/path tracing; this can be done with shaders too, just not as fast (rough sketch below)
          3) Tensor cores for FP matrix maths, used for machine learning; this can also be done with shaders, just not as fast
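
          To make 2) concrete, here is a rough sketch (plain C, all names mine, zero-direction edge cases ignored) of the primitive an RT core runs in fixed-function hardware. BVH traversal boils down to millions of ray-vs-bounding-box slab tests per frame, and a shader can evaluate the exact same test, just at a fraction of the rate:

          #include <math.h>
          #include <stdbool.h>

          typedef struct { float x, y, z; } vec3;

          /* Slab test: does the ray o + t*d (t >= 0) hit the box [bmin, bmax]?
             inv_d holds 1/d per axis, precomputed once per ray. */
          static bool ray_aabb_hit(vec3 o, vec3 inv_d, vec3 bmin, vec3 bmax)
          {
              float tx1 = (bmin.x - o.x) * inv_d.x, tx2 = (bmax.x - o.x) * inv_d.x;
              float ty1 = (bmin.y - o.y) * inv_d.y, ty2 = (bmax.y - o.y) * inv_d.y;
              float tz1 = (bmin.z - o.z) * inv_d.z, tz2 = (bmax.z - o.z) * inv_d.z;

              /* Entry point: latest of the per-axis near intersections. */
              float tmin = fmaxf(fmaxf(fminf(tx1, tx2), fminf(ty1, ty2)), fminf(tz1, tz2));
              /* Exit point: earliest of the per-axis far intersections. */
              float tmax = fminf(fminf(fmaxf(tx1, tx2), fmaxf(ty1, ty2)), fmaxf(tz1, tz2));

              return tmax >= fmaxf(tmin, 0.0f); /* intervals overlap in front of the ray = hit */
          }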

          Getting the balance between these hardware components right is very challenging: make the RT cores too big and you sacrifice performance in games that rely heavily on shaders, and vice versa. Nvidia tried to make the best of what they've got, but it's still not good enough, because the RT core was brought to market prematurely. Like I said earlier, every task the new hardware components handle can also be done using shaders. If you experiment with ray or path tracing using shaders, you start to realise how much processing power is actually required to make a big difference in picture quality. You don't need to run your tests in real time, or if you do, just ignore the low frame rate. Focus on increasing and decreasing the samples per pixel your scene uses while measuring the processing power consumed, and compare that to output quality.
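
          You can reproduce the core of that experiment without any renderer at all. Here is a toy sketch (plain C; the integrand is a made-up stand-in for tracing one light path, not anything from an actual engine) showing why the samples-per-pixel knob is so expensive: the error falls roughly as 1/sqrt(N), so halving the noise costs four times the samples.

          #include <math.h>
          #include <stdio.h>
          #include <stdlib.h>

          /* Stand-in for "trace one random light path through the pixel":
             integrate f(x) = x*x over [0,1], whose true value is 1/3. */
          static double trace_one_sample(void)
          {
              double x = rand() / (double)RAND_MAX;
              return x * x;
          }

          int main(void)
          {
              const double truth = 1.0 / 3.0;
              for (int spp = 1; spp <= 4096; spp *= 4) {
                  double err = 0.0;
                  for (int px = 0; px < 1000; px++) { /* many independent "pixels" */
                      double sum = 0.0;
                      for (int s = 0; s < spp; s++)
                          sum += trace_one_sample();
                      err += fabs(sum / spp - truth);
                  }
                  printf("%5d spp -> mean error %.5f\n", spp, err / 1000.0);
              }
              return 0;
          }

          Swap trace_one_sample() for a real path trace and the scaling is identical, which is exactly why hardware that multiplies your ray budget matters so much.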

          You can do the same for image noise reduction, although that's less critical, since it can be improved in software with better algorithms. The RT core, on the other hand, has a hard limit; smoke and mirrors is the only way to get better performance out of it (assuming you implemented it correctly in the first place).
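
          For what I mean by "improved in software": even a dumb 3x3 box filter trades a little sharpness for less noise, and real denoisers (bilateral filters, SVGF, the learned filters the Tensor cores target) are much smarter versions of the same kind of pass. A toy sketch, names mine:

          /* Average each pixel of a grayscale image with its 3x3 neighbourhood,
             skipping out-of-range neighbours at the borders.
             in and out are w*h floats, row-major. */
          static void box_denoise(const float *in, float *out, int w, int h)
          {
              for (int y = 0; y < h; y++) {
                  for (int x = 0; x < w; x++) {
                      float sum = 0.0f;
                      int n = 0;
                      for (int dy = -1; dy <= 1; dy++) {
                          for (int dx = -1; dx <= 1; dx++) {
                              int xx = x + dx, yy = y + dy;
                              if (xx >= 0 && xx < w && yy >= 0 && yy < h) {
                                  sum += in[yy * w + xx];
                                  n++;
                              }
                          }
                      }
                      out[y * w + x] = sum / n; /* mean of the valid neighbours */
                  }
              }
          }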

          This new technology (like any previous tech) has huge R&D and production costs, which is why nvidia raised the price of the new cards featuring it. That would make sense, and I would not blame nvidia, if the RT cores had better performance. As it stands, I blame nvidia for forcing this premature technology on the industry. RT cores need to be 100 to 1000 times faster, without sacrificing shader performance to get there, before it makes sense to use them everywhere. The gaming industry is effectively subsidising machine learning. There's no way these new hardware components will let us run games faster at high resolutions; if anything, they take us further away from that.

          If nvidia focused on and hyped scalable graphics, we could actually see improved graphics and they would sell more GPUs. We know people want this from SLI's popularity in the past. But if there's no vendor lock-in, nvidia won't go after it.



          • #35
            Originally posted by birdie View Post
            real time ray tracing which is the future.
            Sounds like an awful future with terrible 30 FPS or worse. Eww.

            To me the future is in 580Hz monitors and matching FPS. Not higher resolution crap, not ray tracing.



            • #36
              Nice to see people doing mods for proper (albeit older) engines rather than just faffing around with toys like Unity.

              It is quite an impressive technical feat. Q2 is relatively complex to develop for, and replacing the renderer entirely is a cool undertaking!

              Once mainstream CPUs can parallelise ray tracing as well as a GPU, I will be interested. For now I am done with NVIDIA GPUs and their terrible blob drivers.

              As it stands, this is more interesting news XD - https://www.phoronix.com/scan.php?pa...0-Ti-Linux-5.0
              Last edited by kpedersen; 18 January 2019, 03:22 PM.



              • #37
                Originally posted by birdie View Post
                I don't understand your obsession with > 60fps.
                Once you use a 240Hz monitor with matching fps, you can't go back.



                • #38
                  Originally posted by Jabberwocky View Post
                  the RT core was brought to market prematurely.
                  I've heard that like 5000 times already, and there's no other way to introduce a completely new graphics technology to the market: the first implementation will not be the fastest, the most efficient, or the most well-thought-out. But you have to do it, or you will be stuck with years-old graphics that could look so much better. If you don't introduce something in hardware, game developers will never embrace it, and your technology, however much you improve it in-house, will never take off.

                  Remember how ineffective the GeForce FX 5800 Ultra was? How ridiculed it was? It was NVIDIA's first hardware implementation of D3D 9 and Shader Model 2.0. Yet it had its run, had its fans, and was later replaced by the much-refined, at least twice-as-efficient GeForce 6 series.

                  I just don't understand all this hostility against NVIDIA and their RT/Tensor cores. It's an actual breakthrough in computer graphics, yet NVIDIA is judged by its only implementation (BF5), which is itself not the best showcase for it.

                  Also, you're talking about shaders running RT work, but dedicated RT cores do this work much faster and more efficiently, and they can also run in parallel with rasterization.

                  Also, just also, someone managed to make a Titan V run BF5 in RT mode using only shaders (i.e. a sort of emulation), and it ran quite a bit slower than an RTX 2080, so dedicated cores are currently indisputable.



                  • #39
                    Originally posted by Weasel View Post
                    Once you use a 240Hz monitor with matching fps, you can't go back.
                    "I'm an idiot and I claim that no new graphics technology may exist unless the games which implement it run at 240fps from the get go and I couldn't care less about the computational costs or the quality of graphics or its fidelity. Also I couldn't care less about NPC's AI because I enjoy playing against 1dimensional stupid bots traversing waypoints like it was done in Quake 2".

                    With such reasoning you might as well play Minecraft for the rest of your life at 5000fps. Enjoy extra-super-uber-fluid blocky graphics from 2009.

                    Luckily graphics companies, Microsoft, Khronos et al. have a different POV, or we would be stuck with games like the original Doom from 1993.



                    • #40
                      Originally posted by kpedersen View Post
                      Nice to see people doing mods for proper (albeit older) engines rather than just faffing around with toys like Unity.

                      It is quite an impressive technical feat. Q2 is relatively complex to develop for, and replacing the renderer entirely is a cool undertaking!

                      Once mainstream CPUs can parallelise ray tracing as well as a GPU, I will be interested. For now I am done with NVIDIA GPUs and their terrible blob drivers.

                      As it stands, this is more interesting news XD - https://www.phoronix.com/scan.php?pa...0-Ti-Linux-5.0
                      I've never had a single problem with NVIDIA drivers, even though my first GPU was a Riva TNT 2. Their Linux drivers are far from perfect, but Linux is an afterthought for them and I cannot blame them. We're lucky they support Linux at all, because their Linux user base is close to zero.
