Mesa's Radeon "RADV" Driver Can Now Handle Cyberpunk 2077 Ray-Tracing


  • #11
    Originally posted by Karmux View Post
    To me RT even in Windows is fps killer that barely gives any visual improvements. My opinion is based entirely on Youtube comparisons and tests. Great to see that Linux is catching up with new graphics technologies and hopefully fps cost of RT will lessen also in general with future GPUs.
    It is basically garbage. The real, honest reason it got introduced is that the vendors wanted the hardware for compute applications and didn't know how to sell the same transistors to gamers, so they introduced RT. GPUs don't just magically get designed for separate markets: they need one main design, the profits are on the compute side, and gamers just get the same silicon as professionals (cheaper and less certified, though).

    That's not to say that ray tracing per se is bad, but it would only be worth it if the whole scene were rendered with it, not just a few effects, and current GPU power can't manage that in real time. So there was no reason to rush it, other than that the same components could be used for things like AI, which made sense for their profits.



    • #12
      I love ray tracing. I don't have a card capable of it, outside of (technically) the A380 I'm borrowing, but it's one of the big features I've been holding out for. Can't wait for it to become more popular.



      • #13
        Originally posted by TemplarGR View Post
        It is basically garbage. The real, honest reason it got introduced is that the vendors wanted the hardware for compute applications and didn't know how to sell the same transistors to gamers, so they introduced RT. GPUs don't just magically get designed for separate markets: they need one main design, the profits are on the compute side, and gamers just get the same silicon as professionals (cheaper and less certified, though).

        That's not to say that ray tracing per se is bad, but it would only be worth it if the whole scene were rendered with it, not just a few effects, and current GPU power can't manage that in real time. So there was no reason to rush it, other than that the same components could be used for things like AI, which made sense for their profits.
        Are you talking about NVidia here? As far as I know, the RT hardware in our GPUs is there only for RT and is not usable for general purpose compute.

        That said, if you are talking about compute specifically for ray traced rendering then yes the RT hardware can be used for that, but not for other compute purposes.



        • #14
          Originally posted by bridgman View Post

          Are you talking about NVidia here? As far as I know, the RT hardware in our GPUs is there only for RT and is not usable for general purpose compute.

          That said, if you are talking about compute specifically for ray traced rendering then yes the RT hardware can be used for that, but not for other compute purposes.
          Yes, I wrote that with Nvidia's RTX in mind. I am not familiar with AMD's design and the differences, but from what I have read about RTX, aside from some specific ray tracing parts that are usable mainly for graphics, the other additions in RTX are compute additions that work together with the ray tracing cores to produce the effects. For example, Tensor Cores are used to denoise partially ray traced images.



          • #15
            I'm glad this crash was fixed.

            Performance is still quite suboptimal.

            Here is a comparison I've seen from Windows on the Ultra preset (source: https://www.youtube.com/watch?v=ijaUH1BQZyg):



            And here is what I get with vkd3d-proton / radv on the Medium preset:


            The GPU in the Windows example is a bit better (Sapphire Nitro+ 7900 XTX vs my Sapphire Pulse 7900 XTX), but the resolution is the same and the gap shouldn't be that big between those cards.

            So you can see that Windows performance is more than 3 times the Linux one (allowing for possible scene differences, this is a very rough comparison).
            Last edited by shmerl; 24 February 2023, 04:20 PM.
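
For anyone trying to reproduce the vkd3d-proton / radv setup described above, here is a minimal launch sketch. The exact flags required depend on your Mesa and vkd3d-proton versions (this reflects the Mesa 22.x/23.0-era opt-in; newer Mesa releases enable RADV ray tracing by default), so treat it as an assumption-laden starting point rather than a definitive recipe.

```shell
# Opt into DirectX Raytracing translation in vkd3d-proton
# and into RADV's ray tracing support (opt-in on Mesa of this era).
export VKD3D_CONFIG=dxr
export RADV_PERFTEST=rt
# Show what the game process would inherit before launching it.
echo "VKD3D_CONFIG=$VKD3D_CONFIG RADV_PERFTEST=$RADV_PERFTEST"
```

In Steam launch options the equivalent would be something like `VKD3D_CONFIG=dxr RADV_PERFTEST=rt %command%`.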



            • #16
              Originally posted by Karmux View Post
              To me RT even in Windows is fps killer that barely gives any visual improvements. My opinion is based entirely on Youtube comparisons and tests. Great to see that Linux is catching up with new graphics technologies and hopefully fps cost of RT will lessen also in general with future GPUs.
              It really comes down to the game. Also, NVIDIA can now get RT at 4K and 120 fps with the help of DLSS 2/3.

              I would agree, however, that many games don't need it or don't implement it well. I was shocked to find out Atomic Heart has no RT/RTX and still runs on Unreal Engine 4... I guess RT/PT just makes it easier to obtain pretty results.



              • #17
                Originally posted by shmerl View Post

                Here is comparison I've seen from Windows on Ultra preset (source: https://www.youtube.com/watch?v=ijaUH1BQZyg):
                The desert in this game runs A LOT better than the city, so it's not quite a fair comparison if you're changing the scene in between.



                • #18
                  Originally posted by theriddick View Post

                  It really comes down to the game. Also, NVIDIA can now get RT at 4K and 120 fps with the help of DLSS 2/3.

                  I would agree, however, that many games don't need it or don't implement it well. I was shocked to find out Atomic Heart has no RT/RTX and still runs on Unreal Engine 4... I guess RT/PT just makes it easier to obtain pretty results.
                  No game needs it. You can get pretty similar visual effects with traditional methods; ray tracing is just easier from a developer's standpoint. It will be a great technology once hardware advances enough that the whole scene can be ray traced in real time and rasterization gets deprecated. Until then it is just a gimmick that kills performance for minor benefit. It is no secret that most people disable it in most of their games.



                  • #19
                    Not that anyone is actually ever going to play like this, but it (somewhat) runs on the Deck too with FSR2 set to Ultra Performance. It locks up after a short while and forces a GPU reset, though.



                    • #20
                      Originally posted by TemplarGR View Post
                      Yes, I wrote that with Nvidia's RTX in mind. I am not familiar with AMD's design and the differences, but from what I have read about RTX, aside from some specific ray tracing parts that are usable mainly for graphics, the other additions in RTX are compute additions that work together with the ray tracing cores to produce the effects. For example, Tensor Cores are used to denoise partially ray traced images.
                      OK, I think we might be talking about two different things here - your earlier post talked about "RT hardware" (which is dedicated to ray tracing in both NVidia and AMD), but this post talks about "RTX" - and again, both vendors make use of general purpose compute hardware as well as dedicated RT hardware.

                      So I agree with the "RTX" wording but not the "RT" wording... although if you were thinking "RTX" when you typed "RT", I guess I agree anyway.

