Another Optimization Comes For Radeon RADV Ray-Tracing In Mesa 24.1


  • #11
    Originally posted by schmidtbag View Post
    Not only is this great work by Valve but it's nice that it works on RDNA2. It's great to see outdated hardware still get supported.
    Steam Deck utilizes an RDNA 2 GPU. As long as the device is available on the market, there will be improvements from Valve. However, I have a question for those more experienced: how similar are the drivers for RDNA 2 and RDNA 3? That is, if Valve improves something for the RDNA 2 architecture, will the RDNA 3 architecture automatically benefit from it?



    • #12
      Originally posted by EliasOfWaffle View Post
      Sorry, but it's not just AMD. AMD works more on the kernel side, but the real contributor is Valve: most of the AMD performance changes on the kernel side come from Valve, and the same goes for RADV, where most of the effort is from Valve. Valve has made a point of contracting developers to work on RADV and ACO.
      ACO only became a real thing because of Valve; AMD had nobody interested in writing a NIR backend, since AMD's LLVM backend is used in all of AMD's proprietary drivers.
      Valve has also put effort into the NIR intermediate representation and the Zink driver, and likewise into RadeonSI, bringing ACO backend support to it.
      I mean, this is only possible because AMD open sourced their drivers and passed their API (Mantle) on to Khronos, which became the Vulkan API that is today's de facto standard. So in terms of contributing to the community, AMD has done a lot to shape how the landscape looks today.

      The open-source nature is also the reason Valve can work on these projects at all; AMD presumably figured out how much they save by letting external contributors do work they would otherwise have to do internally. Valve is simply willing to invest in AMD because of its open-source platform, which again benefits EVERYONE (even AMD, but mostly Valve, since their Steam Deck gets updated code and customers become more satisfied).



      • #13
        I am also hoping Mesa adds AMD's open-source resolution scaling and frame generation code in the future.



        • #14
          Always good to get more optimizations to get the most out of the hardware people bought, though I give approximately 0.01 cares about raytracing itself. I care more about the hardware video decoding of such complex formats as AV1, while keeping the GPU and CPU alike in the lowest power state possible -- given YouTube and we're-totally-not-cable Netflix's switching to it, that has an... exponential? very large effect on userbase bandwidth, and thus energy and data-allotment, usage. Raytracing... unless I misunderstand, no effect on that usecase?

          I'd honestly rather see the raytracing hardware used for better positional (including vertical) audio in games (calculating occlusion rapidly, with material handling, and none of that fake EAX/EFX junk... y'know, like A3D did before Creative sued Aureal into bankruptcy-death and bought the corpse despite having lost the suit), with integration into the gameplay, and less focus on fancy graphics. Warning, rant incoming:

          I mean, unless people want this combination of dev crunch, "massive profits didn't reach expectations" studio closure, outrageous pricing for minuscule performance uplift (paired with features like AI nobody actually wants[1]), war-caused energy costs, and perpetual greedflation making people focus on paying NECESSITY bills, to result in a second video game crash? Sound is way cheaper, it just is harder for lazy reviewers to show off.

          There's a reason the Steam survey still lists 16GB, 6- wait, really? 6 cores? What a weird number... anyway, 6 cores followed right behind by just quad-core CPUs, 2.3-2.69GHz, and 8GB of VRAM with 1080p primary monitors as the leaders. Not to mention, discounting last-gen nVidia RTX 3060 GPUs, the GTX 1650 and 1060 cards -- a lot of people just don't have the money to drop on a new GPU, don't like the power consumption -- possibly necessitating a new PSU, unless getting a low-end card that doesn't even outperform their old hardware noticeably -- inclusive-or are just happy with the resolution monitor and graphical fidelity they have.

          Seriously, you have to squint pretty damn hard to notice differences from raytracing, unless the game was built with nVidia's bribe money^W^W assistance to showcase it. In which case, the performance sucks. Which, it already sucked, because a lot of devs don't know how to handle DirectX 12/Vulkan without the driver writers doing their work for them (via a storm of game-specific kludges) like they used to, even before the raytracing. Which devs then make up for by abusing artifact-prone fake frame generation, using that AI nonsense... that still needs energy-and-money-expensive new GPU hardware. Which people can't afford.

          Meanwhile, consoles are getting next to no good new games, just an even worse AAA slopfest mixed with rehashes of PS3/PS4 games, so trying to save money there is a waste too, and people are noticing -- and jumping on the only games that prove the exception, like BG3 apparently is? I wouldn't know, my CPU and GPU barely clear the bar of minimum requirements and I have a kilometer-long backlog of backlogs.

          But the publishers and management keep pushing more crap out. Just like they did in 1983. Except now there's reviewers... in the publishers' pockets. And only four platforms instead of like, six? Eight? And given the general multiplatform status of everything, and Microsoft owning everything AAA and AA alike, a second crash would take out the entire new game industry.

          And the behemoths of industry wouldn't even have the good graces to die like so-called free market capitalism would dictate, because digital store backlogs would keep the publishers afloat, they'd just close the rest of the studios they haven't already folded into the CoD/sportsball/WoW/remake mines and learn nothing.

          Indie studios can try so hard, and get so far. In the end, it doesn't always matter. But at least they provide something better than lowest-common-denominator.

          [1] Aside from better video upscaling and frame generation which... um... spline36 and ewa_lanczossharp etc say hello, and don't need new hardware, just implementation in the driver.


          Originally posted by Kjell View Post

          AMD invests time into improving Linux drivers.

          Hardware acceleration works better.
          Compatibility is better with new technology like Wayland.
          Updates, bug fixes and performance enhancements are done frequently.
          There are no arbitrary limits on concurrent encoding jobs.

          I used to praise NVIDIA but their lack of effort is disgusting. A billion-dollar company relying on unpaid developers to work around their broken drivers is a nightmare. The endless string of unresolved bugs around core GPU features made me switch.

          Don't waste your time and vote with your wallet
          I agree entirely, but am kinda amused considering I just ran into a known mpv bug on AMD caused by the driver not supporting all formats: a red-channel output from video playback, when using --vo=dmabuf-wayland.
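
          For anyone curious, a minimal sketch of the two invocations involved (the file name is a placeholder, and whether the red-tinted output appears depends on the exact Mesa/libva stack):

          # Hardware-decoded playback through the dmabuf-wayland output;
          # this is the path where the red-channel-only frames have been reported on AMD.
          mpv --vo=dmabuf-wayland --hwdec=vaapi some-video.mkv

          # Falling back to the regular GPU output sidesteps the format limitation.
          mpv --vo=gpu --hwdec=vaapi some-video.mkv
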
          Last edited by mulenmar; 06 February 2024, 04:16 AM.



          • #15
            Originally posted by Danny3 View Post

            Because open source software tends to have way more collaboration than closed-source software!
            Same reason why Linux has won so many markets and will win the desktop one too, one day.
            Android is also having great success.
            Bu-bu-but what about the leather jacket and the GPU oven?



            • #16
              Originally posted by caligula View Post

              Bu-bu-but what about the leather jacket and the GPU oven?
              Nvidia? Weren't those the guys with the eco-friendly graphics card made of wood? Aptly named Thermi (or something like that)?



              • #17
                Huh. So Vulkan really was born from AMD donating Mantle to Khronos. It makes a lot of sense in hindsight, but I wouldn't have believed it until I found these slides (via the Vulkan Wikipedia article). A closer-to-hardware API won't get used, no matter how good it is, if it locks out (more than) half of the paying customers. AMD did the right thing and opened their idea to everyone.

                Valve is also a huge contributor to the AMD experience on Linux. AMDGPU is open source all right, but RADV is the project where contributing is practical! I wonder how frame rates will be affected by this patch. Congratulations on the efficiency and performance boost! One more example of how any piece of hardware requires quality drivers in order to show its true potential (and I think there's still more to squeeze out)!

                Also, I wouldn't call RDNA2 "obsolete" by any means. I have a 5700 XT myself and it still fulfills all the gaming needs I have, and I have a 4K display. Yeah, I don't play triple-A games; that label is no guarantee of a good game anyway. Then again, I'd like to jump on the raytracing bandwagon at some point, but for now it'll have to wait.

                Exciting times!



                • #18
                  Originally posted by schmidtbag View Post
                  ...RDNA2. It's great to see outdated hardware...
                  RDNA2 "outdated??"
                  Last edited by Type44Q; 06 February 2024, 07:38 AM.



                  • #19
                    Originally posted by mrg666 View Post
                    I am also hoping Mesa adds AMD's open-source resolution scaling and frame generation code in the future.
                    Unfortunately, there's no "dislike" option...



                    • #20
                      Originally posted by Type44Q View Post

                      Unfortunately, there's no "dislike" option...
                      That is okay with me if you want to keep your reasons to yourself. But this is going to happen soon.

