AMD Radeon RX 7900 XTX + RX 7900 XT Linux Support & Performance

  • NeoMorpheus
    Senior Member
    • Aug 2022
    • 605

    #71
    Originally posted by mdedetrich View Post

    But he wasn't wrong at the time which is the point I was making.
    After this, I won't reply on the same thing because it's OT, but I already responded to you: that was then, this is now, and I am not the only one who noticed the change.
    Last edited by NeoMorpheus; 13 December 2022, 03:37 AM.


    • brad0
      Senior Member
      • May 2012
      • 1019

      #72
      Originally posted by Linuxxx View Post
      A bird has tweeted to me that birdie has apparently been permanently banned here.
      It says so when you click on his name.


      • brad0
        Senior Member
        • May 2012
        • 1019

        #73
        Originally posted by JPFSanders View Post
        Your thoughts are a carbon copy of mine.

        Once AMD released their open driver there was no way back to the proprietary blob.

        What NVidia did back in the day was commendable and technically great (I was an NVidia user for years), but at the end of the day their solution was a closed driver bolted on top of the kernel to solve Linux's lack of proper graphics infrastructure and video device management. As Linux evolved and the graphics infrastructure within the kernel improved, that bolt-on technology became both a kludge and a serious handicap to wider Linux adoption.

        The people who like NVidia liked it because it worked well (and, bugs, kludge and Linux handicap aside, it still works well for most user applications), but the majority of people don't understand how much of a drag the current NVidia software ecosystem, based on the closed driver, is for Linux. NVidia is (in my opinion), along with Gnome (another story for another day), responsible for delaying the adoption of Linux desktop technologies by at least 5 to 10 years. Yes, it is that bad, but oh well, muh 50 FPS using RTX wasting 800 watts will improve gameplay so much.

        The sad part is that there is no reason for this. NVidia could have kept their kernel driver completely open source and maintained their proprietary user-space stack; they could have had their market dominance without choking Linux development. Their proprietary compute infrastructure is great, for example, but there was no reason whatsoever they couldn't have an open-source kernel-space driver and proprietary user-space stacks. They could have made everybody happy and had fewer headaches themselves with an upstream kernel driver.

        The good news is that they seem to have begun to correct this, and in a couple of years, once their new open-source driver matures, I will be first in line to buy an NVidia card and test it.
        +1. I'll never consider any binary drivers again. Way too much of a pain in the ass. I'm not willing to sell my soul for a GPU. These are not magical unicorns.


        • Azrael
          Junior Member
          • Dec 2022
          • 33

          #74
          Any idea how the VA-API performance is on Linux? I heard the idle power consumption is even higher than the RTX 4080's.
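
          (If anyone wants to check the decode side themselves, here is a minimal sketch; it assumes ffmpeg was built with VA-API support, that the card's render node is /dev/dri/renderD128, and the input file name is just a placeholder.)

```python
#!/usr/bin/env python3
"""Rough VA-API decode check via ffmpeg -- a sketch, not a rigorous benchmark."""
import subprocess
import time

# Assumptions: ffmpeg with VA-API support is installed and the GPU's
# render node is /dev/dri/renderD128 -- adjust both for your system.
RENDER_NODE = "/dev/dri/renderD128"
SAMPLE = "sample_4k_hevc.mp4"  # placeholder input file

cmd = [
    "ffmpeg", "-hide_banner", "-loglevel", "error",
    "-hwaccel", "vaapi", "-hwaccel_device", RENDER_NODE,
    "-i", SAMPLE,
    "-f", "null", "-",  # decode only, discard the output
]

start = time.monotonic()
subprocess.run(cmd, check=True)
elapsed = time.monotonic() - start
print(f"Decoded {SAMPLE} in {elapsed:.1f} s via VA-API on {RENDER_NODE}")
```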


          • finalzone
            Senior Member
            • Nov 2011
            • 1219

            #75
            Originally posted by WannaBeOCer View Post
            Ray tracing isn't a gimmick; Nvidia's 3D Vision/AMD's HD3D were gimmicks. Ray tracing has been in development for the last 2 decades. I'd argue that, in regards to games, ray tracing has been the second major graphical step toward photorealism since tessellation was introduced.
            Ray tracing is a much older technology, dating back to the 1950s. The reason it was not used on consumer devices until recently is mainly the power requirement, notably for the CPU and later the GPU. Rasterization was developed to address that shortcoming. Even the 4090 needs more power to render real-time ray tracing and needs upscaling techniques to compensate. The current method is just a brute-force approach for a beautiful lighting effect easily reproducible with rasterization. Full real-time ray tracing is still a long way off for the consumer market.


            Last I checked ray tracing isn’t locked to Nvidia’s hardware. Games either use Microsoft’s DxR or Vulkan’s VK_KHR_ray_tracing. Nvidia’s hardware currently does it better.
            Nobody is talking about ray tracing being locked to Nvidia. The issue is applying Nvidia's specific method (clever use of the driver) to AMD hardware. Real-time ray tracing in the gaming world will only gain adoption when it is available on mainstream hardware (currently the PlayStation 5 and Xbox Series X/S) with an effective approach.


            • WannaBeOCer
              Senior Member
              • Jun 2020
              • 309

              #76
              Originally posted by finalzone View Post
              Ray tracing is a much older technology, dating back to the 1950s. The reason it was not used on consumer devices until recently is mainly the power requirement, notably for the CPU and later the GPU. Rasterization was developed to address that shortcoming. Even the 4090 needs more power to render real-time ray tracing and needs upscaling techniques to compensate. The current method is just a brute-force approach for a beautiful lighting effect easily reproducible with rasterization. Full real-time ray tracing is still a long way off for the consumer market.



              Nobody is talking about ray tracing being locked to Nvidia. The issue is applying Nvidia's specific method (clever use of the driver) to AMD hardware. Real-time ray tracing in the gaming world will only gain adoption when it is available on mainstream hardware (currently the PlayStation 5 and Xbox Series X/S) with an effective approach.
              I'm aware of the age of ray tracing, but in regards to computer graphics it has only been worked on for about 20 years. Intel and IBM used to show off ray tracing running remotely on a cluster of servers around ~15 years ago. Then out of nowhere Nvidia showed off OptiX in 2009 using Quadro cards, then a local demo on a Fermi GPU in 2010. That was a brute-force attempt. With their research they created dedicated RT cores to accelerate BVH traversals, which they introduced 8 years after that first Fermi demo.
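
              (As a rough illustration of what "BVH traversal" means here: the sketch below is my own minimal Python version of the AABB slab test and tree walk that RT cores and Ray Accelerators implement in fixed-function hardware. The names and structure are illustrative only, not taken from either vendor's implementation.)

```python
from dataclasses import dataclass, field

@dataclass
class AABB:
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

@dataclass
class BVHNode:
    box: AABB
    left: "BVHNode | None" = None
    right: "BVHNode | None" = None
    triangles: list = field(default_factory=list)  # non-empty only on leaf nodes

def ray_hits_box(origin, inv_dir, box):
    """Slab test: does a ray (origin, precomputed 1/direction) cross the AABB?"""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box.lo, box.hi):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, inv_dir, hits):
    """Walk the tree, skipping subtrees whose boxes the ray misses.
    This inner loop is the part that dedicated ray-tracing cores accelerate."""
    if node is None or not ray_hits_box(origin, inv_dir, node.box):
        return
    if node.triangles:  # leaf: hand triangles to the intersection/hit shader
        hits.extend(node.triangles)
        return
    traverse(node.left, origin, inv_dir, hits)
    traverse(node.right, origin, inv_dir, hits)

# Example: one leaf box spanning (0,0,0)-(1,1,1), ray from (-1, 0.5, 0.5) along +x.
# 1e9 stands in for 1/0 on the axis-aligned direction components.
leaf = BVHNode(box=AABB((0, 0, 0), (1, 1, 1)), triangles=["tri0"])
found = []
traverse(leaf, origin=(-1.0, 0.5, 0.5), inv_dir=(1.0, 1e9, 1e9), hits=found)
print(found)  # ['tri0']
```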



              Nvidia's method was creating dedicated cores; AMD followed with RDNA2, introducing their "Ray Accelerators." Are you saying AMD's 2nd-generation Ray Accelerators match Nvidia's 3rd-generation RT cores, but AMD's drivers are lacking?

              I understand what's keeping new technologies from going mainstream. The other issue is that games are still being developed for the PS4/Xbox One as well.

              Personally, I'm only purchasing DX12/Vulkan titles with ray tracing and HDR. If a title lacks the latest API, HDR or ray tracing, I won't buy it. The next title I'll be buying is Forspoken.


              • Berniyh
                Senior Member
                • Oct 2007
                • 479

                #77
                Originally posted by Grinness View Post

                rx 6800 dual monitor, youtube playing in firefox: 15 - 16 W ...
                Well, without context that info is useless. The consumption depends on the resolution and refresh rate of the displays, on SDR vs. HDR, and of course also on the resolution of the video.
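
                (For anyone who wants a number rather than a guess: on amdgpu the board power can be read straight from the hwmon sysfs interface. A minimal sketch, assuming the Radeon is card0 and the driver exposes power1_average, which recent amdgpu kernels do.)

```python
import glob

# Assumption: the discrete Radeon is card0 -- widen the glob if you have
# more than one GPU. amdgpu reports power1_average in microwatts.
nodes = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")
if not nodes:
    raise SystemExit("No power1_average node found -- is this an amdgpu device?")

with open(nodes[0]) as f:
    microwatts = int(f.read().strip())

print(f"GPU power draw: {microwatts / 1_000_000:.1f} W")
```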


                • dimko
                  Senior Member
                  • Dec 2009
                  • 932

                  #78
                  Originally posted by WannaBeOCer View Post

                  With 27" 1440p OLEDs being released, I think it's time they shift their focus to improving character locomotion and physics with AI, instead of chasing this make-believe 8K gaming gimmick.
                  This or more affordable 3d glasses, etc.


                  • Lycanthropist
                    Senior Member
                    • Jan 2019
                    • 176

                    #79
                    I really would like to know how well "Horizon: Zero Dawn" runs on those cards. I currently own a 5700XT and can achieve 4K@60 only at low settings.


                    • catpig
                      Junior Member
                      • Jan 2022
                      • 40

                      #80
                      Originally posted by dimko View Post
                      I have this crazy idea: HOW ABOUT DEVS STOP SCRATCHING ARSE and start developing better games with better engines, so we don't have to upgrade every year with 2k video cards.

                      Also, Nvidia will happily create newer gimmicks that only work on their overpriced hardware. It's a race which you, as a customer, WILL NEVER WIN.
                      Why would you upgrade every year? I have never done that, and even people with very different taste in games and resolution than mine don't need to do that, with pretty rare exceptions. For most (NOT all!) people the only reason to upgrade every year is bragging, lack of knowledge or both.

                      Edit: And by "never" I mean "since I started playing on PCs in the early 90s".
                      Last edited by catpig; 13 December 2022, 08:56 AM. Reason: added edit to clarify

