AMD's GPUOpen Announces ADLX Library But For Now It's Windows-Only


  • #21
    Originally posted by Espionage724
I've always run sensors-detect and pressed Y to everything with a 6600 XT, and had no issues on Fedora 35-37, Ubuntu 22.04+, and openSUSE TW.
    Thanks for the data point, though the original question sounded pretty ominous...



    • #22
      Originally posted by Weasel View Post
      GUys BOyCOTt nVidIA it's an EvIl COMpaNY. CoMpleTElY AntI-LInUx witH shIT sUPPorT.

BuY aMd theY aRe So gOod WitH LinUx AnD ImpeCcabLe sUppoRT.


I hope I don't have to write /sarcasm for the Wayland fanboys. Actually, I'm sure there's a 90% correlation between Wayland fanboys and the above mentality.
      Here's a whetstone so you can grind your axe in peace ^^



      • #23
        Originally posted by bug77 View Post
        Was anyone asking/waiting for this?
        Yes.
If you ask me why, I will point you to the Nvidia API documentation for the functionality I would like to see on AMD cards as well.



        • #24
          Originally posted by Linuxxx View Post
          Yet Crysis 3's nVidia codepath performs significantly better than AMD's, to the point where starting with DXVK 1.5, any GPU will be reported as being a nVidia one to the game:
Meant to reply to this sooner; I don't think it's that simple:
          • When Crysis 3 came out, Radeons performed very well against their GeForce counterparts compared to other games, implying they were well fed or that the engine was well threaded (quite the opposite of Crysis 1's massive single-thread CPU bottleneck). But we can't say the NV path "performs significantly better", because that claim is specific to the years-later DXVK translation.
          • I think I remember benchmarks showing that over 4 CPU cores were used, meaning lots of parallel work to feed the GPU (same as the Frostbite engine eventually; fun fact: one of the CryEngine leads moved on to become a Frostbite lead).
          • So that means higher CPU usage, maybe due to the complexity of the then-new GCN architecture vs. Kepler, or maybe due to NV's driver having years of multithreading support (offloading the game's work onto the driver: simple game code, complex driver code on the NV path).
          • But DXVK isn't (commonly) used on Windows, nor is this a DX11 driver comparison anymore. All DXVK said was that the translated-to-Vulkan(-on-Linux?) result is slower in CPU-bound scenarios (no CPU model mentioned).
          • Vulkan drivers are by default relatively modern code, with a lot of complexity to feed a parallel GPU at a low level. Plus it's been stated that NV has been moving from hardware dispatchers to software ones over the generations; I'd imagine that makes the Mesa driver more similar to NV's driver than to AMD's (especially 2013's). Presumably the AMD path does redundant work, or some other context switching/bottlenecking, when combined with DXVK.
          • All we have is a quirky side effect of closed-source 2013 code, designed for 2013 architectures and Windows DX11 drivers, being translated into Vulkan and run on newer architectures (though the DXVK commit mentioned Kepler lost GPU-limited peak performance, which is strange: how can a spoof to the same vendor result in any difference?).
          This reminds me of Need for Speed: Shift, which had very poor Radeon performance that took a couple of game (not driver) patches to improve. Plus I was using CrossFire at the time, so I needed both game and driver updates.
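          For reference, the vendor spoof Linuxxx mentions can also be applied by hand through DXVK's configuration file. A minimal sketch of a `dxvk.conf`, assuming it sits in the game's working directory; the option names follow DXVK's documented config format, and the device ID value is purely illustrative:

          ```ini
          # dxvk.conf — placed next to the game executable.
          # Report an Nvidia PCI vendor ID (0x10de) to the game,
          # regardless of which GPU is actually present.
          dxgi.customVendorId = 10de

          # Optionally also spoof a specific device ID (illustrative value):
          # dxgi.customDeviceId = 1b80
          ```

          Starting with DXVK 1.5 this spoof is applied automatically for Crysis 3, which is exactly the quirk being discussed.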
