AMD's GPUOpen Announces ADLX Library But For Now It's Windows-Only


  • #11
    Originally posted by skeevy420 View Post

    My RX 580 has done that before...usually once or twice a year and usually after an optional driver install/update. Luckily, for me anyways, a complete reinstall of the driver seems to fix it for a good while. I've used both DDU and the Factory Reset option of the AMD Driver wizard with success to fix that. Good Luck.
    Guys, if you don't know how Windows works, don't make things up. What happened to you is that you enabled updates for third-party applications in Windows Update.
    Drivers delivered via Windows Update come without the control panel and are not compatible with the control panel of the standalone drivers; this is certainly true for AMD. So installing the AMD driver over an existing driver from Windows Update will not make the control panel work; you need to uninstall the Windows Update driver first to get the control panel working.

    Comment


    • #12
      Originally posted by coder View Post
      Does Mesa have APIs for doing any of this stuff? If not, maybe it's time to add them. It would be great to have a vendor-neutral way of doing it!

      Since it sounds like there are now at least two examples of vendor-specific implementations, it shouldn't be too hard to work out most of the API details.
      We never supported ADL (which has been around for years on windows) on our open source Linux drivers because there are already Linux APIs to do the same things (i2c, hwmon, kms, etc.).
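
      As an illustration of those standard interfaces: something like GPU temperature is readable without any vendor library at all, via the hwmon sysfs API. A minimal sketch (the hwmon index and sensor labels vary per system, so the paths here are illustrative):

```python
from pathlib import Path

def read_millidegrees(raw: str) -> float:
    # hwmon temp*_input files report millidegrees Celsius
    return int(raw.strip()) / 1000.0

def hwmon_temps(root: str = "/sys/class/hwmon"):
    # Yield (driver name, temperature in C) for each device exposing temp1_input
    for hw in sorted(Path(root).glob("hwmon*")):
        temp = hw / "temp1_input"
        if temp.exists():
            name = (hw / "name").read_text().strip()  # e.g. "amdgpu"
            yield name, read_millidegrees(temp.read_text())

if __name__ == "__main__":
    for name, t in hwmon_temps():
        print(f"{name}: {t:.1f} C")
```

      Fan control, clocks, and power caps are exposed the same way under the amdgpu hwmon entry, which is why a separate ADL-style library was never needed on Linux.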

      Comment


      • #13
        Originally posted by agd5f View Post

        We never supported ADL (which has been around for years on windows) on our open source Linux drivers because there are already Linux APIs to do the same things (i2c, hwmon, kms, etc.).
        Was the i2c bug for RDNA2 ever fixed? I always make sure lm_sensors doesn't scan i2c because of it.

        Comment


        • #14
          Originally posted by middy View Post
          Was the i2c bug for RDNA2 ever fixed? I always make sure lm_sensors doesn't scan i2c because of it.
          Yes, it was fixed.

          Comment


          • #15
            Originally posted by WannaBeOCer View Post

            AMD has been doing that for years too. Think of all the Forza titles: the Vega 64, for example, was outperforming the GTX 1080 Ti until Nvidia was able to properly optimize their drivers. Then recently with Godfall, even though it used DXR, they didn't enable ray tracing on Nvidia GPUs for months. I'm sure there are newer examples I'm missing. They did that with TressFX in Tomb Raider as well.

            https://www.techpowerup.com/237491/a...rt-7-dx-12?amp
            Hmm, perhaps that was a response to Nvidia doing it first, or maybe there are people with good moral compasses who decided not to screw over another GPU vendor in a dirty way?

            Or maybe they really are as dirty as Nvidia, or dirtier than Nvidia and Intel combined.

            Comment


            • #16
              Originally posted by agd5f View Post

              Yes, it was fixed.
              aww sweet thank you.

              Comment


              • #17
                Originally posted by WannaBeOCer View Post

                AMD has been doing that for years too. Think of all the Forza titles: the Vega 64, for example, was outperforming the GTX 1080 Ti until Nvidia was able to properly optimize their drivers. Then recently with Godfall, even though it used DXR, they didn't enable ray tracing on Nvidia GPUs for months. I'm sure there are newer examples I'm missing. They did that with TressFX in Tomb Raider as well.

                https://www.techpowerup.com/237491/a...rt-7-dx-12?amp
                Isn't a lot of their stuff like this open source? And more to the point, isn't it just a matter of them working hard to optimize this stuff for their own cards/drivers while putting in no work at all to optimize it for Nvidia's?

                Comment


                • #18
                  Originally posted by rabcor View Post

                  Isn't a lot of their stuff like this open source? And more to the point, isn't it just a matter of them working hard to optimize this stuff for their own cards/drivers while putting in no work at all to optimize it for Nvidia's?
                  Note that just because a title is officially sponsored by a certain vendor, it doesn't automatically mean that other vendors are excluded from optimizing the game & engine during production for their respective hardware.

                  One of the best examples for this is the original Crysis trilogy:

                  Both Crysis 1 & 2 were nVidia-sponsored titles, whereas Crysis 3 was AMD-sponsored.

                  Yet Crysis 3's nVidia codepath performs significantly better than AMD's, to the point where, starting with DXVK 1.5, any GPU is reported as an nVidia one to the game:

                  • Crysis 3: All GPUs are now reported as Nvidia GPUs by default. This enables the game to use a fast path with considerably lower CPU overhead, but may cause a small performance hit on certain Nvidia GPUs in GPU-bound scenarios. Note that it might be necessary to disable the loading of nvapi.dll in your wine prefix when using wine-staging.
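
                  For anyone who wants to apply the same override by hand rather than relying on DXVK's built-in game profile, DXVK reads a dxvk.conf file next to the game executable. A minimal sketch (10de is Nvidia's PCI vendor ID; the device ID line is optional):

```ini
# dxvk.conf — report an Nvidia GPU to the game
dxgi.customVendorId = 10de
# Optionally also spoof a specific device ID:
# dxgi.customDeviceId = 1b80
```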

                  Comment


                  • #19
                    Originally posted by rabcor View Post

                    Isn't a lot of their stuuff like this open source? And more to the point, isn't it just a matter of them working hard to optimize this stuff for their cards/drivers while putting in no work at all to optimize it for nvidia's?
                    TressFX wasn't open source when Tomb Raider was released. Godfall's developers were most likely paid by AMD not to enable RT on Nvidia GPUs for X months, since the game was mainly an AMD tech demo; it shocked me that it wasn't enabled at launch, given that the game uses DXR. I don't know why Forza Motorsport 7 ran slower, but it was optimized pretty quickly with driver 387.92; still, it shocked me that it ran poorly on a GTX 1080 Ti at release compared to a Vega 64.
                    Last edited by WannaBeOCer; 14 December 2022, 11:12 PM.

                    Comment


                    • #20
                      Originally posted by middy View Post
                      Was the i2c bug for RDNA2 ever fixed? I always make sure lm_sensors doesn't scan i2c because of it.
                      Uh... what happens if you do (assuming you don't have the fix)? Halt and catch fire?

                      Comment
