AMD Radeon RX 7900 XTX & RX 7900 XT Arrive For Linux Testing


  • #31
    Originally posted by pete910 View Post

    They are, which is why the "HDMI 2.1 support" was always odd to me, as it works for me bar HDR.

    Guess it's for High Hz 4k due to bandwidth.
    Last I checked, HDR doesn't work on Linux. My gaming PC is connected to an LG C9, using an RX 6900 XT.

    HDR works on Windows and my Xbox, but not Linux.

    Also, thanks to Nvidia shenanigans, LG removed FreeSync and instead added fake G-sync.

    Comment


    • #32
      Originally posted by Linuxxx View Post

      Thanks a lot!

      Just had a look & found out it's officially Windows-only right now, but running it through Proton should be feasible thanks to the native Vulkan renderer...
      A fun fact is that RTX Remix apparently uses DXVK for its DirectX-to-Vulkan translation... which was obviously made with Linux in mind.

      Comment


      • #33
        Originally posted by NeoMorpheus View Post

        Last I checked, HDR doesn't work on Linux. My gaming PC is connected to an LG C9, using an RX 6900 XT.

        HDR works on Windows and my Xbox, but not Linux.

        Also, thanks to Nvidia shenanigans, LG removed FreeSync and instead added fake G-sync.
        Have a CX; it states G-Sync on the box, but in the TV it states FreeSync Premium.

        Comment


        • #34
          Originally posted by pete910 View Post

          Have a CX; it states G-Sync on the box, but in the TV it states FreeSync Premium.
          Yes, everything after the C9 gets both, but they gimped the C9.

          You can do a quick test: grab a second drive, install W10 or W11 (eewww), install a game, and test HDR and whatever else.

          Also, the HDMI cable used is very important, and even when cables are sold as HDMI 2.1, they might not actually be.

          Comment


          • #35
            Originally posted by Mahboi View Post
            Best example is this: I switched a video to 4K once and found the image "better" but nothing to write home about. 2 minutes in, I switch back to 1080p, and everything looked blurry and terrible to my eyes. Didn't take 2 minutes to get used to it and I can never go back.
            That's a flawed experiment, since streaming platforms dynamically adjust compression quality and might have started the 1080p stream at a low data rate when you first switched to it.

            BTW, my experience with 4K is that I use the same fonts everywhere. I went from a 27" 1440p monitor to a 32" 4K one. The size difference wasn't enough, as I discovered; I'd have had to go up to about 40" @ 4K to get approximately the same DPI. Anyway, I was shocked at how much more screen real estate there was. It's more than you'd probably expect.
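
            Here's the quick PPI math behind that 40" figure (my own back-of-the-envelope numbers, nothing authoritative):

            ```python
            from math import hypot

            def ppi(width_px, height_px, diagonal_in):
                # pixels along the diagonal divided by the physical diagonal in inches
                return hypot(width_px, height_px) / diagonal_in

            print(ppi(2560, 1440, 27))                       # ~109 PPI on the old 27" 1440p
            print(ppi(3840, 2160, 32))                       # ~138 PPI on the 32" 4K
            print(hypot(3840, 2160) / ppi(2560, 1440, 27))   # ~40.5" of 4K needed to match ~109 PPI
            ```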

            One thing I noticed about 4k is that it actually takes a lot of mouse movement to get around. And that's just one screen. Seems like it could be an issue for 8k, not to mention 2x 8k.

            Comment


            • #36
              Originally posted by Mahboi View Post

              Crap, that's why my new 4K monitor won't accept input over 120Hz over HDMI 2.1 despite both the card and monitor having it?

              Edit: buy 4K 32 inch monitors people, they will make you never want to read on a 1080p monitor ever again. I actually bought a second one right after because my eyes cry when I'm trying to read the old one. May just be Linux going crazy with scaling or something though.

              Text is so crisp and clear, reading code went from "is my eyesight getting that worse" to "wow this is so easy to read". Videos are still mostly in 1080p and for 95% of games, you won't see a difference without a top end card, but for programming & reading the web, my god, go once, never return.

              Best example is this: I switched a video to 4K once and found the image "better" but nothing to write home about. 2 minutes in, I switch back to 1080p, and everything looked blurry and terrible to my eyes. Didn't take 2 minutes to get used to it and I can never go back.
              Same here. I have a 4K 165 Hz monitor on a DP 1.4 port and can only run it at 120 Hz on Linux, while with the same GPU, cable, and port, Windows is able to do 4K 165 Hz. I'm assuming this is because Linux doesn't support DSC yet.
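
              That would fit the rough bandwidth math, at least as I understand it (ballpark numbers, ignoring blanking overhead):

              ```python
              # DP 1.4 (HBR3): 4 lanes x 8.1 Gbit/s; 8b/10b encoding leaves ~80% as payload
              dp14_payload_gbps = 4 * 8.1 * 0.8             # ~25.9 Gbit/s

              def video_gbps(w, h, hz, bits_per_pixel=24):  # 8 bpc RGB, active pixels only
                  return w * h * hz * bits_per_pixel / 1e9

              print(video_gbps(3840, 2160, 120))  # ~23.9 Gbit/s -> (just) squeezes through without DSC
              print(video_gbps(3840, 2160, 165))  # ~32.8 Gbit/s -> over the link limit, needs DSC
              ```

              So 4K 120 Hz barely fits uncompressed once blanking is accounted for, while 165 Hz simply can't be done over DP 1.4 without DSC.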

              Comment


              • #37
                Originally posted by NeoMorpheus View Post

                Yes, everything after the C9 gets both, but they gimped the C9.

                You can do a quick test: grab a second drive, install W10 or W11 (eewww), install a game, and test HDR and whatever else.

                Also, the HDMI cable used is very important, and even when cables are sold as HDMI 2.1, they might not actually be.
                I know the cable can play a big part; I discovered that with the DP cables for my Sammy 1440p 144 Hz monitor. Ironically, the expensive cable that I got didn't work but the £8 one did.

                Comment


                • #38
                  Fingers crossed for good RT.

                  Comment


                  • #39
                    Michael


                    Do you happen to have plans to test the ML performance of the 7900 XTX?

                    In particular, Stable Diffusion. Would be interesting to see how it works.

                    This is interesting because RDNA3 is supposed to have 2:1 FP16 to FP32 theoretical performance. The FP16 number is higher than the RTX 4090's.

                    The 7900 XTX's FP16 is 123 TFLOPS:
                    AMD Navi 31, 2498 MHz, 6144 Cores, 384 TMUs, 192 ROPs, 24576 MB GDDR6, 2500 MHz, 384 bit


                    The 4090 gets 83 TFLOPS:
                    NVIDIA AD102, 2520 MHz, 16384 Cores, 512 TMUs, 176 ROPs, 24576 MB GDDR6X, 1313 MHz, 384 bit
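
                    For what it's worth, here's roughly where those two numbers come from (my own back-of-the-envelope math, assuming dual-issue FMA on RDNA3 and non-tensor FP16 at 1:1 on Ada):

                    ```python
                    def tflops(shaders, clock_ghz, flops_per_shader_per_clock):
                        # an FMA counts as 2 FLOPs
                        return shaders * clock_ghz * flops_per_shader_per_clock / 1000

                    # 7900 XTX: 6144 shaders @ ~2.5 GHz, dual-issue FMA = 4 FP32 FLOPs/clock, 2:1 FP16 = 8
                    print(tflops(6144, 2.5, 4))    # ~61 TFLOPS FP32
                    print(tflops(6144, 2.5, 8))    # ~123 TFLOPS FP16

                    # RTX 4090: 16384 shaders @ ~2.52 GHz, FMA = 2 FLOPs/clock, non-tensor FP16 runs at 1:1
                    print(tflops(16384, 2.52, 2))  # ~83 TFLOPS FP32 and FP16
                    ```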


                    And on the other hand, there is a rumor of some kind of inefficiency:


                    I don't trust this kopite7kimi guy, but Kepler_L2 seems to know what he's talking about.



                    I got my 5700 XT working using instructions from here:


                    Note I didn't use the Docker solutions. I just installed the ROCm runtime Arch packages and then installed AUTOMATIC1111's web UI:

                    https://github.com/AUTOMATIC1111/stable-diffusion-webui


                    Of course I had to download the Stable Diffusion 1.4 model (this was a month ago; newer versions are available now).

                    There is another tutorial but I didn't try this myself: https://github.com/AUTOMATIC1111/sta...un-on-AMD-GPUs

                    The 7900 XTX is most likely not supported by ROCm yet, so you may need to use the HSA_OVERRIDE_GFX_VERSION=10.3.0 trick (see here: https://old.reddit.com/r/StableDiffu...d_gpu/imp7bx2/).
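
                    For anyone trying it, the trick looks roughly like this from Python (a minimal sketch, assuming a ROCm build of PyTorch; the variable has to be set before the ROCm runtime initializes):

                    ```python
                    import os
                    # Report the GPU as gfx1030 so ROCm loads kernels for otherwise-unsupported cards
                    # (the 10.3.0 value is the workaround from the Reddit thread linked above)
                    os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

                    import torch  # must be imported after the env var is set

                    print(torch.cuda.is_available())      # True if the GPU is usable through ROCm/HIP
                    print(torch.cuda.get_device_name(0))  # should print the Radeon model
                    ```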

                    Also need to figure out how to make it use FP16. Maybe enable "--half-precision"?
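
                    I haven't checked which flag the webui actually wants, but as a sanity check outside the webui, the diffusers library can run the SD 1.4 checkpoint in FP16 like this (just an alternative sketch, not the webui option):

                    ```python
                    import torch
                    from diffusers import StableDiffusionPipeline

                    # Load the SD 1.4 weights in half precision; "CompVis/stable-diffusion-v1-4" is the hub id
                    pipe = StableDiffusionPipeline.from_pretrained(
                        "CompVis/stable-diffusion-v1-4",
                        torch_dtype=torch.float16,
                    )
                    pipe = pipe.to("cuda")  # ROCm GPUs are exposed through the "cuda" device in PyTorch

                    image = pipe("a photo of an astronaut riding a horse").images[0]
                    image.save("out.png")
                    ```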

                    Thanks

                    Comment


                    • #40
                      Originally posted by pete910 View Post
                      Personally think people will be disappointed in the perf. Hoping they are stonkers in Linux, don't care how well in Windows myself.




                      Have a lg oled running fine at 4k/120 with my current 6800xt on the oss driver
                      But VRR is not working, right?

                      The GPU and TV always work at 120 Hz, even for movies or games that don't reach such a high framerate?

                      Comment
