
AMD Announces Radeon RX 7900 XTX / RX 7900 XT Graphics Cards - Linux Driver Support Expectations


  • #41
    Originally posted by drakonas777 View Post
    During presentation Lisa said "AM4 is going to continue for long long time". I find it interesting to mention this just after emphasizing the longevity of AM5. Feels like either they are planning to maintain AM4 and ZEN3 as a budget platform far longer than I was expecting or some new SKUs are still coming for AM4. I'd love to see some ZEN3 IO + ZEN4c CCD stuff for example, with sane power limits and affordable price
    Interesting. There was a rumored Zen 3+ destined for AM4. After it was canceled, the only Zen 3+ to emerge were the Ryzen 6000-series laptop CPUs. I suppose they could revive it. I think it was probably chiplet-based, since I think the 6000-series laptop models require LPDDR5 (i.e. different die than what you could use in the DDR4-based AM4 socket).



    • #42
      Originally posted by piotrj3 View Post
      I suspect it is a memory-bandwidth-starved situation. The 6950 XT has a 256-bit memory bus at 18 Gbps; the 7900 XTX is 384-bit at 20 Gbps. That by itself is 66.7% faster, which is extremely close to the 1.7x AMD is claiming.
      Infinity cache bandwidth is like 3.5x, though. That's what counts, if you can use batching. 96 MB isn't huge, but it's big enough.

      In fact, when I saw that 5.3 TB/sec figure, I immediately thought AMD did it for the express purpose of AI inferencing. But, it seems like the tensor horsepower isn't there to truly take advantage of it.
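      The back-of-the-envelope math above is easy to verify. A small sketch, using only the bus widths and per-pin rates quoted in the post (the ~3.5x Infinity Cache figure comes from AMD's own materials and isn't computed here):

```python
# Raw VRAM bandwidth comparison using the figures quoted in the post:
# bus width in bits and per-pin data rate in Gbps; divide by 8 for GB/s.

def vram_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Raw VRAM bandwidth in GB/s (one data pin per bus bit)."""
    return bus_width_bits * pin_rate_gbps / 8

bw_6950xt = vram_bandwidth_gbs(256, 18.0)   # RX 6950 XT
bw_7900xtx = vram_bandwidth_gbs(384, 20.0)  # RX 7900 XTX

ratio = bw_7900xtx / bw_6950xt
print(f"RX 6950 XT:  {bw_6950xt:.0f} GB/s")   # 576 GB/s
print(f"RX 7900 XTX: {bw_7900xtx:.0f} GB/s")  # 960 GB/s
print(f"ratio: {ratio:.3f}x")                 # 1.667x, i.e. ~66.7% faster
```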

      Originally posted by piotrj3 View Post
      Probably AMD, with doubled units, wants to improve performance in workloads outside the typical graphics pipeline; for example, right now everyone is using Nvidia for Blender, compute, etc. With RDNA1, AMD had to lower the price because even when they matched Nvidia's performance/price ratio, they were worse for compute, Blender, ray tracing, etc.
      I doubt it. That's a pretty small market, compared with gaming.

      Originally posted by piotrj3 View Post
      RDNA3 probably aims to close those gaps, and this is why AMD is focusing more on ray tracing,
      RDNA3's ray-tracing performance didn't even improve as much as the RTX 4000 series' did, meaning they're even further behind than RDNA2 was.



      • #43
        Originally posted by Gps4life View Post
        I am on openSUSE Tumbleweed, and can confirm that for an AMD card you don't need to install drivers. Radeon RX 570 in my case.
        But that's an ancient card. It's potentially very different, if you buy new hardware on launch day. In his reviews, Michael is pretty good about listing what he's had to do to enable support for the hardware he's testing. You could look at some of his old reviews of the first GPUs in each generation, to get an idea for what's typically involved.



        • #44
          Originally posted by luno View Post
          - DisplayPort 2.1
          - AV1
          - lower power
          - good price

          didn't know that was possible
          And to the biased sites, writers, and YouTubers, all of that is irrelevant, and Nvidia will keep fleecing everyone right in their faces.

          Hell, I am beyond amazed at how many fans Nvidia has on this site, given how little they do for the FOSS community, while AMD's efforts are treated as worth less than used toilet paper.



          • #45
            Originally posted by coder View Post
            But that's an ancient card. It's potentially very different, if you buy new hardware on launch day. In his reviews, Michael is pretty good about listing what he's had to do to enable support for the hardware he's testing. You could look at some of his old reviews of the first GPUs in each generation, to get an idea for what's typically involved.
            I think you stand a chance of fewer issues with the new cards on a rolling-release distro, which will (cough, should) automatically be using the newest kernel and Mesa. Firmware... that could be trickier; which distros (if any?) pull from the upstream linux-firmware git repo?

            Obviously, LTS releases and historically slower-to-update distros likely won't play well with these new cards without some effort.



            • #46
              Originally posted by luno View Post
              - DisplayPort 2.1
              - AV1
              - lower power
              - good price

              didn't know that was possible
              What's the benefit of DP 2.1 on a gaming monitor when Display Stream Compression is lossless? That supports 4K 240 Hz, 8K 120 Hz, and 10K 100 Hz.

              An RTX 4090 averaged ~150 FPS at 4K and, when gaming, used about the same power as an RX 6950 XT, which has a lower TBP than an RX 7900 XTX: https://tpucdn.com/review/nvidia-gef...wer-gaming.png

              Seems like it's priced accordingly, due to its lower performance than an RTX 4090.



              • #47
                Originally posted by Danny3 View Post
                CoreCtrl is so barebones that it's pretty much useless!
                There's no: Radeon Image Sharpening, Anti-Lag, chill, Boost, Enhanced sync, GPU scaling, color depth, chroma subsampling, etc
                I don't disagree with your assessment, if you consider lacking all those features "barebones", but note that your original post didn't mention any of them. For better or worse, we can't read minds on this forum. As has been mentioned in subsequent posts about e.g. color depth, I also wonder how many of the Linux equivalents of those features are enabled by default (if zero-cost), controlled at a different level of granularity, or configured at the OS level instead of in a control panel.



                • #48
                  Shame Linux won't/can't get any of these special low-latency and driver software improvements that come with the Windows AMD software suite.
                  It's one of those unfortunate losses; even Nvidia cards lose a lot of those features.
                  X11 and Wayland are just too limited in what they can do atm; maybe in 10 years they'll catch up. Hope so.



                  • #49
                    Originally posted by coder View Post
                    Interesting. There was a rumored Zen 3+ destined for AM4. After it was canceled, the only Zen 3+ to emerge were the Ryzen 6000-series laptop CPUs. I suppose they could revive it. I think it was probably chiplet-based, since I think the 6000-series laptop models require LPDDR5 (i.e. different die than what you could use in the DDR4-based AM4 socket).
                    "Warhol Zen 3+" for desktop. Probably an example of some nugget of internal info spiraling out of control once it entered the rumor arena.

                    AMD's promises about AM4 continuing for a long time are vague. It doesn't mean we'll see anything newer than the 5800X3D on that socket. It would be fascinating if they did.

                    Originally posted by WannaBeOCer View Post
                    What's the benefit of DP 2.1 on a gaming monitor when Display Stream Compression is lossless?
                    DSC is "visually lossless", aka lossy. If someone wants to make the case for DSC being good or bad, I'll let them, because I've never tried it. Just pointing out the fact.

                    Also, AMD specifically mentioned 8K165 and 4K480, which likely need DSC *AND* DP2.1 to reach:

                    You might be able to get to 8K165 on HBR3 (DP1.4) by using DSC with 4:2:0 chroma subsampling.
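                    A rough link-budget sketch of why 8K165 likely needs both DSC and DP2.1. The payload rates and the 3:1 DSC ratio below are common ballpark figures, not exact spec values, and blanking intervals are ignored:

```python
# Rough DisplayPort link-budget estimate (ballpark figures, not exact spec values).

def stream_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Approximate uncompressed video bandwidth in Gbps (ignores blanking)."""
    return width * height * hz * bits_per_pixel / 1e9

# 8K @ 165 Hz, 10-bit RGB (30 bits/pixel)
uncompressed = stream_gbps(7680, 4320, 165, 30)

# Approximate usable payload rates after line coding:
HBR3_PAYLOAD = 25.92    # DP 1.4: 4 lanes x 8.1 Gbps, 8b/10b coding
UHBR20_PAYLOAD = 77.37  # DP 2.1: 4 lanes x 20 Gbps, 128b/132b coding

with_dsc = uncompressed / 3  # DSC is commonly quoted around 3:1

print(f"8K165 uncompressed:  {uncompressed:.0f} Gbps")      # ~164 Gbps
print(f"with ~3:1 DSC:       {with_dsc:.1f} Gbps")
print(f"fits DP1.4 (HBR3)?   {with_dsc <= HBR3_PAYLOAD}")   # False
print(f"fits DP2.1 (UHBR20)? {with_dsc <= UHBR20_PAYLOAD}") # True
```

                    Under these assumptions DSC alone doesn't fit in HBR3, which is where 4:2:0 chroma subsampling (roughly halving the per-pixel bits) might close the remaining gap.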



                    • #50
                      Originally posted by jaxa View Post
                      "Warhol Zen 3+" for desktop. Probably an example of some nugget of internal info spiraling out of control once it entered the rumor arena.

                      AMD's promises about AM4 continuing for a long time are vague. It doesn't mean we'll see anything newer than the 5800X3D on that socket. It would be fascinating if they did.

                      DSC is "visually lossless", aka lossy. If someone wants to make the case for DSC being good or bad, I'll let them, because I've never tried it. Just pointing out the fact.

                      Also, AMD specifically mentioned 8K165 and 4K480, which likely need DSC *AND* DP2.1 to reach:

                      You might be able to get to 8K165 on HBR3 (DP1.4) by using DSC with 4:2:0 chroma subsampling.
                      I currently use DSC on my LG 27GN950, which is 4K 160 Hz: https://vesa.org/vesa-display-compression-codecs/ Nothing looks blurry, and calibrated with an X-Rite i1Display Pro it gets Delta E ≤ 1.2.

                      The bandwidth of DP2.1 gets you 8K165/4K480, but the card wouldn't reach those frame rates unless you use FSR or play at low settings. Overwatch 2, for example, has dynamic scaling enabled by default, which is why that game gets high frame rates, but native 4K is lower.

