AMD Radeon RX 7900 XTX + RX 7900 XT Linux Support & Performance

  • #81
    Originally posted by crowen View Post

    To be honest, I think graphics are somewhat overrated in games nowadays. A lot of nice and fun games with some depth and/or great stories (Dwarf Fortress, Rimworld, Stellaris, HOI, (...)) do not have top-notch, state-of-the-art graphics but are fun to play. Few games manage both, like The Witcher 3 or some of the recent Rockstar games... it really is a rare breed. Watching e.g. all the failed attempts to make a game as fun to play as Jagged Alliance 2 (released 1999), haha omg... Oh well, there is always hope. Same goes for movies: lots of CGI and action scenes do not necessarily result in a movie that is interesting to watch.
    Same here. The most demanding game I've played recently is either Civ6 or Anno 1800, and both work sufficiently well on my RX 470, at least for my 1080p60 monitor and current city size. Can I set max graphics? No. Does it make a difference? Some. Do I care enough to dump hundreds of euros on a GPU? Definitely not. So I'm looking forward to the RX 7500 and RTX 4050.

    @Michael: Great to see you re-testing not just the last generation for comparison, but the second-to-last generation as well!



    • #82
      Originally posted by chithanh View Post
      Is the maximum desktop size of the RX 7000 cards still 16384x16384 pixels, or has the limit been removed?
      FWIW, with Wayland, any such HW limit can only affect the size of a single window / output. It does not limit how a Wayland compositor can combine multiple outputs to a single desktop.



      • #83
        Originally posted by catpig View Post

        Why would you upgrade every year? I have never done that, and even people with very different taste in games and resolution than mine don't need to do that, with pretty rare exceptions. For most (NOT all!) people the only reason to upgrade every year is bragging, lack of knowledge or both.

        Edit: And by "never" I mean "since I started playing on PCs in the early 90s".
        You missed the point. I don't upgrade every year either. But Nvidia REALLY wants you to, and it will push technologies that make developers create games requiring semi-yearly upgrades. We just got FSR and DLSS, and now there is DLSS 3, which doesn't work on the previous two generations of Nvidia GPUs (except with hacks). They have NOT released the technology for older GPUs because they want you to buy new ones. Some dude got DLSS 3 working on a 2080 or some such GPU.



        • #84
          Originally posted by Lycanthropist View Post
          I really would like to know how well "Horizon: Zero Dawn" runs on those cards. I currently own a 5700XT and can achieve 4K@60 only at low settings.
          5700 XT owner here; there is something wrong with your setup. I get a much smoother experience.
          Out of curiosity, can you compile a kernel? How many CPU cores?



          • #85
            Originally posted by JPFSanders View Post

            Your thoughts are a carbon copy of mine.

            Once AMD released their open driver there was no way back to the proprietary blob.

            What NVidia did back in the day was commendable and technically great (I was an NVidia user for years), but at the end of the day their solution was a closed driver bolted on top of the kernel to work around Linux's lack of proper graphics infrastructure and video device management. As Linux evolved and the graphics infrastructure within the kernel improved, that bolt-on technology became both a kludge and a serious handicap to wider Linux adoption.

            The people who like NVidia like it because it works well (and bugs, kludges and the Linux handicap aside, it still works well for most user applications), but the majority of people don't understand how much of a drag the current NVidia software ecosystem, based on the closed driver, is for Linux. NVidia (in my opinion) is, along with Gnome (another story for another day), responsible for delaying the adoption of Linux desktop technologies by at least 5 to 10 years. Yes, it is that bad. But oh well, muh 50 FPS using RTX wasting 800 watts will improve gameplay so much.

            The sad part is that there was no reason for this. NVidia could have made their kernel driver completely open source and maintained their proprietary user-space stack; they could have kept their market dominance without choking Linux development. Their proprietary compute infrastructure is great, for example, and there was no reason whatsoever they couldn't pair an open-source kernel-space driver with proprietary user-space stacks. They could have made everybody happy and had fewer headaches themselves with an upstream kernel driver.

            The good news is that they seem to have begun to correct this and in a couple of years once they mature their new open-source driver I will be first in line to buy an NVidia card and test it.
            This is me also. The 7900 XTX release is solid from my perspective. I wish AMD were more competitive with the 4090 in ray tracing and rasterization performance, but for me those numbers are moot: the AMD open-source driver support trumps the extra performance the 4090 provides. I've been rocking the RX 480 for years now and couldn't be happier. I will be moving to the 7900 XTX once the OEM cards become available.



            • #86
              Originally posted by dimko View Post

              You missed the point. I don't upgrade every year either. But Nvidia REALLY wants you to, and it will push technologies that make developers create games requiring semi-yearly upgrades. We just got FSR and DLSS, and now there is DLSS 3, which doesn't work on the previous two generations of Nvidia GPUs (except with hacks). They have NOT released the technology for older GPUs because they want you to buy new ones. Some dude got DLSS 3 working on a 2080 or some such GPU.
              Yeah, but there comes a point where you have to ask the individuals in question whether they realise that they live in a capitalist society, that NV is a for-profit company, and that you should never take at face value the claims of anyone trying to get money from or through you.
              Of course some people who had bought a 2080 then bought a 2080 Ti, then a 3090, and now a 4090... but if someone has that kind of money to burn and can't think of a wiser use, my sympathy is rather limited, just like with people who buy a new car every year just to show off their destructive egomania (again, exceptions confirm the rule: if you're driving 400,000 km/yr that's obviously different).
              As for DLSS, that came out in early 2019 on the RTX 2000 series (the cards themselves launched in late 2018). And I am unaware of any game requiring it; I can't even see how a game could require it, tbh, since it's not that kind of technology. As for FSR, that works on any GPU afaik, even those from competing vendors (at least the original version).
              Also bear in mind that, except for a typically microscopic number of "specially chosen" (read: subsidised by the hardware vendor) flagship titles, it takes years for new HW features to be used, let alone required, by mainstream software. And that will never change, as it's the result of basic laws of economics: if you make an expensive title, you need a sufficient market size to make your profit. If you restrict your market to early adopters of new hardware, that's not going to happen (again, with few exceptions). How many games require RT hardware, which launched back in late 2018? Not counting stuff like Quake II RTX, which is merely a patched version of a game that runs fine without RT hardware.



              • #87
                Originally posted by castlefox View Post
                Can anyone tell me if the RX 7900 video card would be overkill for my aging system?
                CPU: Ryzen 5 1500
                RAM: 32GB (but not at a super fast clock, because I'm using all 4 DIMM slots; I didn't know my mobo couldn't handle faster RAM speeds with 4 DIMM slots populated).
                I have a 1440p screen with a high refresh rate (144 Hz).
                Extreme overkill for 1440p and those specs, holy heck.
                A 6700/6700 XT would largely suffice for 1440p in almost all games. No decent ray tracing, but I'd buy a 5600X/5800X long before spending the dough on a 7000-series card.



                • #88
                  Originally posted by dimko View Post
                  HOW ABOUT DEVS STOP SCRATCHING ARSE
                  Preposterous, being a dev is squarely about arse scratching.



                  • #89
                    Originally posted by Michael View Post

                    He was repeatedly berating other users and causing a lot of friction; he was easily the most 'reported' user in the forums for his posts.
                    I assumed that was his method of communication!



                    • #90
                      Originally posted by dimko View Post

                      You missed the point. I don't upgrade every year either. But Nvidia REALLY wants you to, and it will push technologies that make developers create games requiring semi-yearly upgrades. We just got FSR and DLSS, and now there is DLSS 3, which doesn't work on the previous two generations of Nvidia GPUs (except with hacks). They have NOT released the technology for older GPUs because they want you to buy new ones. Some dude got DLSS 3 working on a 2080 or some such GPU.
                      AFAIK Nvidia stated that this feature may be enabled on the previous generation in the future, but the Optical Flow Accelerator on older generations may not be good enough to provide a good gaming experience. They haven't said that it's impossible.

                      And BTW, this only shows how Nvidia is pushing GPUs forward while AMD is just a copycat: CUDA in 2006, AI tensor units, hardware ray tracing.
                      Now we learn that all Nvidia RTX cards have a hardware Optical Flow Accelerator, even the RTX 2000 series!
                      It's something you would otherwise use only for video, or emulate using AI cores (see RIFE: Real-Time Intermediate Flow Estimation for Video Frame Interpolation), and they put it to good use with DLSS 3.
                      AMD is years behind, trying to emulate everything on shaders, and even when they manage to make the hardware, they fail to deliver working drivers.

                      When they released RTX back in 2018, I wondered why they would waste die space on something as useless as AI and RT. Fast forward to 2022, and AI and RT are big things for GPUs (I'm not talking about games; see PyTorch, TensorFlow, Blender, etc.). AMD is focusing strictly on gaming, with ROCm not officially supported on any consumer cards, while Nvidia delivers complete working solutions beyond gaming. Both my 8600GT and 750Ti could run CUDA years ago, fully supported.

                      I don't like Nvidia and I have an RX 6800 XT (a superb card for Linux, and ROCm seems to be working somehow), but this is how I see it.
                      Last edited by sobrus; 13 December 2022, 01:10 PM.
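[Editor's note] The flow-based frame interpolation discussed above (DLSS 3 frame generation, RIFE) boils down to warping one frame partway along a dense optical-flow field and blending. A minimal NumPy sketch of that core idea; everything here (the function name, the nearest-neighbour warp, the flow convention) is illustrative and not any real DLSS or RIFE API:

```python
import numpy as np

def interpolate_midpoint(frame_a, frame_b, flow):
    """Synthesize an in-between frame by warping frame_a halfway along
    a dense optical-flow field, then blending with frame_b.

    flow[y, x] = (dy, dx) is assumed to describe per-pixel motion from
    frame_a to frame_b. Real DLSS 3 / RIFE pipelines are vastly more
    elaborate (occlusion handling, learned refinement); this only
    illustrates the warp-and-blend idea.
    """
    h, w = frame_a.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Pull each midpoint pixel back by half the flow vector into frame_a
    # (nearest-neighbour sampling for brevity; real warps are bilinear).
    sy = np.clip(np.rint(ys - 0.5 * flow[..., 0]).astype(int), 0, h - 1)
    sx = np.clip(np.rint(xs - 0.5 * flow[..., 1]).astype(int), 0, w - 1)
    warped = frame_a[sy, sx]
    return 0.5 * warped + 0.5 * frame_b
```

With a zero flow field this degenerates to a plain 50/50 blend of the two frames; the quality of a real interpolator lives almost entirely in how the flow is estimated and how occlusions are handled, which is exactly the part the dedicated hardware accelerates.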

