AMD Radeon RX 7900 XTX + RX 7900 XT Linux Support & Performance

  • Mahboi
    Senior Member
    • May 2022
    • 204

    #91
    Originally posted by WannaBeOCer View Post
    I’d argue and say ray tracing has been the second major graphic change to photo realism since tessellation was introduced in regards to games.

    Aside from ray tracing being physically accurate it will save game developers hours. Which can be spent on the actual story/game bugs.
From what I've heard, RT is still young enough that we are, in fact, years away from it saving any development time. The toolchain is too new to meaningfully accelerate dev work compared to the very mature rasterisation pipeline.

I'd personally bet that RT will go from a gimmick/side toy to a proper, mandatory part of gaming around the end of the decade, somewhere between '27 and '30.
Thoughts?


    • Mahboi
      Senior Member
      • May 2022
      • 204

      #92
      Originally posted by Paradigm Shifter View Post
      I appreciate your optimism, but I don't think "saving hours on graphics development" will translate into "spent those hours on other areas"... I think it'll be "push out the door faster and for more profit" (while saying "RT exclusive!" so they can charge more...?).

      Sadly.
As a former Overwatch fan (still one, but Overwatch 2 has obliterated the franchise, IMO), I am shocked by how naive most people are about how game dev studios actually work. Mick Gordon's recent statement about his treatment by id Software over DOOM Eternal says a lot, but most people still blame "incompetent devs" or "communication misunderstandings" rather than conscious decisions by fraudulent, shortsighted managers.


      • Mahboi
        Senior Member
        • May 2022
        • 204

        #93
        Originally posted by xfcemint View Post
        On the opposite side, you have "the loud majority" on various tech-elitist forums, who use super hi-res monitors (4K),
        So me then. I just bought my second 4K monitor.

The text reading experience is incomparable. With 1080p I felt like my eyesight was weakening every year (it probably was). With 4K I can read code all day with 0 eye strain.

I don't have the card to run games at that resolution, but you should try 4K at some point. I don't play with LEDs or extreme cooling or the like, but 4K is not a small improvement.


        • NeoMorpheus
          Senior Member
          • Aug 2022
          • 601

          #94
          Originally posted by Mahboi View Post


I'd personally bet that RT will go from a gimmick/side toy to a proper, mandatory part of gaming around the end of the decade, somewhere between '27 and '30.
Thoughts?
Agreed, but for some reason you have these crazy people demanding RT hardware now, for games that simply don't exist.

To hear them, you would swear Steam has thousands of RT-enabled games and that every one of them is an absolute must-buy masterpiece.

This obsession is really bizarre.

Then again, I think it's just an excuse to convince themselves that they need to pay for overpriced Nvidia GPUs.


          • WannaBeOCer
            Senior Member
            • Jun 2020
            • 308

            #95
            Originally posted by Mahboi View Post
From what I've heard, RT is still young enough that we are, in fact, years away from it saving any development time. The toolchain is too new to meaningfully accelerate dev work compared to the very mature rasterisation pipeline.

I'd personally bet that RT will go from a gimmick/side toy to a proper, mandatory part of gaming around the end of the decade, somewhere between '27 and '30.
Thoughts?
Ray tracing is definitely easier to implement, which is why we're seeing many indie games adopt it. The issue is that larger studios are still targeting the Xbox One/PS4, which means implementing ray tracing would just add another task to their list.

I never played it, but I recall Stay in the Light, an indie title that required ray tracing because the core gameplay depended on it. It was an interesting implementation of ray tracing; I hope we see more titles use it in this manner.

Whenever support for the Xbox One/PS4 ends, I believe in 2028, we'll see a ramp-up of ray tracing.


            • WannaBeOCer
              Senior Member
              • Jun 2020
              • 308

              #96
              Originally posted by xfcemint View Post
              What you have described appears to me as likely a consequence of the "choice-supportive bias" (Wikipedia).
It is highly unlikely that "your eyesight was wearing out" because of a 1080p monitor.

              Other explanations for "The text reading experience is incomparable" and "can read code all day with 0 eye strain" are much more likely to be related to the size of the monitor, the luminance and contrast settings, the distance of the monitor from the user, and the size and type of the font that you have selected. It is unlikely to be significantly correlated to the 1080p vs 4K transition.

              Notice that I'm not saying that 4K monitors have no benefits at all. If you like the monitor to cover a very large part of the field-of-view, or to keep it close to your eyes, or you like a very large number of lines of a single document to be visible, then 4K is the right tech.

              So, it depends on personal preferences about the desired setup.



I like to keep my monitors 100 cm away from my eyes, so that the desk surface in front of me is large, while still being able to reach the monitor's controls by hand. At that distance, 24" @ 1080p is just fine, especially when working on two or three monitors. The downside is that games and movies do not cover a very large field-of-view, but I prefer to set up a separate gaming/console system for that purpose.

              EDIT: I forgot to mention, I suffer from a mild "far-sightedness" (Wikipedia), but I still don't need glasses. That might have influenced my decisions. On the other hand, many people suffer from mild "near-sightedness". They can actually get away with less expensive, smaller monitors than what I require.
I could see someone getting eye strain from a low-PPI monitor due to fuzzy/blurry text. I stick with 27”/4K and haven't had an issue.

Every user I've given a 27” 1080p monitor to has complained about the blurry text.


              • AdrianBc
                Senior Member
                • Nov 2015
                • 297

                #97
                Originally posted by xfcemint View Post
                What you have described appears to me as likely a consequence of the "choice-supportive bias" (Wikipedia).
It is highly unlikely that "your eyesight was wearing out" because of a 1080p monitor.

                Other explanations for "The text reading experience is incomparable" and "can read code all day with 0 eye strain" are much more likely to be related to the size of the monitor, the luminance and contrast settings, the distance of the monitor from the user, and the size and type of the font that you have selected. It is unlikely to be significantly correlated to the 1080p vs 4K transition.

                Notice that I'm not saying that 4K monitors have no benefits at all. If you like the monitor to cover a very large part of the field-of-view, or to keep it close to your eyes, or you like a very large number of lines of a single document to be visible, then 4K is the right tech.

                So, it depends on personal preferences about the desired setup.

                For playing games or watching movies, it may be claimed that the benefits of 4k vs. 1080p are not essential.

                However for reading text, it is objectively true that "the text reading experience is incomparable".

At desktop monitor sizes and 1080p resolution, there are too few pixels per character cell to correctly render any beautiful typeface.

                Only the simplified typefaces with uniform line thickness and without contrast between thin lines and thick lines can be rendered acceptably well, e.g. Helvetica, Arial, Tahoma, Courier and many other such typefaces.

These simplified typefaces were created in the century between the Napoleonic Wars and WWI for cheap printing on bad paper, e.g. for advertising and for typewriting, and they became excessively popular in computer applications because display manufacturers spent two decades reluctant to push resolution above 1080p. I consider them too ugly, so I avoid them.

                Therefore, because I use mainly beautiful typefaces and I read a lot, I use only 4k monitors.

Anyone who is not aware of how great the differences are between the true forms of the letters and their rendering on a 1080p monitor should compare a page printed by a good laser printer with how that page is displayed on the monitor.


                • jaxa
                  Senior Member
                  • Jul 2020
                  • 350

                  #98
                  Reports from sources close to Moore's Law Is Dead suggest that AMD's driver team has been tasked with working on quick fixes for some of the bugs plaguing the 7900 XTX models. Hopefully the new drivers can fix the power draw issues and maybe even improve the underwhelming performance a little.


                  Reviewers are concluding that AMD oversold the 7900 XTX's performance with its +50-70% claim. That may have been due in part to driver issues. Maybe it can pick up a few percent, but the first impression is the most important.

It's also said that the 7900 XT could make up the majority of stock, which sucks if true. AMD could have its own 4080 moment, with 7900 XTs gathering dust on store shelves.


                  • AdrianBc
                    Senior Member
                    • Nov 2015
                    • 297

                    #99
                    Originally posted by xfcemint View Post
                    What you are saying is incorrect. You have failed to specify the eye-to-monitor distance, therefore all your conclusions are ambiguous.

                    At 100 cm distance, 1080p @ 24 inch is fine. It is not perfect, but you really have to make an effort to see any significant difference.
                    At 50 cm - 70 cm distance, 4K @ 24 inch is the way to go for reading text. Many people have monitors at this distance (most desks are less than 80 cm deep). 4K @ 50 cm distance is exactly the same as 1080p @ 100 cm.

                    You are right that the viewing distance must be mentioned together with the monitor size, to be able to judge whether a resolution is good enough.

However, I did not mention it because I believe everyone here is familiar with typical viewing distances for a desktop monitor.

                    Exactly as you say, what I had in mind was a 50 cm - 70 cm distance, for a 27 inch or at least 24 inch monitor.

I have never seen anyone use a 100 cm distance when reading or working, as opposed to playing games or watching movies.


An equivalent way of expressing the resolution requirements, instead of giving the monitor size and the viewing distance, is to require that an entire A4 or letter-sized page be displayed on the monitor.

When that is the requirement, retreating to a 100 cm distance will not help you at all.

This is the actual criterion that I use for judging a monitor. A 1080p monitor provides only slightly more than 90 dpi when displaying an A4 page, which is unacceptably low.
                    Last edited by AdrianBc; 14 December 2022, 04:06 AM.
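The viewing-distance and dpi arithmetic in this and the preceding posts can be sanity-checked with a short script (the helper functions and figures below are my own illustration, not from any library):

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

def arcmin_per_pixel(ppi_val: float, distance_cm: float) -> float:
    """Angular size of one pixel, in arcminutes, at a given viewing distance."""
    pixel_cm = 2.54 / ppi_val          # physical pixel pitch in cm
    return math.degrees(math.atan2(pixel_cm, distance_cm)) * 60.0

# A 24" 1080p panel is roughly 92 PPI; the same size at 4K doubles that.
p1080 = ppi(1920, 1080, 24)
p4k = ppi(3840, 2160, 24)

# 1080p viewed at 100 cm subtends the same angle per pixel as 4K at 50 cm,
# which is the equivalence claimed in the quote above.
a1080 = arcmin_per_pixel(p1080, 100)
a4k = arcmin_per_pixel(p4k, 50)

# An A4 page (297 mm tall) shown full-height on a 1080p panel gets
# 1080 / (297 / 25.4) dpi, matching the "slightly more than 90 dpi" figure.
a4_dpi = 1080 / (297 / 25.4)
```

So both sides of the exchange check out numerically: the two setups are angularly identical, and the A4 criterion puts 1080p at about 92 dpi.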


                    • dimko
                      Senior Member
                      • Dec 2009
                      • 930

#100
Originally posted by sobrus View Post
                      I don't like nvidia and I have RX6800XT (superb card for linux, and rocm seems to be working somehow), but this is how I see it.
Let's say I agree with you. Nvidia DOES look forward and push the technology. But it does it SO FAST that it hurts its own customer base and developers. And make no mistake, they don't do it to advance technology, no no no, they do it to make themselves a monopoly, which they de facto are at this time.
I could compare it to cars. Imagine there are three general car manufacturers: A, B, and C. A is building a car that is going to fly soon, B is building a relatively good car but without some bells and whistles, and C is eating glue somewhere. IMHO customers and auto shops should adapt to B, as it's what benefits them best. Us gamers are fucking stupid... we go for A every time.

