AMD Introduces FidelityFX Super Resolution, NVIDIA Announces DLSS For Steam Play


  • #51
    Originally posted by artivision View Post
    DLSS? Isn't it the same as TAAU? Last time I checked, v2 was a subset of TAAU.
    We simply don't know enough about DLSS 2.0 to call it a subset of TAAU. It likely does enough that it should be considered more a superset.

    Anyway, I linked 5 different professional reviews of it. I really haven't seen any negative reviews of 2.0. If you want to be fair-minded, you should probably take a look.



    • #52
      Originally posted by coder View Post
      We simply don't know enough about DLSS 2.0 to call it a subset of TAAU. It likely does enough that it should be considered more a superset.

      Anyway, I linked 5 different professional reviews of it. I really haven't seen any negative reviews of 2.0. If you want to be fair-minded, you should probably take a look.

      Yes, reviews with stationary captured images. Here are some images: https://imgur.com/a/UASNzos



      • #53
        Originally posted by artivision View Post
        Yes, reviews with stationary captured images. Here are some images: https://imgur.com/a/UASNzos
        I don't see where it says anything about DLSS, though.

        Also, you need some motion for TAA-based super-resolution to work.
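        Since the thread keeps circling around what TAAU-style techniques actually do, here is a minimal toy sketch (my own illustration, not the real DLSS or TAAU algorithm) of why temporal super-resolution needs sub-pixel motion or camera jitter: each frame samples the scene at a slightly different offset, and accumulating those frames yields sample positions that no single low-res frame contains.

```python
# Toy 1-D model of temporal super-resolution (illustrative only).
# Each "frame" samples a continuous scene at pixel centers plus a
# per-frame sub-pixel jitter; accumulating jittered frames recovers
# detail no single low-res frame holds.

def scene(x):
    """'Ground truth' 1-D scene value at continuous coordinate x."""
    return x * x  # any smooth function works for the demo

def render_frame(num_pixels, jitter):
    """Render one low-res frame, sampling each pixel at center + jitter."""
    return [scene(i + jitter) for i in range(num_pixels)]

def accumulate(num_pixels, jitters):
    """Merge jittered frames into a denser set of known sample positions."""
    samples = []
    for j in jitters:
        for i, value in enumerate(render_frame(num_pixels, j)):
            samples.append((i + j, value))  # position is known per frame
    return sorted(samples)

# Two frames at half-pixel offsets give 2x the sample density:
recon = accumulate(4, [0.0, 0.5])
print(len(recon))  # -> 8 distinct sample positions from two 4-pixel frames
```

        With a single repeated offset (no jitter, no motion), the accumulated positions collapse back to the original four, which is exactly why a perfectly static, unjittered image gives a temporal upscaler nothing new to work with.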



        • #54
          It seems like you really, really struggle with both reading comprehension and some sort of insecurities that make you incredibly defensive.

          Originally posted by Qaridarium
          You speak in absolute terms... but, for example, my system has a Vega 64...

          This means, in your speaking, I am "no one"... Also, if you read here in the forum, you can find like 100 people who claim they have a Vega 64 in their system right now. Are all these people "no one"?
          I don't think you know what "absolute terms" means (or, for that matter, "hyperbole"). Because...

          "What you're suggesting is not acceptable to many (maybe even most) people."

          You do realize that "many" is literally not an absolute term, right?


          Really, I don't care if it is "hot". I have a workstation-grade computer case with massive airflow; even if it hit 105C all the time, I wouldn't care.
          Well, then that just means you don't take good care of your hardware, since running it at 105C all the time will kill it faster than just about anything, and it's a really stupid thing to do.

          Also, from my point of view, it is not "loud": if I run a game, the game sounds are louder than the card... so this does not count at all.
          Um, yes it does.

          Really, "inefficient", who fucking cares? I do not pay the electric bill; my family also has multiple electric power plants, so we do not pay for electric energy at all. (Believe it or not.)
          Lmao so your response to how horrible the Vega 56/64 are at power efficiency is "well I don't pay for electric so who cares?" That's the dumbest thing I've ever heard. Electricity cannot be created for nothing, so whether you pay for it yourself or not, it's still using up too much energy and that has a negative impact. Jesus.

          Of course you can run ray tracing on this card... it only does not support Nvidia RTX.
          For example, you can run Crysis Remastered with ray tracing on a Vega 64 (without Nvidia RTX):
          https://www.pcgamer.com/crysis-remas...tion-textures/
          So to claim it cannot run ray tracing is plain and simple a lie...
          No, you can't. You can use SOFTWARE ray tracing, which is not the same thing as hardware-accelerated ray tracing. Software ray tracing is irrelevant and is only available in Crysis Remastered. You can't run hardware ray tracing whatsoever, and you never will be able to. It sounds like you don't comprehend what HW ray tracing actually is.

          It is a good gaming GPU... and the price for a used one on eBay is higher than what I paid new in 2017: I bought it for 666€ and today you can sell it for 800€ on eBay...
          Are you serious? Or are you that delusional, or just dumb? EVERY GPU is going for 2x or more MSRP right now because of the global supply shortage. Actually, the fact that you can only get an extra 134€ proves my point even more. RX 580s are going for like 400-500 dollars, which is 4x more than they should cost. 3080s are going for 2-3x MSRP. Yet you could only get about 20% above what you paid for your GPU.

          What you don't get: my 2 Threadripper systems cost 8000-9000€ in 2017, so it is not about saving money, as you may think.
          No one said that. YOU were the only one that said it. Projecting much?

          I will buy a 6800 XT as soon as the driver is finished and bug-free...
          I am in the game for a "good gaming GPU" with good open-source Linux drivers. Nvidia does not have that anyway, and too-new AMD cards do not have it either... so it is clear I'll wait another 6 months until I buy a 6800 XT...
          You're LITERALLY proving my point with this statement. So, thanks.

          Believe it or not, I had many Nvidia GPUs in my life. NEVER AGAIN... I remember a GeForce 9600 that broke my laptop because of faulty chip soldering....
          Yes, right, I "sacrifice all that" and "wait a year" to not buy Nvidia, exactly....
          Oh my god. Dude, you're talking about a 13-YEAR-OLD GPU. You do realize that AMD's Linux support in 2008 is universally known to have been OBJECTIVELY HORRIBLE. Like, unusable. So no, you don't have any knowledge to speak on when it comes to comparing AMD vs Nvidia today. Meanwhile I actually do, since I've run an RX 580, an RX 5600 XT, an RX 5700 XT, and an RTX 3090 in the past 2 years alone on Linux.

          I am sick of Nvidia, really... NEVER AGAIN... (Remember: a kernel update breaks the X session, and you need to upgrade/reinstall the Nvidia driver for every kernel update not supported by the Nvidia driver...)
          Looks like someone doesn't know what a hook is, or really anything about what they're talking about. I've never had to do this.
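          For readers wondering what the "hook" remark refers to: most distributions solve the reinstall-on-every-kernel-update problem either with DKMS (which rebuilds out-of-tree modules automatically) or with a package-manager hook. A hypothetical Arch-style pacman hook sketching the idea follows; the file path, targets, and Exec command are illustrative, not taken from this thread.

```ini
# /etc/pacman.d/hooks/nvidia.hook  (illustrative Arch Linux hook)
# Regenerates the initramfs whenever the nvidia package or the kernel
# is installed or upgraded, so module and kernel never get out of sync.
[Trigger]
Operation = Install
Operation = Upgrade
Type = Package
Target = nvidia
Target = linux

[Action]
Description = Updating NVIDIA module in initcpio
Depends = mkinitcpio
When = PostTransaction
Exec = /usr/bin/mkinitcpio -P
```

          With a DKMS-packaged driver the same effect happens without any user-visible step: the module source is registered with DKMS and rebuilt for each newly installed kernel.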


          I run the newest kernel, even RC kernels... and Nvidia can't do this...
          LMAO you're just embarrassing yourself more and more with every sentence. I'm running 5.13-rc5 right now (note the neofetch screenshot, genius).

          Actually, I've run the latest rc kernel since the RTX 3090 launched, and always without issue. MUCH less issue than when I was running RDNA 1.

          See, this is the problem with you people. You go around either blatantly lying, or ignorantly spreading false information, and then you cause other people to ruin their experience, like when new users post on reddit and are like "I'm thinking of switching to Linux but I have an RTX 2060 and I heard Nvidia doesn't work with Linux, do I need to switch to AMD?" And then idiots like you tell them yes, then months later they post "I never should have listened, I've had nothing but problems with AMD and so I went back to Nvidia and it's been smooth sailing." This happens every day.

          Irrational is a person who uses absolute terms like "No one wants to run Vega 56s or 64s" while speaking with a person who runs a Vega 64, knowing there are like 100 or more people in this forum who also run a Vega 64...
          LMAO, again with the failure at reading. I literally spoke in relative, non-absolute terms: "What you're suggesting is not acceptable to many (maybe even most) people."

          Well, I see you irrationally do not want a "solution". Buying 1-year-old AMD hardware, not the latest and newest, is the "solution" for people who do not want to buy Nvidia GPUs because, for example, they want to run the latest RC kernels....
          Again, with your ignorance. I RUN THE LATEST RC KERNELS, AND ALWAYS HAVE. ON NVIDIA. I'M LITERALLY DOING IT RIGHT NOW AS I TYPE. Stop making claims when you don't know what you're talking about.

          In your world Nvidia is "perfect". I do not agree... so go and buy your Nvidia card and be happy.
          So, this is like the 10th lie you've told. I never once said Nvidia was perfect. Never once. I never even insinuated it. So stop lying; it makes you look like a scummy person, and you need all the help you can get considering how blatantly ignorant and misinformed you are.

          And I will "go buy" the best-performing GPU in my price range, regardless of whether it's AMD or Nvidia, which is what I always do, and what I always tell others to do when they ask which GPU they should get. Because I'm not some lunatic AMD cult member who is incapable of speaking without either lying or spreading stupid false information.




          • #55
            Originally posted by gardotd426 View Post
            See, this is the problem with you people. You go around either blatantly lying, or ignorantly spreading false information, and then you cause other people to ruin their experience, like when new users post on reddit and are like "I'm thinking of switching to Linux but I have an RTX 2060 and I heard Nvidia doesn't work with Linux, do I need to switch to AMD?" And then idiots like you tell them yes
            So, if we take a step back and ask why this might be so, I can think of 2 principal reasons:
            1. AMD drivers & most of their userspace is open source, which is important to some (for both practical & ideological reasons).
            2. Because AMD is open source, there's a lot more visibility into their progress, which translates into more reporting on sites like this. That could foster a sense of a "relentless march of progress", where most of the reporting is about improvements and fixes. And when regressions are reported, it's usually in the context of getting fixed.
            I think what could help #2 is if this site would do some sort of periodic review of the stability and functionality of different families of AMD products on Linux. The primary way I have a sense of everything that's broken is by reading forum posts, which means there's an opportunity for Michael to better serve his readers.



            • #56
              Originally posted by coder View Post
              So, if we take a step back and ask why this might be so, I can think of 2 principal reasons:
              1. AMD drivers & most of their userspace is open source, which is important to some (for both practical & ideological reasons).
              2. Because AMD is open source, there's a lot more visibility into their progress, which translates into more reporting on sites like this. That could foster a sense of a "relentless march of progress", where most of the reporting is about improvements and fixes. And when regressions are reported, it's usually in the context of getting fixed.
              I think what could help #2 is if this site would do some sort of periodic review of the stability and functionality of different families of AMD products on Linux. The primary way I have a sense of everything that's broken is by reading forum posts, which means there's an opportunity for Michael to better serve his readers.
              Regarding point 1, I think that's most of it. There's also the fact that Nvidia has done some things that do go against that philosophy in the past.

              But the problem is, huge parts of the community have gone WAY farther than that. Sure, I've seen a few (and I mean "few") people say things like "well I prefer to go with AMD because I believe in FOSS and they opened up their kernel driver." But the majority of the AMD fans are legitimate cult members, who flat-out lie to people, and harbor straight-up vitriolic *hate* for Nvidia, and claim that Nvidia despises open source and has never done a single thing for the FOSS community. Even though Nvidia has more open source code than AMD has code, period. Nvidia has dozens of open-source projects, many of them are huge and in areas where Nvidia makes a TON of money (like machine learning). But many in the community only look at graphics drivers and decide that AMD is all for open-source (not true) and Nvidia hates open source (also not true).

              And that does the entire community a HUGE disservice. Because it keeps Nvidia from getting ANY proper credit when they do good things, and it makes AMD basically able to do ANYTHING without criticism. And they're both corporations, and both do awful shit.

              And then there's the hypocrisy. A lot of AMD cultists will mention Nvidia's forced segmentation (like their *FORMER* refusal to allow their GPUs to be used in VFIO) as one of the reasons they hate Nvidia so much, meanwhile AMD does *literally* the same thing with SR-IOV (among other things). It allows AMD to get away with basically anything, while Nvidia is always greeted with vitriol even when they try to help the Linux community. AMD is able to get away with releasing HORRIBLY buggy drivers that cause system-breaking bugs, reported by hundreds of users, every day, for years, and do nothing about it. But all the cultists let that slide. It's preposterous.

              And honestly, Michael has absolutely contributed to this on multiple occasions: constantly reporting on little things AMD's done as if they're the greatest thing ever, and never calling them out when they have crippling bugs that ruin numerous users' experience.

              And when someone like me and a few of the other people I know even try to reason with these people, we get called Nvidia shills (despite the fact that I've owned like 5-6 AMD GPUs including 2 RDNA 1 GPUs, and 6 Ryzen CPUs including 2 Zen 2 and 2 Zen 3 CPUs). The fanboyism and straight-up cult-like behavior has to stop; it does a huge disservice and it makes us worse off as a community.



              • #57
                Originally posted by gardotd426 View Post
                And that does the entire community a HUGE disservice. Because it keeps Nvidia from getting ANY proper credit when they do good things, and it makes AMD basically able to do ANYTHING without criticism. And they're both corporations, and both do awful shit.
                We're going to have to agree to disagree on this point. Neither is perfect, but that doesn't make them equal in terms of the degree to which they embrace open source and open standards.

                And you might be right that, in sheer lines of code, Nvidia released more. However, one has to look only at their crown jewels and ask whether those are open source. AMD also publishes all the low-level programming specifications of their GPUs, and it's one reason GCC now has a backend for GCN/CDNA.

                That said, my main issue isn't open source, at all. What I care about is open standards. Specifically, I prefer to avoid writing another line of CUDA code, if possible. Sadly, AMD took their eye off the ball and now it's mainly Intel who's carrying the torch for OpenCL.

                Originally posted by gardotd426 View Post
                The fanboyism and straight-up cult-like behavior has to stop, it does a huge disservice and it makes us worse off as a community.
                I hear you, on this point. I had to recommend my own employer use Nvidia GPUs, in spite of the fact that they cost us more, because AMD's software support simply wasn't anywhere close to where we needed it to be. I would never advise someone on a solution, without understanding their needs and priorities. And even while I hoped AMD would get their act together, I never shied away from acknowledging where Nvidia excelled.



                • #58
                  LMAO you're really embarrassing yourself.

                  Originally posted by Qaridarium
                  Some care, some other people don't care... but it is hard to believe you care... a 3090 is sucking energy like a burning volcano, and the card is 2500€; if you can buy this, the electric bill is for sure not a valid problem for you.
                  The Vega 64 drew 300W. The 3090 draws 350W, for more than double the performance. The Vega 56 could draw up to 220W, and even against that, the 3090 delivers almost 3 times the performance. So my GPU is literally *twice* as efficient as yours. But yeah, good try.
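                  For what it's worth, the efficiency claim can be checked with the power and performance figures both sides are citing (the numbers below are the thread's, not measurements): at exactly 2x the performance, the 3090 comes out about 1.7x more efficient than a Vega 64, and it only reaches a full 2x if it delivers about 2.33x the performance.

```python
# Perf-per-watt check using the figures cited in the thread (not measured).
# Vega 64 board power ~300 W is the baseline; the post claims the 3090
# draws ~350 W for "more than double the performance".
vega64_power, vega64_perf = 300.0, 1.0
rtx3090_power, rtx3090_perf = 350.0, 2.0   # assume exactly 2x performance here

eff_vega = vega64_perf / vega64_power      # relative FPS per watt
eff_3090 = rtx3090_perf / rtx3090_power
print(round(eff_3090 / eff_vega, 2))       # -> 1.71 under these assumptions

# Performance multiple the 3090 would need to be exactly 2x as efficient:
print(round(2 * rtx3090_power / vega64_power, 2))  # -> 2.33
```

                  So "twice as efficient" holds only under the stronger reading of "more than double"; the same arithmetic applies to the Vega 56 figures.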

                  Well, in hardware I like elitism (just buy the fastest), but to be honest you have bad "taste", in my point of view.
                  Um, you say you like to "just buy the fastest," then talk shit about me for "buying the fastest."

                  Isn't it more like you just buy the fastest "GPU" and don't care about the price at all? And you don't care about power consumption either, because the 6900 XT is faster in FPS per watt compared to a 3090...
                  Well, if you're able to grasp this: the difference in perf-per-watt between the 3090 and 6900 XT isn't enough to matter, and the 3090 actually greatly outperforms the 6900 XT on Linux (more so than on Windows). But that isn't even why I bought the 3090 instead of the 6900 XT. I knew well ahead of time the 6900 XT would pretty much equal the 3090 in performance. I bought the 3090 because of my HORRIBLE experience with the 5600 XT and 5700 XT: I was not willing to wait anywhere between 6 months and 2 years after launch to have a usable GPU, which is what happened with RDNA 1. With the 6900 XT there was a good chance I'd be waiting a year for it to be usable, while the 3090 would work completely on day one. So I went with the 3090. It seems like you don't comprehend that people actually want to be able to USE their GPUs.

                  Something really makes no sense reading your text here... a 3090 is 2500€/3000 dollars... a 580 was 250-300€, a 5600 XT was like 400€, a 5700 was like 450-500 dollars... if you buy AMD you save money; if you buy Nvidia you buy the most expensive (professional cards aside).

                  It does not fit together logically... but yes, you can do whatever you like.
                  See, yet another instance of you being either completely ignorant or blatantly misleading. I didn't pay $3000 for my 3090. I paid MSRP, $1619.99, for the EVGA AIB model I got. Not a penny over. Just like I paid MSRP for my 5800X ($449.99) and my 5900X ($549.99). So your entire statement is false, and is disregarded.

                  So what's the problem? I don't buy an AMD 6800/6900 XT right now because there is no ray-tracing support in RADV and no ROCm support in the full open stack... and the 6000 series is 7 months old... it is really better to wait 1 year or longer if you want the full open-source driver experience without big downfalls.
                  You LITERALLY prove my point with that quote right there. Like, it's embarrassing. You're embarrassing yourself. You ask why I wouldn't buy a 6900 XT, when I already explained how awful AMD's Linux GPU drivers are at launch, and then you say the same damn thing! You can't make stupidity like this up.

                  In your "price range":
                  https://geizhals.de/nvidia-geforce-r...loc=at&hloc=de
                  In Germany this means at minimum "2519,95€", which is like 3000 US dollars...
                  No, I paid MSRP: $1619.99 for the EVGA XC3 Ultra at Micro Center on launch day. I don't buy from scalpers.

                  Some care, some other people don't care... but it is hard to believe you care... a 3090 is sucking energy like a burning volcano, and the card is 2500€; if you can buy this, the electric bill is for sure not a valid problem for you.
                  DUDE. My GPU is ***TWICE AS EFFICIENT AS YOURS***. You do realize that, right? The 3090 gets twice the performance per watt of the Vega 56. Even more than that for the Vega 64. So, please try to refrain from making such bafflingly objectively stupid remarks.

                  Well, fine. I ran Nvidia hardware for 15 years... and the latest and newest kernel was always a problem.
                  But now they magically fixed everything. Fine. That's wonderful. But then why do other forum members report problems on that matter?
                  It's called PEBKAC. I've already proven I'm running an RC kernel with my 3090, and I've run every single RC kernel since the 3090 launched on September 24th. It's not difficult, at all.

                  And really, people don't care if it is software ray tracing, assisted ray tracing, or full hardware ray tracing... the result is what they want.
                  This is objectively stupid and one of the dumbest things I've ever read. Seriously.

                  Well, I am fine with that, and I am 100% sure that only hardware with fully open-source drivers can be "perfect".
                  Hardware with a closed-source driver cannot be perfect, no matter what you do.
                  This is blatantly false, and even if it *were* true, it wouldn't matter, because AMD's open drivers are nowhere NEAR perfect, never have been, and never will be. They're mediocre, and not even better than their *very bad* Windows drivers.

                  See, your problem is that you are ignorant in the literal sense, you have all this nonsensical false information in your head, and when you combine that with dogma that you don't even comprehend, you end up making yourself out to be a clown.

