AMD Introduces FidelityFX Super Resolution, NVIDIA Announces DLSS For Steam Play

  • #61
    Originally posted by artivision View Post
    Yes, reviews with stationary captured images. Here are some images: https://imgur.com/a/UASNzos
    I don't see where it says anything about DLSS, though.

    Also, you need some motion for TAA-based superresolution to work.
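    For context on why motion matters: TAA-based upscalers accumulate sub-pixel-jittered samples from previous frames, so a single stationary screenshot cannot show the technique converging. Here is a toy sketch of the accumulation idea (a 1-D signal with hypothetical helper names, not any vendor's actual algorithm):

```python
import random

def sample_low_res(signal, offset, factor):
    """Sample a high-res 1-D signal at low resolution, shifted by a sub-pixel offset."""
    return [signal[i * factor + offset] for i in range(len(signal) // factor)]

def accumulate(signal, frames, factor):
    """Jitter the sampling offset each frame and scatter the samples back into
    a high-res history buffer -- the core idea behind TAA-based upscaling."""
    recon = [None] * len(signal)
    for _ in range(frames):
        offset = random.randrange(factor)          # per-frame sub-pixel jitter
        for i, v in enumerate(sample_low_res(signal, offset, factor)):
            recon[i * factor + offset] = v         # history buffer keeps older samples
    return recon

random.seed(0)
truth = list(range(16))                            # "ground truth" high-res signal
recovered = accumulate(truth, frames=32, factor=2)
print(sum(v is not None for v in recovered))       # high-res positions recovered over time
```

    With enough jittered frames, every high-resolution position eventually receives a real sample, which is exactly the information a single static frame lacks.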



    • #62
      Originally posted by gardotd426 View Post
      What you're suggesting is not acceptable to many (maybe even most) people.
      No one wants to run Vega 56's or 64's.
      you speak in absolute terms... but, for example, my system has a Vega 64...

      by your logic that makes me "no one"... and if you read this forum you will find like 100 people who say they have a Vega 64 in their system right now. Are all these people "no one"?

      stop using absolute terms, and stop claiming that your personal opinion is everyone's opinion...

      it makes you sound foolish... see, I am very different from you: I would never claim that no one wants to run an Nvidia 1080...
      the 1080 is hot, loud, and inefficient; it does not even support raytracing and has no usable open-source driver...
      see? I would never say something like that, because I know hundreds of people in this forum alone run a 1080...

      Originally posted by gardotd426 View Post
      Those cards are hot, loud, inefficient, and at this point don't even qualify as midrange GPUs,
      really, I don't care if it is "hot": I have a workstation-grade computer case with massive airflow, and even if it hits 105C all the time I don't care.

      also, in my point of view it is not "loud": if I run a game, the game's sounds are louder than the card... so this does not count at all.

      really, "inefficient"? who fucking cares? I do not pay an electric bill; my family also owns multiple electric power plants, so we do not pay for electric energy at all. (believe it or not)

      "don't even qualify as midrange GPUs"

      right, today this card is not midrange, but back in 2017, at the moment I bought it, it was in fact better than midrange...


      Originally posted by gardotd426 View Post
      not to mention the fact that they don't support new tech like RT.
      of course you can run raytracing on this card... it just does not support "Nvidia RTX".
      for example, you can run Crysis Remastered with raytracing on a Vega 64 (without Nvidia RTX):
      https://www.pcgamer.com/crysis-remas...tion-textures/
      so to claim it cannot run raytracing is plain and simple a lie...


      Originally posted by gardotd426 View Post
      If you're someone who doesn't need a good gaming GPU, then that's fantastic for you. But what you're talking about is completely unacceptable to the majority of people that care about gaming.
      it is a good gaming GPU... and the price for a used one on eBay is higher than what I paid new in 2017: I bought it for 666€, and today you can sell it for 800€ on eBay...

      what you don't get: my 2 Threadripper systems cost 8000-9000€ in 2017, so it is not about saving money, as you may think.
      as soon as there is a "better card" I will buy one for sure... but a 5700 XT is on average only 10% faster than a Vega 64, and for the 6800 XT/6900 XT,
      as you know, the driver support is NOT at the same level as for the Vega 64...
      I will buy a 6800 XT as soon as the driver is finished and bug-free...
      I am in the game for a "good gaming GPU" with good open-source Linux drivers; Nvidia does not have that anyway, and the newish AMD cards do not have it yet either... so it is clear I will wait another 6 months before I buy a 6800 XT...

      Originally posted by gardotd426 View Post
      And to sacrifice all that, just to what, not buy Nvidia for some nonsense arbitrary reason, when Nvidia will give you full support on day one and not make you wait a year for your new GPU to work?
      believe it or not, I have had many Nvidia GPUs in my life. NEVER AGAIN... I remember a GeForce 9600 that broke my laptop because of faulty chip soldering....
      yes, right, I "sacrifice all that" and "wait a year" to not buy Nvidia, exactly....

      I am sick of Nvidia, really... NEVER AGAIN... (remember: a kernel update breaks the X session, and you need to upgrade/reinstall the Nvidia driver for every kernel not supported by it...)

      I run the newest kernels, even RC kernels... and Nvidia can't do this...

      Originally posted by gardotd426 View Post
      That's preposterous, and you have zero room to call anyone else irrational, because what you just said is one of the most irrational things I've ever heard when it comes to PC hardware - "I don't want to buy Nvidia because I don't like them despite the fact that Intel and AMD do the exact same shit, so I'm going to only buy old hardware, but what's more, I'm going to tell *everyone else* to only buy old hardware too, even if that isn't acceptable for their needs."
      irrational is a person who uses absolute terms like "No one wants to run Vega 56's or 64's" while talking to a person who runs a Vega 64, knowing there are 100 or more people in this forum who also run one...

      also irrational is a person who thinks his own opinion is everyone else's opinion...

      well, I see you irrationally do not want a "solution": buying 1-year-old AMD hardware instead of the latest and newest is the "solution" for people who do not want to buy Nvidia GPUs because, for example, they want to run the latest RC kernels....

      in your world Nvidia is "perfect"; I do not agree... so go and buy your Nvidia card and be happy.

      and I and many others will go and buy a 1-year-old AMD card, and we will be happy too.
      Phantom circuit Sequence Reducer Dyslexia



      • #63
        It seems like you really, really struggle with both reading comprehension and some sort of insecurities that make you incredibly defensive.

        Originally posted by Qaridarium View Post
        you speak in absolute terms... but, for example, my system has a Vega 64...

        by your logic that makes me "no one"... and if you read this forum you will find like 100 people who say they have a Vega 64 in their system right now. Are all these people "no one"?
        I don't think you know what "absolute terms" means (or, for that matter, "hyperbole"). Because....

        "What you're suggesting is not acceptable to many (maybe even most) people."

        You do realize that "many" is literally not an absolute term, right?


        really, I don't care if it is "hot": I have a workstation-grade computer case with massive airflow, and even if it hits 105C all the time I don't care.
        Well then that just means you don't take good care of your hardware, since running it at 105C all the time will kill it faster than just about anything, and is a really stupid thing to do.

        also, in my point of view it is not "loud": if I run a game, the game's sounds are louder than the card... so this does not count at all.
        Um, yes it does.

        really, "inefficient"? who fucking cares? I do not pay an electric bill; my family also owns multiple electric power plants, so we do not pay for electric energy at all. (believe it or not)
        Lmao so your response to how horrible the Vega 56/64 are at power efficiency is "well I don't pay for electric so who cares?" That's the dumbest thing I've ever heard. Electricity cannot be created for nothing, so whether you pay for it yourself or not, it's still using up too much energy and that has a negative impact. Jesus.

        of course you can run raytracing on this card... it just does not support "Nvidia RTX".
        for example, you can run Crysis Remastered with raytracing on a Vega 64 (without Nvidia RTX):
        https://www.pcgamer.com/crysis-remas...tion-textures/
        so to claim it cannot run raytracing is plain and simple a lie...
        No, you can't. You can use SOFTWARE ray tracing, which is not the same thing as hardware-accelerated ray tracing. Software ray tracing is irrelevant and is only available in Crysis Remastered. You can't run hardware ray-tracing whatsoever. And you never will be able to. It sounds like you don't comprehend what HW ray tracing actually is.
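        For readers following along: both paths trace the same rays; "hardware" ray tracing just runs ray-primitive intersection tests on dedicated units instead of shader cores or the CPU. A minimal CPU-side ray-sphere intersection, the primitive that RT hardware accelerates (a sketch for illustration, not any engine's actual code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance of a ray against a sphere,
    or None on a miss. Solves ||o + t*d - c||^2 = r^2 as a quadratic in t."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                         # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)    # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin straight down -z toward a unit sphere at z = -5
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # hits at t = 4.0
```

        Software ray tracing evaluates millions of tests like this on general-purpose cores; RT units do the same math in fixed-function silicon, which is where the performance gap comes from.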

        it is a good gaming GPU... and the price for a used one on eBay is higher than what I paid new in 2017: I bought it for 666€, and today you can sell it for 800€ on eBay...
        Are you serious? Or are you that delusional, or just dumb? EVERY GPU is going for 2X or more MSRP right now because of the global supply shortage. Actually, the fact that you can only get an extra 134€ proves my point even more. RX 580s are going for like 400-500 dollars, which is 4X more than they should cost. 3080s are going for 2-3X MSRP. Yet you could only get about 20% above MSRP for your GPU.

        what you don't get: my 2 Threadripper systems cost 8000-9000€ in 2017, so it is not about saving money, as you may think.
        No one said that. YOU were the only one that said it. Projecting much?

        I will buy a 6800 XT as soon as the driver is finished and bug-free...
        I am in the game for a "good gaming GPU" with good open-source Linux drivers; Nvidia does not have that anyway, and the newish AMD cards do not have it yet either... so it is clear I will wait another 6 months before I buy a 6800 XT...
        You're LITERALLY proving my point with this statement. So, thanks.

        believe it or not, I have had many Nvidia GPUs in my life. NEVER AGAIN... I remember a GeForce 9600 that broke my laptop because of faulty chip soldering....
        yes, right, I "sacrifice all that" and "wait a year" to not buy Nvidia, exactly....
        Oh my god. Dude, you're talking about a 13 YEAR OLD GPU. You do realize that AMD's Linux support in 2008 was universally known to be OBJECTIVELY HORRIBLE. Like, unusable. So no, you don't have any knowledge to speak on when it comes to comparing AMD vs Nvidia today. Meanwhile, I actually do, since I've run an RX 580, an RX 5600 XT, an RX 5700 XT, and an RTX 3090 in the past 2 years alone on Linux.

        I am sick of Nvidia, really... NEVER AGAIN... (remember: a kernel update breaks the X session, and you need to upgrade/reinstall the Nvidia driver for every kernel not supported by it...)
        Looks like someone doesn't know what a hook is, or really anything about what they're talking about. I've never had to do this.


        I run the newest kernels, even RC kernels... and Nvidia can't do this...
        LMAO you're just embarrassing yourself more and more with every sentence. I'm running 5.13-rc5 right now (note the neofetch screenshot, genius).

        Actually, I've run the latest rc kernel since the RTX 3090 launched, always without issue. Far fewer issues than when I was running RDNA 1.

        See, this is the problem with you people. You go around either blatantly lying or ignorantly spreading false information, and then you cause other people to ruin their experience, like when new users post on reddit and are like "I'm thinking of switching to Linux but I have an RTX 2060 and I heard Nvidia doesn't work with Linux, do I need to switch to AMD?" And then idiots like you tell them yes, then months later they post "I never should have listened, I've had nothing but problems with AMD, and so I went back to Nvidia and it's been smooth sailing." This happens every day.

        irrational is a person who uses absolute terms like "No one wants to run Vega 56's or 64's" while talking to a person who runs a Vega 64, knowing there are 100 or more people in this forum who also run one...
        LMAO, again with the failure at reading. I literally spoke in relative, non-absolute terms: "What you're suggesting is not acceptable to many (maybe even most) people."

        well, I see you irrationally do not want a "solution": buying 1-year-old AMD hardware instead of the latest and newest is the "solution" for people who do not want to buy Nvidia GPUs because, for example, they want to run the latest RC kernels....
        Again, with your ignorance. I RUN THE LATEST RC KERNELS, AND ALWAYS HAVE. ON NVIDIA. I'M LITERALLY DOING IT RIGHT NOW AS I TYPE. Stop making claims when you don't know what you're talking about.

        in your world Nvidia is "perfect"; I do not agree... so go and buy your Nvidia card and be happy.
        So, this is like the 10th lie you've told. I never once said Nvidia were perfect. Never once. I never even insinuated it. So stop lying, it makes you look like a scummy person, and you need all the help you can get considering how blatantly ignorant and misinformative you are.

        And I will "go buy" the best-performing GPU in my price range, regardless of whether it's AMD or Nvidia, which is what I always do, and what I always tell others to do when they ask which GPU they should get. Because I'm not some lunatic AMD cult member who is incapable of speaking without either lying or spreading stupid false information.




        • #64
          Originally posted by gardotd426 View Post
          See, this is the problem with you people. You go around either blatantly lying or ignorantly spreading false information, and then you cause other people to ruin their experience, like when new users post on reddit and are like "I'm thinking of switching to Linux but I have an RTX 2060 and I heard Nvidia doesn't work with Linux, do I need to switch to AMD?" And then idiots like you tell them yes
          So, if we take a step back and ask why this might be so, I can think of 2 principal reasons:
          1. AMD drivers & most of their userspace is open source, which is important to some (for both practical & ideological reasons).
          2. Because AMD is open source, there's a lot more visibility into their progress, which translates into more reporting on sites like this. That could foster a sense of a "relentless march of progress", where most of the reporting is about improvements and fixes. And when regressions are reported, it's usually in the context of getting fixed.
          I think what could help #2 is if this site would do some sort of periodic review of the stability and functionality of different families of AMD products on Linux. The primary way I have a sense of everything that's broken is by reading forum posts, which means there's an opportunity for Michael to better serve his readers.



          • #65
            Originally posted by coder View Post
            So, if we take a step back and ask why this might be so, I can think of 2 principal reasons:
            1. AMD drivers & most of their userspace is open source, which is important to some (for both practical & ideological reasons).
            2. Because AMD is open source, there's a lot more visibility into their progress, which translates into more reporting on sites like this. That could foster a sense of a "relentless march of progress", where most of the reporting is about improvements and fixes. And when regressions are reported, it's usually in the context of getting fixed.
            I think what could help #2 is if this site would do some sort of periodic review of the stability and functionality of different families of AMD products on Linux. The primary way I have a sense of everything that's broken is by reading forum posts, which means there's an opportunity for Michael to better serve his readers.
            Regarding point 1, I think that's most of it. There's also the fact that Nvidia has done some things that do go against that philosophy in the past.

            But the problem is, huge parts of the community have gone WAY farther than that. Sure, I've seen a few (and I mean "few") people say things like "well I prefer to go with AMD because I believe in FOSS and they opened up their kernel driver." But the majority of the AMD fans are legitimate cult members, who flat-out lie to people, and harbor straight-up vitriolic *hate* for Nvidia, and claim that Nvidia despises open source and has never done a single thing for the FOSS community. Even though Nvidia has more open source code than AMD has code, period. Nvidia has dozens of open-source projects, many of them are huge and in areas where Nvidia makes a TON of money (like machine learning). But many in the community only look at graphics drivers and decide that AMD is all for open-source (not true) and Nvidia hates open source (also not true).

            And that does the entire community a HUGE disservice. Because it keeps Nvidia from getting ANY proper credit when they do good things, and it makes AMD basically able to do ANYTHING without criticism. And they're both corporations, and both do awful shit.

            And then there's the hypocrisy. A lot of AMD cultists will mention Nvidia's forced segmentation (like their *FORMER* refusal to allow their GPUs to be used in VFIO) as one of the reasons they hate Nvidia so much, meanwhile AMD does *literally* the same thing with SR-IOV (among other things). It allows AMD to get away with basically anything, while Nvidia is always greeted with vitriol even when they try to help the Linux community. AMD is able to get away with releasing HORRIBLY buggy drivers that cause system-breaking bugs, reported by hundreds of users, every day, for years, and do nothing about it. But all the cultists let that slide. It's preposterous.

            And honestly, Michael has absolutely contributed to this on multiple occasions: constantly reporting on little things AMD's done as if they're the greatest thing ever, and never calling them out when they have crippling bugs that ruin numerous users' experience.

            And when someone like me and a few of the other people I know even try to reason with these people, we get called Nvidia shills (despite the fact that I've owned like 5-6 AMD GPUs including 2 RDNA 1 GPUs, and 6 Ryzen CPUs including 2 Zen 2 and 2 Zen 3 CPUs). The fanboyism and straight-up cult-like behavior has to stop; it does a huge disservice and it makes us worse off as a community.



            • #66
              Originally posted by gardotd426 View Post
              And that does the entire community a HUGE disservice. Because it keeps Nvidia from getting ANY proper credit when they do good things, and it makes AMD basically able to do ANYTHING without criticism. And they're both corporations, and both do awful shit.
              We're going to have to agree to disagree on this point. Neither is perfect, but that doesn't make them equal in terms of the degree to which they embrace open source and open standards.

              And you might be right that, in sheer lines of code, Nvidia released more. However, one has to look only at their crown jewels and ask whether those are open source. AMD also publishes all the low-level programming specifications of their GPUs, and it's one reason GCC now has a backend for GCN/CDNA.

              That said, my main issue isn't open source, at all. What I care about is open standards. Specifically, I prefer to avoid writing another line of CUDA code, if possible. Sadly, AMD took their eye off the ball and now it's mainly Intel who's carrying the torch for OpenCL.

              Originally posted by gardotd426 View Post
              The fanboyism and straight-up cult-like behavior has to stop, it does a huge disservice and it makes us worse off as a community.
              I hear you, on this point. I had to recommend my own employer use Nvidia GPUs, in spite of the fact that they cost us more, because AMD's software support simply wasn't anywhere close to where we needed it to be. I would never advise someone on a solution, without understanding their needs and priorities. And even while I hoped AMD would get their act together, I never shied away from acknowledging where Nvidia excelled.



              • #67

                Originally posted by gardotd426 View Post
                Well then that just means you don't take good care of your hardware, since running it at 105C all the time will kill it faster than just about anything, and is a really stupid thing to do.
                I bought 6 Vega 64s in 2017, and one died because of Ethereum mining... the other 5 are fine, and yes, they ran at 105C for years...

                Originally posted by gardotd426 View Post
                Lmao so your response to how horrible the Vega 56/64 are at power efficiency is "well I don't pay for electric so who cares?" That's the dumbest thing I've ever heard. Electricity cannot be created for nothing, so whether you pay for it yourself or not, it's still using up too much energy and that has a negative impact. Jesus.
                some people care, some don't... but it is hard to believe you care... a 3090 sucks energy like an erupting volcano, and the card is 2500€; if you can buy that, the electric bill is for sure not a valid problem for you


                Originally posted by gardotd426 View Post
                No, you can't. You can use SOFTWARE ray tracing, which is not the same thing as hardware-accelerated ray tracing. Software ray tracing is irrelevant and is only available in Crysis Remastered. You can't run hardware ray-tracing whatsoever. And you never will be able to. It sounds like you don't comprehend what HW ray tracing actually is.
                you have to admit that you never spoke about hardware-assisted (AMD 6900 XT) or hardware-accelerated (Nvidia 2000/3000) raytracing.
                you talked about RT, which means raytracing... and yes, software raytracing is in fact raytracing.
                and really, people don't care whether it is software raytracing or assisted raytracing or full hardware raytracing... the result is what they want.
                ok, to be fair to you: 1-2 games with software raytracing is not the market people hope for; on the RTX side it is like 20-30 games (on Windows), so most people would avoid software raytracing for that fact alone.

                Originally posted by gardotd426 View Post
                Are you serious? Or are you that delusional, or are you just dumb? EVERY GPU is going for 2X or more MSRP right now because of the global supply shortage. Actually the fact that you can only get an extra $134 dollars proves my point even more. RX 580s are going for like 400-500 dollars, which is 4X more than they should cost. 3080s are going for 2-3X MSRP. Yet you could only get like 15% above MSRP for your GPU.
                well, this supply shortage is really, really strange, and I do not care whether my card is 800€ or 2000€...
                the AMD card that really does this on eBay is the Radeon VII: it was like 700€ back at the release date, and today you get like 2000€ on eBay.
                what is true: used Nvidia cards tend to have higher prices on eBay... even a super old GeForce 8800 can run OpenCL/CUDA, so yes, if you buy a new card only to sell it on eBay years later, Nvidia is the way to go. but I don't do this: I use my hardware for many years until it dies or is nearly obsolete, and even after that I don't put it on eBay; instead I give it to poor friends who need an upgrade.

                Originally posted by gardotd426 View Post
                You're LITERALLY proving my point with this statement. So, thanks.
                so what's the problem? I don't buy an AMD 6800/6900 XT right now because there is no raytracing support in RADV and no ROCm support in the full open stack... and the 6000 series is 7 months old... it is really better to wait a year or longer if you want the full open-source driver experience without big downsides.

                Originally posted by gardotd426 View Post
                Oh my god. Dude you're talking about a 13 YEAR OLD GPU. You do realize that AMD's Linux support in 2008 is universally known to have been OBJECTIVELY HORRIBLE. Like, unusable. SO no, you don't have any knowledge to speak on when it comes to comparing AMD vs Nvidia today. Meanwhile I actually do, since I've ran an RX 580, an RX 5600 XT, an RX 5700 XT, and an RTX 3090 in the past 2 years alone on Linux.
                something really makes no sense reading your text here... a 3090 is 2500€/3000 dollars... a 580 was 250-300€, a 5600 XT was like 400€, a 5700 was like 450-500 dollars... if you buy AMD you save money; if you buy Nvidia you buy the most expensive (professional cards aside).

                it does not fit together logically... but yes, you can do whatever you like

                Originally posted by gardotd426 View Post
                See, this is the problem with you people. You go around either blatantly lying or ignorantly spreading false information, and then you cause other people to ruin their experience, like when new users post on reddit and are like "I'm thinking of switching to Linux but I have an RTX 2060 and I heard Nvidia doesn't work with Linux, do I need to switch to AMD?" And then idiots like you tell them yes, then months later they post "I never should have listened, I've had nothing but problems with AMD, and so I went back to Nvidia and it's been smooth sailing." This happens every day.
                so all the AMD hardware users are victims of some kind of mind control and propaganda, and the Nvidia people are woke and aware...
                I get the point, really.

                Originally posted by gardotd426 View Post
                Again, with your ignorance. I RUN THE LATEST RC KERNELS, AND ALWAYS HAVE. ON NVIDIA. I'M LITERALLY DOING IT RIGHT NOW AS I TYPE. Stop making claims when you don't know what you're talking about.
                well, fine. I ran Nvidia hardware for 15 years... and the latest and newest kernel was always a problem.
                but now they have magically fixed everything. fine, that's wonderful. but then why do other forum members report problems on that matter?


                Originally posted by gardotd426 View Post
                So, this is like the 10th lie you've told. I never once said Nvidia were perfect. Never once. I never even insinuated it. So stop lying, it makes you look like a scummy person, and you need all the help you can get considering how blatantly ignorant and misinformative you are.
                well, I am fine with that, and I am 100% sure that only hardware with fully open-source drivers can be "perfect".
                hardware with a closed-source driver cannot be perfect, no matter what you do.

                Originally posted by gardotd426 View Post
                And I will "go buy" the best-performing GPU in my price range, regardless of whether it's AMD or Nvidia, which is what I always do, and what I always tell others to do when they ask which GPU they should get. Because I'm not some lunatic AMD cult member who is incapable of speaking without either lying or spreading stupid false information.
                in your "price range":
                https://geizhals.de/nvidia-geforce-r...loc=at&hloc=de
                in Germany this means at minimum 2519,95€, which is like 3000 US dollars...
                isn't it more like you just buy the fastest GPU and don't care about the price at all? and you don't care about power consumption either, because the 6900 XT is faster in FPS per watt than a 3090...

                well, in hardware I like elitism (just buy the fastest), but to be honest you have bad "taste" in my point of view.

                well, I am not an AMD cult member; I am an open-source/FLOSS advocate.




                • #68
                  LMAO you're really embarrassing yourself.

                  Originally posted by Qaridarium View Post
                  some people care, some don't... but it is hard to believe you care... a 3090 sucks energy like an erupting volcano, and the card is 2500€; if you can buy that, the electric bill is for sure not a valid problem for you
                  The Vega 64 drew 300W. The 3090 draws 350. For more than double the performance. The 56 could draw up to 220W and even at 200W the 3090 is almost 3 times the performance. So my GPU is literally *twice* as efficient as yours. But yeah, good try.
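                  Taking those figures at face value (the wattages and the "more than double the performance" multiplier are the poster's numbers, not verified measurements; 2.3x is an assumed stand-in for "more than double"), the perf-per-watt claim is just a ratio:

```python
def perf_per_watt(relative_perf, watts):
    """Relative performance per watt; the units cancel in the ratio below."""
    return relative_perf / watts

vega64 = perf_per_watt(1.0, 300)   # baseline: Vega 64 at ~300 W (poster's figure)
rtx3090 = perf_per_watt(2.3, 350)  # ~2.3x the performance at ~350 W (assumed)
print(round(rtx3090 / vega64, 2))  # prints 1.97, i.e. roughly twice as efficient
```

                  Under those assumptions, the efficiency ratio comes out to about 2x, which matches the claim above; a different performance multiplier would shift it proportionally.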

                  well, in hardware I like elitism (just buy the fastest), but to be honest you have bad "taste" in my point of view.
                  Um, you say you like to "just buy the fastest," then talk shit at me for "buying the fastest."

                  isn't it more like you just buy the fastest GPU and don't care about the price at all? and you don't care about power consumption either, because the 6900 XT is faster in FPS per watt than a 3090...
                  Well, if you're able to grasp this, the difference in perf-per-watt between the 3090 and 6900 XT isn't enough to matter, and the 3090 actually greatly outperforms the 6900 XT on Linux (more so than on Windows), but that isn't even why I bought the 3090 instead of the 6900 XT. I knew well ahead of time the 6900 XT would pretty much equal the 3090 in performance. But I bought the 3090 because of my HORRIBLE experience with the 5600 XT and 5700 XT, and I was not willing to sacrifice anywhere between 6 months and 2 years after launch to have a usable GPU, which is what happened with RDNA 1. So I knew the 3090 and 6900 XT would be pretty close in performance, but with the 6900 XT there was a good chance I'd be waiting a year for it to be usable, but with the 3090 it would work completely on day one. So I went with the 3090. It seems like you don't comprehend things like "people actually want to be able to USE their GPUs."

                  something really makes no sense reading your text here... a 3090 is 2500€/3000 dollars... a 580 was 250-300€, a 5600 XT was like 400€, a 5700 was like 450-500 dollars... if you buy AMD you save money; if you buy Nvidia you buy the most expensive (professional cards aside).

                  it does not fit together logically... but yes, you can do whatever you like
                  See, yet another instance of you being either completely ignorant or blatantly misleading. I didn't pay $3000 for my 3090. I paid MSRP, $1619.99, for the EVGA AIB model I got. Not a penny over. Just like I paid MSRP for my 5800X ($449.99) and my 5900X ($549.99). So your entire statement is false, and is disregarded.

                  so what's the problem? I don't buy an AMD 6800/6900 XT right now because there is no raytracing support in RADV and no ROCm support in the full open stack... and the 6000 series is 7 months old... it is really better to wait a year or longer if you want the full open-source driver experience without big downsides.
                  You LITERALLY prove my point with that quote right there. Like, it's embarrassing. You're embarrassing yourself. You ask why I wouldn't buy a 6900 XT, when I already explained how awful AMD's Linux GPU drivers are at launch, and then you say the same damn thing! You can't make stupidity like this up.

                  in your "price range":
                  https://geizhals.de/nvidia-geforce-r...loc=at&hloc=de
                  in Germany this means at minimum 2519,95€, which is like 3000 US dollars...
                  No, I paid MSRP. $1619.99 for the EVGA XC3 Ultra at Micro Center on launch day. I don't buy from scalpers.

                  some people care, some don't... but it is hard to believe you care... a 3090 sucks energy like an erupting volcano, and the card is 2500€; if you can buy that, the electric bill is for sure not a valid problem for you
                  DUDE. My GPU is ***TWICE AS EFFICIENT AS YOURS***. You do realize that, right? The 3090 gets twice the performance per watt of the Vega 56. Even more than that for the Vega 64. So, please try to refrain from making such bafflingly objectively stupid remarks.

Well, fine. I ran Nvidia hardware for 15 years... and the latest, newest kernel was always a problem.
But now they have magically fixed everything. Fine. That's wonderful. But then why do other forum members report problems on exactly that matter?
                  It's called PEBKAC. I've already proven I'm running an RC kernel with my 3090, and I've run every single RC kernel since the 3090 launched on September 24th. It's not difficult, at all.

And really, people don't care whether it is software raytracing, assisted raytracing, or full hardware raytracing... the result is what they want.
                  This is objectively stupid and one of the dumbest things I've ever read. Seriously.

Well, I am fine with that, and I am 100% sure that only hardware with fully open-source drivers can be "perfect".
Hardware with a closed-source driver cannot be perfect, no matter what you do.
This is blatantly false, and even if it *were* true, it doesn't matter because AMD's open drivers are nowhere NEAR perfect, never have been, and never will be. They're mediocre, and not even better than their *very bad* Windows drivers.

                  See, your problem is that you are ignorant in the literal sense, you have all this nonsensical false information in your head, and when you combine that with dogma that you don't even comprehend, you end up making yourself out to be a clown.

                  Comment


                  • #69
                    Originally posted by gardotd426 View Post
                    LMAO you're really embarrassing yourself.
                    The Vega 64 drew 300W. The 3090 draws 350. For more than double the performance. The 56 could draw up to 220W and even at 200W the 3090 is almost 3 times the performance. So my GPU is literally *twice* as efficient as yours. But yeah, good try.
What try?... Really... my Vega 64 is from 2017 and built on 14nm... and your 3090 is from 2020 and built on 8nm...

                    "The 3090 draws 350. For more than double the performance."

This is not some "magic"; it is simply 8nm... and an architecture two generations newer...
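The perf-per-watt claim being argued over here is just arithmetic on the wattage figures quoted above (Vega 64 ≈ 300 W, 3090 ≈ 350 W, at "more than double the performance"). A minimal sketch; the relative-performance number for the 3090 is an illustrative assumption, not a measured benchmark:

```python
# Sanity-check the "twice as efficient" claim using the wattages quoted above.
# Relative performance is normalized to the Vega 64; the 3090's 2.3x figure
# is an assumption standing in for "more than double the performance".
cards = {
    "Vega 64 (2017, 14nm)": {"watts": 300, "rel_perf": 1.0},
    "RTX 3090 (2020, 8nm)": {"watts": 350, "rel_perf": 2.3},
}

for name, c in cards.items():
    print(f"{name}: {c['rel_perf'] / c['watts']:.5f} rel-perf per watt")

v64 = cards["Vega 64 (2017, 14nm)"]
r3090 = cards["RTX 3090 (2020, 8nm)"]
ratio = (r3090["rel_perf"] / r3090["watts"]) / (v64["rel_perf"] / v64["watts"])
print(f"3090 has {ratio:.2f}x the perf-per-watt of the Vega 64")
```

With these assumed numbers the ratio works out to roughly 1.97x, i.e. consistent with "about twice the efficiency", though both the watt and performance figures vary by workload and AIB model.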

                    Originally posted by gardotd426 View Post
                    Um, you say you like to "just buy the fastest," then try and talk shit for me "buying the fastest."
Well, you can buy the fastest and still be left with a bad taste from wasting your time on something that is not open source...
We already know you don't care about open source... so why use Linux? Install Windows and be happy.


                    Originally posted by gardotd426 View Post
                    Well, if you're able to grasp this, the difference in perf-per-watt between the 3090 and 6900 XT isn't enough to matter, and the 3090 actually greatly outperforms the 6900 XT on Linux (more so than on Windows), but that isn't even why I bought the 3090 instead of the 6900 XT. I knew well ahead of time the 6900 XT would pretty much equal the 3090 in performance. But I bought the 3090 because of my HORRIBLE experience with the 5600 XT and 5700 XT, and I was not willing to sacrifice anywhere between 6 months and 2 years after launch to have a usable GPU, which is what happened with RDNA 1. So I knew the 3090 and 6900 XT would be pretty close in performance, but with the 6900 XT there was a good chance I'd be waiting a year for it to be usable, but with the 3090 it would work completely on day one. So I went with the 3090. It seems like you don't comprehend things like "people actually want to be able to USE their GPUs."
Well, believe it or not, that's perfectly fine for you.


                    Originally posted by gardotd426 View Post
See, yet another instance of you just being either completely ignorant or blatantly misleading. I didn't pay $3000 for my 3090. I paid MSRP, $1619.99 for the EVGA AIB model I got. Not a penny over. Just like I paid MSRP for my 5800X ($449.99) and my 5900X ($549.99). So, your entire statement is false, and is disregarded.
Well, you are a lucky guy to have gotten the 3090 for only 1619 dollars...


                    Originally posted by gardotd426 View Post
                    You LITERALLY prove my point with that quote right there. Like, it's embarrassing. You're embarrassing yourself. You ask why I wouldn't buy a 6900 XT, when I already explained how awful AMD's Linux GPU drivers are at launch, and then you say the same damn thing! You can't make stupidity like this up.
I have no problem admitting the fact that the AMD driver is awful at launch...
So the only solutions are: wait a year after launch (which is what I do), or buy a 3090 (which is what you do).

                    Originally posted by gardotd426 View Post
                    No, I paid MSRP. $1619.99 for the EVGA XC3 Ultra at Micro Center on launch day. I don't buy from scalpers.
Again, it looks like you are a lucky person.

                    Originally posted by gardotd426 View Post
                    DUDE. My GPU is ***TWICE AS EFFICIENT AS YOURS***. You do realize that, right? The 3090 gets twice the performance per watt of the Vega 56. Even more than that for the Vega 64. So, please try to refrain from making such bafflingly objectively stupid remarks.
Yes, if you invent time travel and send your 3090 back to the year 2017, then true... but I bought my hardware in 2017, not 2021.
Also, your memory is not so good; I already told you: I do not pay a power bill, so your "TWICE AS EFFICIENT AS YOURS" argument
is relevant for you, but not for me...

                    Originally posted by gardotd426 View Post
                    It's called PEBKAC. I've already proven I'm running an RC kernel with my 3090, and I've run every single RC kernel since the 3090 launched on September 24th. It's not difficult, at all.
Right, it looks like you are "THE EXPERT" and I am only a forum troll who is trolling you. I really don't have any problem with that.

                    Originally posted by gardotd426 View Post
                    This is objectively stupid and one of the dumbest things I've ever read. Seriously.
Looks like you are the one gifted with intelligence and I am the stupid loser... I don't even have a problem with that...


                    Originally posted by gardotd426 View Post
This is blatantly false, and even if it *were* true, it doesn't matter because AMD's open drivers are nowhere NEAR perfect, never have been, and never will be. They're mediocre, and not even better than their *very bad* Windows drivers.
                    See, your problem is that you are ignorant in the literal sense, you have all this nonsensical false information in your head, and when you combine that with dogma that you don't even comprehend, you end up making yourself out to be a clown.
Really, I do not even have a problem with being a clown...
                    Phantom circuit Sequence Reducer Dyslexia

                    Comment
