
AMD's RX7900 XT*, I am not impressed


  • AMD's RX7900 XT*, I am not impressed

    What I expected:
    1. The 7900 XTX to be a competitor for the overpriced 4090, at least in rasterization performance.
    2. To push other GPU prices down, especially considering that the chiplet design obliterated the CPU market.
    3. The 7900 XTX to be below $1000. While on paper it is, in practice it won't be.
    4. The 7900 XT to be a good competitor to the 7900 XTX.

    All four of those have failed.
    [attached image]

    1. It's not. The 4090 beats the 7900 in EVERY aspect, with the exception of 4K high-refresh-rate monitor support. And even then, it's questionable.
    2. AMD has put itself into a price niche where nothing changes, or changes only marginally.
    3. Self-evident.
    4. Performance tests across the board show that the XT version is a BAD DEAL.

    To be honest, I can't wait to see Nvidia drop the price of the overstocked 4080 by $100; it will f*** AMD with no lube and sandpaper attached to a di**.
    I was considering buying new cards, but fuck no, I am not doing it now.
    Last edited by dimko; 14 December 2022, 06:43 AM.

  • #2
    Fine, I'll play ball.
    What I expected:
    1. The 7900 XTX to be a competitor for the overpriced 4090, at least in rasterization performance.
    2. To push other GPU prices down, especially considering that the chiplet design obliterated the CPU market.
    3. The 7900 XTX to be below $1000. While on paper it is, in practice it won't be.
    4. The 7900 XT to be a good competitor to the 7900 XTX.
    1/ No. AMD's game was clearly to drop the fight there. From day 1 of the announcement, they said it's a 4080 competitor, not a 4090. They didn't aim to beat the 4090 at all in any field.

    2/ True, but it's a different story. Chiplets didn't "obliterate" CPUs; Intel sat on its fat ass for 5 years giving tiny improvements. AMD didn't catch up to Intel so much as Intel let itself get politely walked all over by Mommy Lisa. Chiplets are a practicality, especially at scale: they provide better yields and thus better margins. They do not necessarily provide stronger performance or easier construction, quite the contrary actually. At least for CPUs everything was already multi-core, but for GPUs it's far worse. The real feat in RDNA 3 for now is the work on the Infinity Bus, aka the thing that makes MCDs a possibility without wrecking performance.

    3/ I expected $1000. I expected BETTER performance, though.

    4/ Honestly, the 7900 XT is shit. The math is simple enough for a 5th grader: 900 / 1000 = 90% of the price, while 5/6 MCDs and 80/96 of the GCD = 83.33% of the compute (quick sketch below). If the card were $850, an argument could be made that it's lesser but roughly as good. For $900? What a joke! It's literally the 4080 but in red: a lesser card, but with much worse value than the top of the line!
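
    For the math-inclined, here is a minimal sketch of that value comparison in Python, assuming performance scales roughly with active compute units (it won't exactly, but it makes the point):

    ```python
    # Quick value check for the 7900 XT vs the 7900 XTX, using the figures above.
    # Assumption: performance scales roughly with active compute (80/96 CUs, 5/6 MCDs).
    xt_price, xtx_price = 900, 1000          # rounded launch prices in USD
    compute_ratio = 80 / 96                  # ~83.3% of the XTX's shader hardware
    price_ratio = xt_price / xtx_price       # 90% of the XTX's price

    print(f"XT gives ~{compute_ratio:.1%} of the compute for {price_ratio:.0%} of the price")
    print(f"relative perf-per-dollar vs the XTX: {compute_ratio / price_ratio:.2f}")  # ~0.93
    ```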



    • #3
      Now here's the real take besides your points:

      I think that AMD DID aim for a card that would handily beat the 4080. The XTX aimed to land roughly in between the 4080 and the 4090 in terms of power.
      This is corroborated by a few things of various repute:
      • AMD wouldn't go so brazenly advertising 54% better power efficiency in front of everyone only to provide a median of 35%. They generally have a much better reputation for not lying compared to Intel or Nvidia, but even besides any kind of reputation, it's just a dumb as hell move. Why advertise something so openly and with such a precise number that you can't deliver on? I believe they thought they WOULD deliver on it.
      • AMD seemed to have no clue that the card would fall this low. I mean, maybe they are utterly incompetent, but when you sell a card and it gets THIS badly received (both of them), you're either asleep at the wheel or didn't see it coming. Considering the amount of time and effort put into chiplets and porting it to GPUs, I expect they weren't asleep at the wheel at all. So my guess is that they truly expected higher perfs from the XTX.
      • AMD has also essentially ported RDNA 2 to 5nm, chipletized it, and improved on it here and there. While the minor improvements could be the cause of the poor results, I'd wager that's unlikely. So is it the chipletization? Well, it's not impossible, but they put the man who handled chiplets for Zen for years to work on it. I'm fairly sure that if chiplets truly disappointed that much, there would have been some red flags along the way that would have been noticed. And since it's all based on RDNA 2, was RDNA 2 ultimately not able to scale to this level of power? And this is where it gets interesting, because we have one fascinating result to look at, HW Unboxed's CoD MW2 results:



      • #4
        How do you add images to this antiquated forum?



        • #5
          Originally posted by Mahboi
          Fine, I'll play ball.


          1/ No. AMD's game was clearly to drop the fight there. From day 1 of the announcement, they said it's a 4080 competitor, not a 4090. They didn't aim to beat the 4090 at all in any field.
          Now I am not sure whether I meant to put 4090 or 4080, but the problem with either is this:
          The Radeon RX 7900 XTX is a pretty good GPU, at least relative to its GeForce competitor, but whether or not it's worth $1,000 will depend on...

          TL;DR: the XTX competes with the 4080 when it comes to rasterization! Minus the ray tracing. Since there are stocks of the 4080 out there, there is a good chance Nvidia will drop its price by $100. And when that happens, the MSRPs of the XTX and the 4080 will be within ±$50 of each other! So AMD will have their ass delivered to them.

          And at this stage: I bought underperforming CPUs from AMD for over a decade, until recently. I am tired of being a sponsor to a multi-billion-dollar company. At this point I don't mind if they die in the GPU market. (Sent from a PC with a 5700 XT in it.)



          • #6
            In any case, look at that, the XTX beating the 4090 quite nicely.

            So if it is an RDNA 2 problem, why can it scale so well?

            My suspicion is that it's not the base, the chiplets, or the minor improvements, but the same problem that AMD has always had: their FineWine drivers.
            Of course I have no proof, so don't go full nerd on me for this, nobody has proof, probably not even AMD themselves.

            But there are a lot of clues. Why is a 61 TFLOPS card sometimes capable of 50% more performance than a 20 TFLOPS card of the same base architecture (the 6950 XT), and other times barely 20% better? Sometimes barely 15% better? Why can it somehow overpower the 4090 in one benchmark and show great prowess in lots of production workloads?
            Is there a serious fault on the card? Possibly, but considering the effort put into porting RDNA to chiplets, I want to believe they would have caught a serious performance defect.
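
            To put numbers on that gap, here's a rough sketch using the paper TFLOPS figures quoted above (purely illustrative):

            ```python
            # How much of the on-paper scaling actually shows up, for the gains quoted above.
            # Assumption: ~61 TFLOPS for the XTX vs ~20 TFLOPS for the 6950 XT, as in the post.
            xtx_tflops, rx6950_tflops = 61.0, 20.0
            paper_scaling = xtx_tflops / rx6950_tflops            # ~3.05x on paper

            for observed_gain in (0.15, 0.20, 0.50):              # the 15% / 20% / 50% cases
                realized = (1 + observed_gain) / paper_scaling    # fraction of paper throughput seen
                print(f"+{observed_gain:.0%} observed -> {realized:.0%} of the paper scaling realized")
            ```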

            Another thing that makes sense with this train of thought is the latest MLID video. Now I know, people poop on him all the time for being a leaker and not owning up when he's mistaken, fine. But I don't listen to leakers to hear leaks, I listen to them to hear the thread of logic that they weave with their narrative. If it makes sense, if the story they're selling is logical enough, I listen to them.

            MLID's last vid seriously encourages my train of thought, because there were three elements that were brought up:
            1. The results surprised AMD as much as any of us.
            2. Apparently, RDNA 3 does not suffer any real defects at the HW level. People have also managed to OC it to a 3.7 GHz front end, so there's definitely credence to give to all these rumours about RDNA 3 being 3 GHz capable and that it would eat Lovelace whole.
            3. There's apparently "a lot of little problems", which means nothing, but it's accompanied by another leak: the Radeon Drivers team has been summoned by the top brass to work over the winter holidays. Which would again indicate that from top to bottom, nobody in AMD expected RDNA 3 to disappoint so much.



            • #7
              Thus, I see the current situation as an obviously botched launch. AMD f'd up quite seriously: their card underperforms severely compared to promises, they look like a bunch of liars in front of the world, and they have seriously damaged their reputation with a lot of reviewers and others.

              Nonetheless, a real question remains. Are the cards really much poorer than expected, or is it some FineWine Drivers bs that we're looking at?

              If the cards have more severe deficiencies and the benchmarks we saw will not change in the next 6-12 months, I agree that the XTX is woefully overpriced; although not as bad as the 4080, it still doesn't deserve a $1000 price tag when it does between 15% and 50% better than a 6950 XT. As the Nvidia mouthpiece known as Digital Foundry put it, the XTX is only "good" if you compare it with the 4080, which is one of the most terrible values in all graphics history.

              However, if the problem is drivers, and the grapevine does talk about driver teams being sent back to work with a kick in the butt from the top brass, then we could see the XTX seriously gain performance until it goes from being 0-10% better than the 4080 to 10-20% better. THAT would make the card seriously more valuable, because then a $1000 card would sit roughly 15% above a $1200 card and 15% below a $1600 card (well, I say this, but prices in France are already several hundred above that for all cards).
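
              As a rough perf-per-dollar illustration of that scenario (assumed positions, not benchmark data):

              ```python
              # Hypothetical perf-per-dollar under the driver-fix scenario described above.
              # Raster performance is normalized to the 4080 = 1.00; all positions are assumptions.
              cards = {
                  "RTX 4080":    {"price": 1200, "perf": 1.00},
                  "RX 7900 XTX": {"price": 1000, "perf": 1.15},   # ~15% above the 4080 (assumed)
                  "RTX 4090":    {"price": 1600, "perf": 1.32},   # ~15% above the XTX (assumed)
              }

              for name, c in cards.items():
                  print(f"{name}: {c['perf'] / c['price'] * 1000:.2f} perf per $1000")
              # -> under these assumptions the XTX leads both Nvidia cards on value by a wide margin
              ```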

              I obviously have no idea if it's only drivers or not. And I wonder how useful the holiday work will be; if they missed problems this deep, it'll likely take months, not a few days and some brilliant goggled man going EUREKA and unlocking a hidden 10% in the card for almost all games.



              • #8
                Originally posted by Mahboi
                In any case, look at that, the XTX beating the 4090 quite nicely.
                What do you mean? My last screenshot shows that in rasterization the 4090 and the XTX are neck and neck.

                Originally posted by Mahboi
                1. Apparently, RDNA 3 does not suffer any real defects on HW level. People have also managed to OC it to 3.7 GHz frontend, so there's definitely credence to give to all these rumours about RDNA 3 being 3Ghz capable and that it would eat Lovelace whole.
                Not everyone will do overclocking. The majority probably won't. And by the time the warranty has expired, an extra 5% won't do much. Remember, extreme overclocking is not something that even overclockers usually do.



                • #9
                  So you see, my assessment is very different from the current public mood. People feel betrayed, like they were fed a $700 card for $1000. It's understandable, because the XTX is clearly underperforming compared to promises, and even compared to Nvidia it's only a little better, while Nvidia cranked its prices up to insane levels.

                  But the question is "is the XTX really the XTX that AMD tried to make and sell?" and I think the answer is no.

                  I want to believe that the XTX will seriously improve through driver updates in the coming months, and shine far better than it has so far. Of course it's wishful thinking; it could just be a hardware problem and it'll stay a black mark on AMD's history, but I don't think it's a scam. I don't think AMD wanted to sell us a $700-800 card for $1000. I don't think they tried to overhype and then screw us over. And I don't think that the card is "barely better than a 4080". If it were, it wouldn't even get close to the 4090 anywhere, let alone beat it. And it so very clearly beats it in at least one benchmark. I really think that AMD aimed at selling a card that Nvidia would sell for $1400, something in between the 4080 and the 4090, for $1000. I think the price is right.

                  Many other things also reinforce that thought: the OC capability (jeebus, 3.7 GHz!), the AIB models with seriously higher power draw, the roughly 200,000 cards expected to be available within the next months. AMD does not intend to scam us with an overpriced card; they really expected to sell a monster for a high price, not a meh-good card for $1000.

                  So I'm hoping. I'm smelling my hopium that within the next 6 months, AMD will proudly present their apologies for this botched launch and ridiculous performance, in the form of shiny new drivers that give a 10+% increase in performance. Currently the 7900 XTX is around 35% better than the 6950 XT. If it reaches even just 45% or 50% better, AMD will make good on its promises and deliver a true high-power card that'll sit pretty between Nvidia's top cards while being considerably cheaper. If the 6800 XT could proudly be the card that trades blow for blow with the 3080 for the same price, the XTX wants to be the card that eats the 4080 for a much lower price.

                  Also, to put one genuinely bad point up: the XT is a turd and should lose at least $50 to be even remotely competitive. I heard somewhere that the XTs were just XTXs that had a defect: AMD shuts down the sixth of the card where the defect is, fits a dummy MCD, and sells the thing while salvaging a lot of yield. Made sense to me. It even explained the pricing: $900 is bad, but that way you don't create demand for a card that you don't want to make, you just sell off your defective XTXs. And then I learned that there were actually more XTs for sale than XTXs on launch day, which proved that entire theory wrong. Wtf AMD, this card is bad; lower the price or shove it.



                  • #10
                    Also one thing, the RT part is both worrying and not worrying.

                    It's worrying because AMD is now squarely one gen behind Nvidia. The XTX never really offers more than a 3090 Ti does. Considering that neither Nvidia nor AMD seems able to make jumps larger than 1.7-1.8x per gen, we might see Nvidia stay ahead in RT for the entire decade, which doesn't matter much now, but will as time goes on.

                    It's also not worrying because if RT really becomes their Achilles' heel, they might put a bit more resources towards RT rather than raster. Also, Nvidia will eventually have to fix their pricing situation. Whether they go chiplet or not is unknown, but if they don't, considering the price of 5nm/4x, I dread the prices Nvidia will present to customers for the 5000 series on 3nm. Prices will have to come down eventually, especially with AMD sitting pretty in the chiplet bath.
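
                    A tiny sketch of why a one-gen RT deficit can persist, assuming both vendors keep scaling RT at the same ~1.75x per generation (illustrative assumptions only):

                    ```python
                    # If AMD starts one RT generation (~1.75x) behind and both vendors scale RT
                    # by ~1.75x every generation, the relative gap never closes on its own.
                    amd_rt, nvidia_rt = 1.0, 1.75     # normalized RT throughput today (assumed)
                    per_gen = 1.75                    # assumed per-generation RT uplift for both

                    for gen in range(1, 5):           # roughly the rest of the decade
                        amd_rt *= per_gen
                        nvidia_rt *= per_gen
                        print(f"gen +{gen}: Nvidia ahead by {nvidia_rt / amd_rt - 1:.0%}")
                    # -> the lead stays at 75% every generation unless AMD out-scales Nvidia per gen
                    ```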

