Former Nouveau Lead Developer Joins NVIDIA, Continues Working On Open-Source Driver


  • bhptitotoss
    replied
    Originally posted by Panix View Post
    Huh? You're so full of manure...lol.... how is the power consumption better than a 4070 Ti Super? What are you looking at?

    No one in their right mind would try to argue a 7900 series even the 7900 GRE has better or more efficient power consumption than a 4070 Ti Super - maybe you meant the 4070? It's about even with that card on his graph.

    Even most AMD fans will concede that the 4070 series is more power efficient than any AMD 7900 RDNA 3 card.

    AMD GPUs still suck in Blender and only use HIP - which is slower than the hacked ZLUDA - and the 7900 XTX is slower at rendering than the 4070 series - whether it's the 12GB vanilla card or a 4070 Super also at 12GB of VRAM or the 16GB 4070 Ti Super - yes, those cards are able to use OptiX so a bit unfair but then it's AMD and Blender's fault - one or both of them - that haven't been able to use the OPEN SOURCE ray tracing element of HIP-RT.

    So, stop smoking all those drugs you have.
    I have a question...

    How can someone be full of manure? Sorry, English isn't my first language, so I don't understand.



  • tenchrio
    replied
    Originally posted by Panix View Post
    Huh? You're so full of manure...lol.... how is the power consumption better than a 4070 Ti Super? What are you looking at?
    [attached image: power consumption graph]
    7900 GRE      | Min:  6.00 | Avg: 200.15 | Max: 240.00
    4070 Ti Super | Min: 10.47 | Avg: 228.60 | Max: 284.93


    Now I might not be a mathematician, but I think 228 is a bigger number than 200, and 284 is a bigger number than 240.
    Seems my previous message had a typo and was meant to say "better power consumption and performance per dollar". Normally I would say my bad, but commenting to you is such a draining chore, as you seem to barely read even the articles you link yourself, and honestly writing these replies feels like a waste of time.

    The 7900 GRE is rated for 260W TDP, the 4070 Ti Super for 285W TDP. But I guess that would require explaining TDP to you.
    Again, you commented on that article; the least you can do is actually read the articles you comment on. Oh wait, you don't really do that, do you? Still evident as you point towards 60Hz V-sync and power spikes without understanding that power spikes are not an actual benchmark of any sort, and in that case specifically came from running FurMark, as is evident from the graph literally at the top of the page of the TechPowerUp article you linked. During gaming and video playback that spike wasn't there at all. And of course V-sync is outdated, not to mention optional and generally looked down upon; if you need to enable 60Hz V-sync, you don't need a new GPU, you need a new monitor.
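
    For anyone who wants to sanity-check the arithmetic themselves, a quick sketch (draw numbers taken from the graph above, TDPs as just stated; nothing here beyond what is already in this post):
    [CODE]
    # Average/max power draw in watts from the graph above, plus rated TDPs.
    cards = {
        "RX 7900 GRE":       {"avg": 200.15, "max": 240.00, "tdp": 260},
        "RTX 4070 Ti Super": {"avg": 228.60, "max": 284.93, "tdp": 285},
    }

    gre = cards["RX 7900 GRE"]
    tis = cards["RTX 4070 Ti Super"]

    # Relative difference in average draw:
    extra = (tis["avg"] - gre["avg"]) / gre["avg"] * 100
    print(f"4070 Ti Super draws {extra:.1f}% more on average")  # ~14.2%

    # Neither card exceeded its rated TDP at peak in this run:
    for name, c in cards.items():
        print(name, "peak within TDP:", c["max"] <= c["tdp"])
    [/CODE]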

    Originally posted by Panix View Post
    AMD GPUs still suck in Blender and only use HIP
    It amazes me how, every time, you showcase how little you know about Blender.
    The Eevee render engine, again the default render engine in Blender, uses OpenGL. So does Workbench, the render engine used for the viewport during modeling, sculpting and animation preview. Eevee-Next, Blender's upcoming render engine that uses ray tracing, will use Vulkan (and will most likely become the default in the near future). And I have shared countless benchmarks with you where AMD has proven to have decent performance in one and to absolutely excel in the other compared to Nvidia.
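
    For reference, which engine you render with is just a per-scene setting in Blender's Python API; a minimal sketch (identifiers as of Blender 4.1, where Eevee-Next has not yet replaced legacy Eevee):
    [CODE]
    import bpy

    scene = bpy.context.scene

    scene.render.engine = 'BLENDER_EEVEE'      # default real-time engine (OpenGL in 4.1)
    scene.render.engine = 'BLENDER_WORKBENCH'  # viewport solid/preview engine
    scene.render.engine = 'CYCLES'             # the path tracer the HIP/CUDA/OptiX benchmarks measure
    [/CODE]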

    Originally posted by Panix View Post

    - which is slower than the hacked ZLUDA - and the 7900 XTX is slower at rendering than the 4070 series - whether it's the 12GB vanilla card or a 4070 Super also at 12GB of VRAM or the 16GB 4070 Ti Super - yes, those cards are able to use OptiX so a bit unfair but then it's AMD and Blender's fault - one or both of them - that haven't been able to use the OPEN SOURCE ray tracing element of HIP-RT.
    And again, all those performance metrics you mention are for Cycles without OIDN, and the benchmark files are unoptimized and have been for a long time (note how this guy talks about 400 samples, while I linked a thread in the post you quoted where someone was using 4 samples in Blender 4.1; 400 isn't even the default for the BMW27_GPU benchmark, which still ships with 1225 samples if you download it today). And as said before, AMD has its own render engine that can be used inside Blender and that uses RT acceleration, so that option exists if you want RT acceleration so dearly (you even get a free material library).

    If I run the BMW benchmark at 1440p with about 12 samples, but turn the denoiser and adaptive sampling on (they are off in the file since they did not exist back in 2016, but are now on by default whenever you make a new project) and disable tiling + BVH, I get a render time of 6.3 seconds. Meanwhile, someone who compared OptiX and CUDA on the RTX 4090 using BMW at 1440p clearly kept the default settings ("Sample x/1225" during rendering) and achieved 41.64s on CUDA and 28.46s on OptiX. Either I have some kind of insane futuristic graphics card to pull off 6.3s, a whopping quarter of the render time of an RTX 4090 using OptiX, or I know how to optimize render settings and have been around long enough to know that BMW is outdated and is mainly present in the Blender Opendata benchmark's test suite to calculate samples/min. And this isn't even mentioning the ways the actual materials in the scene can be optimized as well (the benchmark's last update dates from 2016, after all): Add Shader nodes that could be replaced by the Principled BSDF, Mix Shader nodes for the roughness that could be baked onto the already existing UVs, etc.
    [attached screenshot: BMW27 render finishing in 6.3 s]
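
    If you want to reproduce that kind of setup, here is roughly what those settings look like driven from Blender's Python console (a sketch against the 4.1 Cycles API; the BVH tweak lives in the scene file itself, and 6.3 s is obviously specific to my scene and hardware):
    [CODE]
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    # Settings that did not exist when BMW27 was authored in 2016,
    # but are on by default in new 4.x projects:
    scene.cycles.samples = 12                   # instead of the file's 1225
    scene.cycles.use_adaptive_sampling = True   # skip already-converged pixels
    scene.cycles.use_denoising = True           # reconstruct the image from few samples
    scene.cycles.denoiser = 'OPENIMAGEDENOISE'  # Intel OIDN, GPU-accelerated since 4.1

    # Render the frame in one pass instead of tiles:
    scene.cycles.use_auto_tile = False

    # 1440p output:
    scene.render.resolution_x = 2560
    scene.render.resolution_y = 1440

    bpy.ops.render.render(write_still=True)
    [/CODE]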

    Not to mention that I have told you countless times (even linking articles from multiple sites on the topic, written not by tech reviewers but by actual Blender artists) that, depending on your use case, it is very likely you won't ever need Cycles and by extension HIP/HIP-RT. I think I have asked you a million times what it is you want to make in Blender, but it seems you care more about Blender as a benchmark than as the amazing 3D tool that it is. The plain truth is that you are incredibly biased and hate AMD; I have linked everything so many times, and each time you bring up the same points in the same disingenuous and untrue way. You don't care, and you're not going to use Blender, that much is clear. You just need something to bash AMD with, and since it can't be gaming you pick the next most popular thing (which is funny, as for the 7900 GRE you point to a lot of gaming-focused articles).

    Originally posted by Panix View Post

    and the 7900 XTX is slower at rendering than the 4070 series - whether it's the 12GB vanilla card or a 4070 Super also at 12GB of VRAM or the 16GB 4070 Ti Super.

    Remember the Blender 3.6 deep dive I linked you before, where RT acceleration barely affected performance in one of the Cycles benchmarks? Remember how the RTX 4070 performed slightly worse than the RX 7900 XTX and RX 7900 XT with RT both on and off? Remember the comment you quoted just now, and how I told you that Blender 4.0 and 4.1 show a performance regression on every Nvidia card while AMD's RX 7900 XTX sees performance increases on both versions? Heck, I would ask if you remember the Blender Rookie video where the RTX 3060 12GB beat the RTX 3070 8GB because the benchmark in question was VRAM-intensive, so system RAM had to be used for the 3070, and the RTX 3070 took more than double the time (4 min 52 s) compared to the RTX 3060 12GB (2 min 9 s), showing the absolute importance VRAM can have (and 24GB is double 12GB). So saying "the 7900 XTX is slower at rendering than the 4070 series" is plain wrong; it's more along the lines of "the 7900 XTX is sometimes slower at rendering with Cycles than the 4070 series, depending on the scene (and 4070 card) in question".
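
    The VRAM-spill numbers from that video are easy to put in perspective (times as cited above):
    [CODE]
    # Render times from the Blender Rookie comparison cited above.
    rtx3070_8gb  = 4 * 60 + 52   # 292 s: scene spills into system RAM
    rtx3060_12gb = 2 * 60 + 9    # 129 s: scene fits in VRAM

    print(f"slowdown from spilling: {rtx3070_8gb / rtx3060_12gb:.2f}x")  # ~2.26x
    [/CODE]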

    AMD works with Blender; for some use cases it is faster, for some it is not. HIP-RT is not a requirement, not even for Cycles.
    If your use case or workflow benefits from Nvidia, you go with Nvidia; if it benefits from AMD, you go with AMD (and yes, quite a few do, not all, but enough that AMD is viable). What you don't do is fixate on a benchmark that is barely representative of one workflow, all while not even using the software in question, pretend to be an expert while only citing tech-reviewer benchmarks (which tend to be lazily done to begin with), and on top of that disregard any fact or knowledge coming from someone with actual experience in said software, just to bash a single GPU manufacturer. That would be pretty pathetic.

    Originally posted by Panix View Post

    So, stop smoking all those drugs you have.
    Maybe you need to start taking the ones the doctor is prescribing to you, because the memory loss is getting quite bad.
    Last edited by tenchrio; 08 May 2024, 06:26 AM.



  • Panix
    replied
    Originally posted by tenchrio View Post

    Man, if only there was a Linux site for this Linux-centric forum that did benchmarks and showed the 7900 GRE having better power consumption and performance on Linux compared to the RX 7900 XT(X) and 4070 Ti Super. Oh wait.
    [linked Phoronix benchmark article]


    And the peak power is even lower compared to the benchmarks you provided, hot damn.
    Also, isn't it hilarious that the one benchmark you actively commented on is the one you don't link, since it of course doesn't fit your narrative.

    Gaming performance, Vulkan compute, more VRAM, video editing in both DaVinci Resolve and Adobe Premiere Pro, 3D sculpting and modeling while using a rasterization-based engine like Eevee or BEER to render. Seems like there is more than just the FOSS aspect.

    Lol, as if you would ever do anything with productivity software. Your only understanding of it comes from benchmarks, and even there you need to cherry-pick whatever floats your boat; for example, you don't bring up that Nvidia cards currently have a performance regression in Blender 4.0. And while no one seems to have tested this for Blender 4.1, the Opendata benchmark site suggests that Nvidia cards regressed even further with 4.1 while AMD actually saw an increase in their score.

    So oh no, terrible news, right? Don't use Blender 4.1 with Nvidia! Except no, since Blender 4.1 comes with a ton of features that make the performance regression in those benchmarks negligible; one of my favorites is geometry node baking, which has doubled the performance in a lot of my scenes for both Cycles and Eevee with the click of a button. And if you actually learned anything from my previous explanations of VRAM, you might remember that baking is where you precalculate something, increasing VRAM use but also increasing performance, and that has now been added to geometry nodes.

    Meanwhile, an absolute madman on the Blender subreddit used the Intel Open Image Denoiser's new GPU acceleration that came with Blender 4.1 to render a video in Cycles at 5 FPS by lowering the sample count to 4 on an RTX 4070 Ti Super. So while the RTX 4070 Ti Super also saw a 10% drop in the Blender benchmark since 3.6, it has better performance when actually rendering, as OIDN is on by default and is now significantly faster thanks to GPU acceleration.
    As a small side note, the denoiser also uses VRAM, and the tooltip even warns that the bigger the scene, the more VRAM is required. From my own testing, OIDN still uses less VRAM than the Nvidia OptiX denoiser while offering better results, and to top it off it works on all 3 GPU vendors, unlike the OptiX denoiser.

    But neither of these performance gains is visible in the benchmark: most of the benchmark scenes used by Opendata are older and still use particle systems where today's tutorials would use geometry nodes, and OIDN is technically post-processing to get a clearer image from fewer samples (in the above case single digits, which is pretty nuts), while the Opendata benchmark score is the estimated number of samples per minute, summed over all benchmark scenes. So while the card now produces fewer samples in the latest Blender version, it actually renders faster, not to mention a ton of other great features like light linking.
    Huh? You're so full of manure...lol.... how is the power consumption better than a 4070 Ti Super? What are you looking at?

    No one in their right mind would try to argue a 7900 series even the 7900 GRE has better or more efficient power consumption than a 4070 Ti Super - maybe you meant the 4070? It's about even with that card on his graph.

    Even most AMD fans will concede that the 4070 series is more power efficient than any AMD 7900 RDNA 3 card.

    AMD GPUs still suck in Blender and only use HIP - which is slower than the hacked ZLUDA - and the 7900 XTX is slower at rendering than the 4070 series - whether it's the 12GB vanilla card or a 4070 Super also at 12GB of VRAM or the 16GB 4070 Ti Super - yes, those cards are able to use OptiX so a bit unfair but then it's AMD and Blender's fault - one or both of them - that haven't been able to use the OPEN SOURCE ray tracing element of HIP-RT.

    So, stop smoking all those drugs you have.



  • tenchrio
    replied
    Originally posted by Panix View Post
    I hear you but I am talking about the overall superior card - both software-related and hardware-related and afaik, the Nvidia side is better when comparing the 7800 xt and 7900 gre - if you want more vram, then, sure, you will have to pay for it - that's where Nvidia are 'bastards' - the 4070 Ti Super - has 16gb but power consumption is still more efficient - so what about your argument, huh? 288w vs 304w
    [linked TechPowerUp review: ASUS Radeon RX 7900 GRE TUF OC]

    Power spikes and V-sync at 60Hz - major power differences there, too - with the GRE consuming even more power than its 7900 'siblings.' Wow.
    It's also better for GPU compute and other productivity tasks (with a few exceptions):
    Man, if only there was a Linux site for this Linux-centric forum that did benchmarks and showed the 7900 GRE having better power consumption and performance per dollar on Linux compared to the RX 7900 XT(X) and 4070 Ti Super. Oh wait.
    [linked Phoronix benchmark article]


    And the peak power is even lower compared to the benchmarks you provided, hot damn.
    Also, isn't it hilarious that the one benchmark you actively commented on is the one you don't link, since it of course doesn't fit your narrative.

    Originally posted by Panix View Post
    The FOSS aspect is the ONLY reason to even CONSIDER the amd gpu - and if explicit sync 'solves' some issues - then there's even more incentive to pick the nvidia card -whichever it is.
    Gaming performance, Vulkan compute, more VRAM, video editing in both DaVinci Resolve and Adobe Premiere Pro, 3D sculpting and modeling while using a rasterization-based engine like Eevee or BEER to render. Seems like there is more than just the FOSS aspect.

    Originally posted by Panix View Post
    I'm waiting to see if it does - as AMD doesn't look like it's fixing any of the productivity software problems - not even close or anytime soon so I have no choice.

    Lol, as if you would ever do anything with productivity software. Your only understanding of it comes from benchmarks, and even there you need to cherry-pick whatever floats your boat; for example, you don't bring up that Nvidia cards currently have a performance regression in Blender 4.0. And while no one seems to have tested this for Blender 4.1, the Opendata benchmark site suggests that Nvidia cards regressed even further with 4.1 while AMD actually saw an increase in their score.

    So oh no, terrible news, right? Don't use Blender 4.1 with Nvidia! Except no, since Blender 4.1 comes with a ton of features that make the performance regression in those benchmarks negligible; one of my favorites is geometry node baking, which has doubled the performance in a lot of my scenes for both Cycles and Eevee with the click of a button. And if you actually learned anything from my previous explanations of VRAM, you might remember that baking is where you precalculate something, increasing VRAM use but also increasing performance, and that has now been added to geometry nodes.

    Meanwhile, an absolute madman on the Blender subreddit used the Intel Open Image Denoiser's new GPU acceleration that came with Blender 4.1 to render a video in Cycles at 5 FPS by lowering the sample count to 4 on an RTX 4070 Ti Super. So while the RTX 4070 Ti Super also saw a 10% drop in the Blender benchmark since 3.6, it has better performance when actually rendering, as OIDN is on by default and is now significantly faster thanks to GPU acceleration.
    As a small side note, the denoiser also uses VRAM, and the tooltip even warns that the bigger the scene, the more VRAM is required. From my own testing, OIDN still uses less VRAM than the Nvidia OptiX denoiser while offering better results, and to top it off it works on all 3 GPU vendors, unlike the OptiX denoiser.

    But neither of these performance gains is visible in the benchmark: most of the benchmark scenes used by Opendata are older and still use particle systems where today's tutorials would use geometry nodes, and OIDN is technically post-processing to get a clearer image from fewer samples (in the above case single digits, which is pretty nuts), while the Opendata benchmark score is the estimated number of samples per minute, summed over all benchmark scenes. So while the card now produces fewer samples in the latest Blender version, it actually renders faster, not to mention a ton of other great features like light linking.
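    To make it concrete why those gains can't show up in the score, here is a hypothetical sketch of that metric, samples per minute summed over scenes (the numbers are made up purely for illustration):
    [CODE]
    def opendata_score(scenes):
        """Hypothetical sketch of the Opendata metric: estimated
        samples per minute, summed over all benchmark scenes."""
        return sum(s["samples"] / (s["render_seconds"] / 60) for s in scenes)

    # Illustrative numbers only: a card that renders fewer samples/min in 4.1
    # scores lower, even if GPU-accelerated OIDN lets it finish usable frames
    # from only 4 samples instead of hundreds.
    blender_36 = [{"samples": 1225, "render_seconds": 60}]
    blender_41 = [{"samples": 1225, "render_seconds": 66}]  # ~10% regression

    print(opendata_score(blender_36))  # 1225.0 samples/min
    print(opendata_score(blender_41))  # ~1113.6 samples/min, yet real frames finish sooner
    [/CODE]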
    Last edited by tenchrio; 03 May 2024, 07:40 PM.



  • oiaohm
    replied
    Originally posted by Panix View Post
    I hear you but I am talking about the overall superior card - both software-related and hardware-related and afaik, the Nvidia side is better when comparing the 7800 xt and 7900 gre - if you want more vram, then, sure, you will have to pay for it - that's where Nvidia are 'bastards' - the 4070 Ti Super - has 16gb but power consumption is still more efficient - so what about your argument, huh? 288w vs 304w
    https://www.techpowerup.com/review/a...re-tuf/39.html
    You need to look at that again. There is a stock RX 7900 GRE in their list. The ASUS is overclocked.

    There is another thing: DisplayPort 2.1 is on the AMD cards, whereas the Nvidia card only has DisplayPort 1.4a. This is important for hooking up some monitors; in particular, you need DisplayPort 2.1 for a DisplayPort-to-HDMI 2.1 dongle that works well.
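
    To put rough numbers on the link difference (nominal four-lane rates after line encoding, uncompressed RGB, blanking overhead ignored, so treat this as a back-of-the-envelope sketch only):
    [CODE]
    # Effective four-lane throughput in Gbit/s, after line encoding:
    DP_14A_HBR3  = 32.4 * 0.8        # 8b/10b    -> ~25.9 Gbit/s
    DP_21_UHBR20 = 80.0 * 128 / 132  # 128b/132b -> ~77.6 Gbit/s

    def needed_gbps(width, height, hz, bits_per_pixel=30):  # 10-bit RGB
        return width * height * hz * bits_per_pixel / 1e9

    signal = needed_gbps(3840, 2160, 144)  # 4K 144Hz 10-bit: ~35.8 Gbit/s
    print(f"4K144 10-bit needs ~{signal:.1f} Gbit/s")
    print("fits DP 1.4a:", signal <= DP_14A_HBR3)   # False (needs DSC)
    print("fits DP 2.1 :", signal <= DP_21_UHBR20)  # True
    [/CODE]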

    Less RAM and worse connectivity is what you get with the 4070 Ti Super. Remember, I said my target is to use the card for 10 years. Poorer connectivity is a big factor.



  • Panix
    replied
    Originally posted by oiaohm View Post

    The RX 7800 XT is a 16GB card, and the RX 7900 GRE is a 16GB card, while the 4070 Super is a 12GB card. Guess what: running RAM consumes power. The power difference between the 4070 Super and the RX 7800 XT / RX 7900 GRE is mostly RAM.

    The 4070 Super is not cheaper everywhere. That 4GB of RAM difference can decide whether a compute workload works well or not. A 16GB card is a lot more expensive from Nvidia.

    Linux users like me build systems we run for at least a decade. With AMD I will have driver updates for that time frame; with Nvidia, not so much. So does more Nvidia performance matter if in 5 years I have to replace the Nvidia card because its drivers don't want to work with the then-current X11/Wayland stack, versus the AMD card that is going to be good for 10+ years?

    One factor in long-term performance is driver support. It is common for AMD developers to find something wrong in the current driver, see that an old driver had the same problem on Linux, and fix that as well.

    Building a system to run for a decade, I am comparing the 7900 GRE against the 7800 XT; the 4070 Super from Nvidia does not come into it.
    I hear you but I am talking about the overall superior card - both software-related and hardware-related and afaik, the Nvidia side is better when comparing the 7800 xt and 7900 gre - if you want more vram, then, sure, you will have to pay for it - that's where Nvidia are 'bastards' - the 4070 Ti Super - has 16gb but power consumption is still more efficient - so what about your argument, huh? 288w vs 304w
    [linked TechPowerUp review: ASUS Radeon RX 7900 GRE TUF OC]

    Power spikes and V-sync at 60Hz - major power differences there, too - with the GRE consuming even more power than its 7900 'siblings.' Wow.
    It's also better for GPU compute and other productivity tasks (with a few exceptions):

    The FOSS aspect is the ONLY reason to even CONSIDER the amd gpu - and if explicit sync 'solves' some issues - then there's even more incentive to pick the nvidia card -whichever it is. I'm waiting to see if it does - as AMD doesn't look like it's fixing any of the productivity software problems - not even close or anytime soon so I have no choice.
    Last edited by Panix; 30 April 2024, 05:25 PM.



  • oiaohm
    replied
    Originally posted by Panix View Post
    The 4070 is a bit cheaper, here. Thus, when you compare tasks other than gaming, the 4070 Super is a much better choice. The only disadv. it has is lacking vram in comparison.
    The RX 7800 XT is a 16GB card, and the RX 7900 GRE is a 16GB card, while the 4070 Super is a 12GB card. Guess what: running RAM consumes power. The power difference between the 4070 Super and the RX 7800 XT / RX 7900 GRE is mostly RAM.

    The 4070 Super is not cheaper everywhere. That 4GB of RAM difference can decide whether a compute workload works well or not. A 16GB card is a lot more expensive from Nvidia.

    Linux users like me build systems we run for at least a decade. With AMD I will have driver updates for that time frame; with Nvidia, not so much. So does more Nvidia performance matter if in 5 years I have to replace the Nvidia card because its drivers don't want to work with the then-current X11/Wayland stack, versus the AMD card that is going to be good for 10+ years?

    One factor in long-term performance is driver support. It is common for AMD developers to find something wrong in the current driver, see that an old driver had the same problem on Linux, and fix that as well.

    Building a system to run for a decade, I am comparing the 7900 GRE against the 7800 XT; the 4070 Super from Nvidia does not come into it.



  • tenchrio
    replied
    Originally posted by Panix View Post
    Blah, blah, blah, more bs from you so that you can insult - as that is all you do. I agree with oiaohm and prefer discussing with him because he's above insulting unlike you.
    LOL, so you can give but can't take. Typical hypocrite.
    And a reminder that you started with the insults once again, with your "reading comprehension" and "this site is full of delusional AMD fanboys so no chance of that happening", probably because deep inside you already knew you were wrong. And the best part is that now you admit it!! You admit it is different, but have to keep arguing about the name of a card while it is unrelated to the topic that was discussed previously. But hey, progress I guess.

    Originally posted by Panix View Post
    I do understand the chip difference but regardless, it doesn't change the fact the 7900 GRE was named under the 7900 series and is heavily gimped - your average consumer probably won't have any idea and might be assuming near-7900 XT performance - which it DOES NOT come close.
    That was never relevant to this thread; you just randomly brought it up to hijack the thread because, once again, you have to hate on AMD, regardless of the topic, regardless of the news being discussed. You have, for some reason, a burning desire to bring AMD up and bash it. The thread here was about Nvidia's open-source Nouveau driver and how, with automatic re-clocking, it is getting particularly confusing what is and isn't supported under Nouveau (all thanks to Nvidia actively standing in the way of the open-source driver). The card is clearly named differently; how about using that reading comprehension of yours to read the name.

    Originally posted by Panix View Post
    The reason I didn't care much about the 710 and 1030 fiasco is because Nvidia is already not really compliant or known for supporting open source anyway - they're already hated for this - so, the chip thing isn't very surprising from them.
    Except that this discussion right here is about Nvidia slowly giving us that compliance, at least somewhat, for Turing and newer. Which leaves us in a weird spot with Nvidia GPUs that have automatic reclocking but don't use the new GSP firmware. We need it for Kepler because it is EOL, and we want it for Maxwell as it is reaching EOL, hence why it is relevant in this thread about the open-source Nvidia driver.

    Originally posted by Panix View Post
    I shopped for a GT 1030 a while ago when I had an old sff Dell - so, I was aware of what they did with it. Since, I was looking for used card and refused to buy new - partly because it was overly expensive for a small form factor gpu, partly because of what Nvidia DID - so, I was not contributing to Nvidia revenues and profit!
    Okay, cool story.

    Originally posted by Panix View Post
    It's basically a GPU to get video, imho - so, it's pretty pathetic they'd cripple that one and, to save money further, use the wrong tech/chip with it. But, that's Nvidia? So, yes, the fact it's not a gaming chip - I think it is RELEVANT! More ppl will buy a gaming card so more ppl will likely be buying the 7900 'Rabbit card' - which is not really a 7900 series 'under the hood' - with defective chips, also - we all agree, right? The semantics of arguing it's 'still a 7900' notwithstanding. I already posted a number of ppl discussing it that proves without doubt it's only a 7900 in name - since there's a bunch of gimping done because it was silicon they could still salvage.
    Oh my god, argumentum ad populum. What, do you think the earth is also flat and Scientology is real because a bunch of people tell you so, too?
    It always amazes me how idiots flock together in the face of facts and go "wE cAn'T aLl Be WrOnG!".

    As oiaohm and I have been saying, the 7900 GRE uses the same silicon as the 7900 XT, but a cut-down version to salvage chips that fell short.
    The xda-developers article you quoted even says that "on paper it is extremely close to the 7900XT" (how about you read your sources for once): just fewer cores, less bus width/bandwidth and less VRAM. The 7800 XT uses Navi 32; the 7900 GRE and 7900 XT both use Navi 31. With Nvidia this naming behavior is the same: the RTX 4070 and 4070 Ti both use AD104... at launch, that is, as starting March 2024 Nvidia also began to ship RTX 4070s with a cut-down AD103 (RTX 4080 chips), and just like before you won't know whether yours has AD103 or AD104 until you plug it in or take it apart entirely (the effects are also hard to measure, as reviewers would have to get lucky and receive both an AD103 and an AD104 sample, and again they aren't sold as separate models; both are put inside the same RTX 4070 box).

    But if you hate the 7900 GRE because its performance is less than the other cards it shares its die with (for some reason), wait until I hit you with this bombshell: Nvidia's 4000-series mobile GPUs always use the silicon of one tier lower. So the RTX 4090 (and the 4090 D) uses AD102, but the RTX 4090 Mobile uses AD103, the same as the RTX 4080; the RTX 4080 Mobile uses AD104, the same as the RTX 4070 and 4070 Ti; and this goes on until AD107, where finally the RTX 4050 and RTX 4050 Mobile are together at last (as are the RTX 4060 and 4060 Mobile; well, sort of, as just like the RTX 4070 with AD103, Nvidia started shipping RTX 4060s with AD106, and again you can't know whether yours has it or not). [s]My god, you must be livid!! How can Nvidia commit the absolute crime of saying it is a 4080 (Mobile) but use 4070 silicon![/s]
    And mind you, this wasn't the case last gen, where the corresponding mobile GPU did have the same silicon, so the RTX 3070 had GA104, same as the RTX 3070 Mobile.

    Also, originally there was an RTX 4080 12GB, and if you look at its spec sheet: AD104 silicon instead of the AD103 in the RTX 4080 16GB, the 7680 shading units, the 240... TMUs, the 80 ROPs... wait a minute, that is identical to the RTX 4070 Ti spec sheet! Nvidia actually tried to sell a "4080" which in reality had 4070 silicon and far lower specs than the 4080 16GB; they then "unlaunched" this card and relaunched it as the 4070 Ti, which makes more sense since the RTX 4070 also uses AD104. Mind you that historically every xx70 and Ti card had a corresponding x04 die (3070 (Ti) => both GA104, 1070 (Ti) => both GP104, 970 => GM204, Turing being weird and making the RTX 2070 TU106 but the RTX 2070 Super TU104). So, uh, Nvidia has actually attempted and does exactly what you accuse AMD of (wrongfully, since the name stems less from performance and more from the silicon, as it has for ages; and man, finding out that stuff about the 4000-series mobile GPUs would not have been so hilarious if it wasn't for your ridiculous opinion).
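
    Condensed into one lookup, the die assignments described above (a summary of this post's argument, not an official spec dump):
    [CODE]
    # Card -> GPU die, per the discussion above.
    die = {
        "RX 7900 XT":      "Navi 31",  # cut down
        "RX 7900 GRE":     "Navi 31",  # further cut down (salvaged chips)
        "RX 7800 XT":      "Navi 32",
        "RTX 4080":        "AD103",
        "RTX 4070 Ti":     "AD104",    # the "unlaunched" RTX 4080 12GB
        "RTX 4070":        "AD104",    # some AD103 units since March 2024
        "RTX 4090 Mobile": "AD103",    # mobile parts use one tier lower
        "RTX 4080 Mobile": "AD104",
    }

    # Group the cards by die to see who shares silicon with whom:
    for card, d in sorted(die.items(), key=lambda kv: kv[1]):
        print(f"{d}: {card}")
    [/CODE]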

    And still: all of this was off topic, as the original discussion was about the state of the Nouveau drivers and how GT 710M users are in a coin-flip predicament over whether they can use Nouveau or must use the Nvidia legacy drivers now that the card is no longer supported by Nvidia. Your terribly informed rant was never related, and as usual its premise is your own delusion and a far cry from reality for people who actually have knowledge of GPU manufacturing, naming and silicon.
    Originally posted by Panix View Post
    I don't hate AMD - well, maybe somewhat -
    No, you absolutely do. You posted in a thread whose article had already been deleted for over a day, and despite it being about a performance fix (on APUs, but hey, the article was deleted, how could you know), you immediately went to bash AMD for it (hell, even if it had been for GPUs, wouldn't that be a good thing, that AMD fixes something you find important?). That is clearly indicative of an unhealthy obsession. Here is the kicker: when Intel fixed a similar issue for their dedicated GPUs this month, you didn't randomly start posting there bashing Intel (or AMD, which wouldn't make sense, but hey, you are doing it here), despite that article still being up (and thus still accessible through the front page and, most importantly, readable).

    Originally posted by Panix View Post
    because they refuse to support their hardware and neglect the software side - they don't support programs in the productivity sphere -
    Oh god, here we go again. I think we have done this song and dance what, 4 times already?
    I shared the benchmarks of DaVinci Resolve and Premiere Pro before; the AMD cards punch well above their price in that department (compared to Nvidia).
    So no, you are wrong, as usual. AMD is behind with HIP-RT, but that is less "don't support" and more "in development", and I have already given you a dozen 3D use cases where HIP-RT and OptiX performance do not apply. AMD also developed the current Blender USD Hydra plugin that ships with Blender, which can be used as a render delegate for more advanced render engines like DreamWorks' MoonRay or Pixar's RenderMan, and it is usable with both Nvidia and Intel; and, as said multiple times before, their own render engine can also be used on Nvidia and Intel cards. So they do support the productivity sphere, and their support of that sphere even benefits Nvidia and Intel (the things you know if you are actually in the productivity sphere).

    Originally posted by Panix View Post
    so, yeah, they are like Nvidia in that they put profit and shortcuts above a good hardware/good software stack. Their processors are good - if I was buying a new PC or build, I'd definitely pick AM5 today.
    ...what? If there is one thing you have to hand to Nvidia, it is that CUDA is a great (albeit closed-source) software stack, exceptionally well optimized compared to OpenCL and with good documentation. CUDA being that good is what helped Nvidia achieve its current dominance (not to mention the insane performance/value of the Pascal line during a time when people were taking bets on when AMD would be declared bankrupt). But ROCm is slowly and steadily getting better and in some cases has already proven to offer better value, which is great for the consumer; stronger competition leads to better products, just look at the CPU space before and after first-gen Ryzen.

    Originally posted by Panix View Post
    Several links detail the tech differences. I'd give the gpu a hard pass - most ppl just care about gaming performance and the price.
    There is also a difference between the 7900 XT and 7900 XTX, and between the 4070 and 4070 Ti; that doesn't mean the naming makes no sense.
    And okay, sure, I think most people are of that opinion on that card. That doesn't mean you need to randomly rant about it (or how about doing it in the Phoronix review of the card; well, you actually already did, but clearly you need to get it out of your system).



  • mrg666
    replied
    Originally posted by Panix View Post
    Wrong again, Mr. Oiaohm.... I know you are trying but this is more evidence that AMD just doesn't care about the consumer and is pricing their hardware way too high - and they're just like Nvidia. Everything I've read indicates that the 7900 GRE is geared towards being a 2k or 1440p card. It's going against the 4070 and 4070 Super - one could argue it's a 'better pick' than the 4070 but that can't be said with a 4070 Super comparison.

    In my country, the 4070 Super is only $40 more. The cheapest 7900 GRE is $750 and the cheapest 4070 Super is $790.

    [linked article: Radeon RX 7900 GRE vs. GeForce RTX 4070 Super, benchmarked across 58 game configurations covering rasterization, ray tracing, and upscaling]


    vs 4070:


    The 4070 is a bit cheaper, here. Thus, when you compare tasks other than gaming, the 4070 Super is a much better choice. The only disadv. it has is lacking vram in comparison.

    This is a typical 'loss' for AMD though - with their gpus. It's the typical pattern.

    As for the 7800 xt - $680.
    [linked YouTube video: RX 7800 XT vs RX 7900 GRE comparison]


    It seems like there's around a 7-8 fps gain for the 7900 gre. Anyway, I don't care too much about these 2 cards - they seem mediocre to me - worse power efficiency than the 4070 series cards and weaker, overall performance.
    Well, the videos you linked are for Windows. As a Linux enthusiast, I find AMD less trouble with as good (maybe better) performance on Linux. I would not buy Nvidia for Linux, as it was a very frustrating experience for me. Just my 2c though; it is your money.



  • Panix
    replied
    Originally posted by oiaohm View Post

    Also, AMD has priced the 7900 GRE as if it were an updated 7800 XT, so the price tag of the 7900 GRE is not close to the 7900 XT either.

    RX 7900 GRE: $599.99 USD
    RX 7900 XT: $936.71 USD
    (Amazon prices for the PowerColor Red Devil / Hellhound cards.)
    I had to go down to the cheaper Fighter class of card to get:
    PowerColor Fighter AMD Radeon RX 7800 XT: $489.99 USD. The Fighter is a $100 cheaper heatsink than the Red Devils; recommended retail has a $50 difference between the RX 7900 GRE and the RX 7800 XT of the same type.

    The RX 7900 GRE's performance in the benchmarks is, the majority of the time, slightly better than your RX 7800 XT. You could say the RX 7900 GRE is priced for the performance it in fact gives.

    [linked GamersNexus review: "AMD Radeon RX 7900 GRE GPU Review & Benchmarks vs. RX 7900 XT, 7800 XT, RTX 4070 Super", March 11, 2024. Key specs table from the review:]

    Card              RX 7900 XT     RX 7900 GRE    RX 7800 XT
    GPU Die           Navi 31        Navi 31        Navi 32
    Architecture      RDNA 3         RDNA 3         RDNA 3
    Memory            20GB G6        16GB G6        16GB G6
    Memory Bus        320-bit        256-bit        256-bit
    Bandwidth         800GB/s        576GB/s        624GB/s
    Compute Units     84             80             60
    Stream Processors 5376           5120           3840
    RT Cores          84             80             60
    TMUs              336            320            240
    ROPs              192            192            96
    Game/Boost Clock  2000-2400MHz   1880-2245MHz   2124-2430MHz
    Board Power       315W           260W           263W

    [Review takeaways: original MSRP $550. The 7900 GRE generally lands equidistant between the RX 7900 XT and RX 7800 XT, trades places with the RTX 4070 Super in the rasterized tests and is disproportionately favored at 4K, while Nvidia holds a significant lead in heavy ray-tracing titles like Cyberpunk. Measured full-load power draw: about 270W for the GRE vs 222W for the 4070 Super and around 313W for the 7900 XT.]

    The GamersNexus write-up is important.

    The 7900 XT is 17% faster than the 7900 GRE, but remember you are paying about a third less for the GRE. On performance per dollar, the 7900 GRE is a better buy than the 7900 XT.
    The 7900 GRE is 13% faster than the RX 7800 XT. At recommended retail prices, on performance vs. price, the 7900 GRE is a better buy than the RX 7800 XT. In a lot of cases you are not buying at recommended retail price, though, so the RX 7800 XT can in some cases be the better buy on performance vs. price.
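
    Using the GamersNexus prices and the deltas above, the performance-per-dollar point is easy to check (a rough sketch; performance normalized to the GRE, prices as quoted in the review):
    [CODE]
    # MSRP/street price and relative performance (GRE = 1.00) from the review above.
    cards = {
        "RX 7800 XT":  {"price": 500, "perf": 1.00 / 1.13},  # GRE leads it by ~13%
        "RX 7900 GRE": {"price": 550, "perf": 1.00},
        "RX 7900 XT":  {"price": 730, "perf": 1.17},         # XT leads GRE by ~17%
    }

    for name, c in cards.items():
        # ~1.77, ~1.82, ~1.60 respectively: the GRE comes out on top at MSRP.
        print(f"{name}: {c['perf'] / c['price'] * 1000:.2f} perf per $1000")
    [/CODE]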

    It is kind of surprising, given how much has been cut out of the 7900 GRE, that it is only a 17% performance loss from the RX 7900 XT down to it. I think there is over 20% between the RX 7900 XT and the RX 7900 XTX.

    In reality, the 7900 GRE performs well enough to be a historic non-XT/XTX card. It would have been a lot worse if AMD had just labelled it RX 7900 and called it done. Yes, the card has the performance for that.

    Consider that there is roughly a 20% performance difference between the XT and the XTX with AMD, and that an untitled card should historically sit 20-30% behind. We have not seen AMD release untitled cards in recent years.

    Consumers would have been more confused if AMD had just released an untitled card after all these years, because people would over-presume how an untitled card performs. So they created a new title: the GRE.

    Panix, being near China I have seen different AMD cards for the China-only market, without XT/XTX on them, that are the 20-30% slower versions of the XT cards from older generations, normally only in small batches (the duds). We have gotten used to not seeing the duds in markets outside China. Something must have gone really wrong with Navi 31 die production; there had to be a lot of duds for the GRE to be coming to the Western market. I would say they suspected this when they started making the GRE card, which is why it carried the GRE marking from the start instead of the normal no marking.

    Yes, the Navi 31 die is the first time AMD has attempted to mass-produce a multi-chip GPU.

    Yes, the 7900 GRE's performance is a lot closer to the 7900 XT than one would suspect from reading the specification sheet and seeing what has been cut out. AMD also priced the RX 7900 GRE as if it were 30% slower, and it turned out not to be.
    Wrong again, Mr. Oiaohm.... I know you are trying but this is more evidence that AMD just doesn't care about the consumer and is pricing their hardware way too high - and they're just like Nvidia. Everything I've read indicates that the 7900 GRE is geared towards being a 2k or 1440p card. It's going against the 4070 and 4070 Super - one could argue it's a 'better pick' than the 4070 but that can't be said with a 4070 Super comparison.

    In my country, the 4070 Super is only $40 more. The cheapest 7900 GRE is $750 and the cheapest 4070 Super is $790.

    [linked article: Radeon RX 7900 GRE vs. GeForce RTX 4070 Super, benchmarked across 58 game configurations covering rasterization, ray tracing, and upscaling]


    vs 4070:


    The 4070 is a bit cheaper, here. Thus, when you compare tasks other than gaming, the 4070 Super is a much better choice. The only disadv. it has is lacking vram in comparison.

    This is a typical 'loss' for AMD though - with their gpus. It's the typical pattern.

    As for the 7800 xt - $680.
    [linked YouTube video: RX 7800 XT vs RX 7900 GRE comparison]


    It seems like there's around a 7-8 fps gain for the 7900 gre. Anyway, I don't care too much about these 2 cards - they seem mediocre to me - worse power efficiency than the 4070 series cards and weaker, overall performance.

