AMD Announces The Radeon RX 7600 XT For 1080p~1440p Gaming At $329


  • Anux
    replied
    Originally posted by khnazile View Post
I don't care that much about performance; integrated graphics suck in more than one way
    Care to explain? I find them to be perfect for anything that doesn't need much 3D/compute perf.



  • khnazile
    replied
    Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post

But why? You could spend the extra money that this hypothetical RX 7300 would cost and get a better APU that would offer similar GPU performance. Now, a single-slot 75 W bus-powered card would be interesting. With 75 W all to itself and more than an order of magnitude more space to work with, you should be able to make something significantly faster than what the APU would offer.
I don't care that much about performance; integrated graphics suck in more than one way, especially if it's integrated graphics by AMD. People need modern low-power GPUs in the form of add-in cards, and the 6400 was never a good solution, with only two display outputs, no hardware video encoding, and a 4x PCIe bus.



  • pong
    replied
The only way we'll stop getting abused by GPU makers and their limitations is for platform CPU / RAM design to actually evolve (just as GPUs have over the past 30 years): RAM channels wide enough for throughput comparable to GPUs, and CPU / motherboard support for 4k-16k+ "SIMD" processors similar to what GPUs have. Take the non-graphical but performance-relevant parts of a GPU and put them into the core architecture, where any other such "general compute / memory performance" features belong. Then save the GPU for display interfaces, ray-tracing hardware, and whatnot.
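
To put rough numbers on that bandwidth gap (a back-of-the-envelope sketch; the dual-channel DDR5-6000 desktop config and the 128-bit / 18 Gbps GDDR6 figures for a 7600 XT-class card are illustrative assumptions, not measurements):

Code:
# Back-of-the-envelope peak memory bandwidth, CPU platform vs. GPU board.

def ddr_gbs(channels, bus_bits, mt_s):
    # channels * bytes per transfer * transfers per second -> GB/s
    return channels * (bus_bits / 8) * mt_s / 1e3

def gddr_gbs(bus_bits, gbps_per_pin):
    # bus width in bits * per-pin data rate / 8 -> GB/s
    return bus_bits * gbps_per_pin / 8

cpu = ddr_gbs(channels=2, bus_bits=64, mt_s=6000)   # dual-channel DDR5-6000
gpu = gddr_gbs(bus_bits=128, gbps_per_pin=18)       # 7600 XT-class GDDR6

print(f"CPU: {cpu:.0f} GB/s, GPU: {gpu:.0f} GB/s ({gpu / cpu:.1f}x)")
# CPU: 96 GB/s, GPU: 288 GB/s (3.0x)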

And yeah, given that an important role for GPUs today is general-purpose compute as well as ML inference on your own hardware, GPUs should be sold on raw performance, not upscaling hackery. But as of now they don't share nicely (multiple users, different application contexts, VMs, ...), they're unreliable, they don't even physically fit well into PC cases / motherboards, and the power supply connections are nonsense. How is this remotely going to scale 4-8 years into the future?

    Originally posted by pieman View Post
    upscaling
I don't blame AMD, nor Nvidia for this. I blame the user base who demand upscaling and frame rate interpolation to such a degree that they are willing to pay an extra $100-$600 for a "reduce my resolution only to upscale it" button. [...]



  • Anux
    replied
    Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
    But why? You could spend the extra money that this hypothetical RX 7300 would cost and get a better APU that would offer similar GPU performance. Now a single slot 75W bus powered card would be interesting. With 75W all to itself and more than an order of magnitude more space to work with, you should be able to make something significantly faster than what the APU would offer.
I agree to some degree. Last gen we had the 6400, which was heavily crippled (no hardware encoding and only x4 PCIe lanes), so it's not an option for upgrading an old system, and it's questionable whether you'd want it in a new one. I also remember AMD saying at some point that they won't release slower GPUs, so as not to cut into their APU segment.
The 6400 had a 54 W TDP (which probably peaks at 75 W, so you won't get one with 75 W printed on the box); we might get a 7400 at some point with at least 50% more performance than the fastest APU.

But I don't really see a use for this performance class: if I don't need much 3D performance, I'm probably fine with an iGPU, and if I need the power of a dGPU, 50% more than integrated is probably not enough. And if it's as crippled as the 6400, it's hardly of any use.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
Still too much money for that graphics card. The Intel Arc A770 is 16GB and can be found for less than $300. Problem is the Arc GPUs require ReBAR for decent performance.
The Arc GPUs have a bigger problem than ReBAR:

    Originally posted by Teggs View Post
    The benchmarks of Meteor Lake (Arc) graphics performance are relevant and interesting. However, this only matters if the program runs correctly. I think MSI is about to faceplant in public when they release a Steam Deck competitor and half the games the customer tries to play either run like garbage or outright crash. Intel's driver efforts towards gaming still seem to be getting individual popular games to run. Customers don't only play popular or new games. Unless the driver is in a better state than has been reported, selling a Meteor Lake handheld is likely to generate bad press for MSI and Intel both.
    MSI is at least clear of the 'why didn't you cats install Steam OS on that thing?' question. Heh.

In the end, you pay that €30 price difference between the Arc A770 16GB and the 7600 XT 16GB for the better driver support,

because the Intel GPU driver is shit. Just see the quoted comment from Teggs...



  • RealNC
    replied
    Originally posted by pieman View Post
    Look at the Starfield drama over Bethesda only having FSR at release and not DLSS. I saw people with Nvidia 3080's and 3090's MAD they can't enable DLSS at 1080 / 1440p... both of those cards are able to run Starfield at max settings at those resolutions, natively, with no need for upscaling. BUT PEOPLE WANTED IT, and it HAD TO BE DLSS!
    To be fair, this is mostly because that game's anti-aliasing looks like dog shit. DLAA (the anti-aliasing part of DLSS) is superior. In games where it's not possible to control DLSS and DLAA separately, you can apply third-party mods that untangle these two settings so you can use DLAA without any upscaling.



  • Dukenukemx
    replied
    Originally posted by ColdDistance View Post

    The Arc A770 can be cheaper, but ANV, the Vulkan driver for Intel GPUs, is still a bad toy compared to RADV.
That's the problem with Intel: their drivers are just not as good as AMD's on Linux. As it is right now, the GPU market is stupid because nobody offers a sub-$250 GPU with more than 8GB of VRAM. A 6600 XT can be had for $240 new. An RTX 3060 12GB card is about $285 new. An RX 7600 8GB is $270 new. The A770 16GB is $300 new. The RX 6700 10GB cards don't exist, because people bought them all up. The RX 6700 XT 12GB is available at $320 new.

Why would I buy the RX 7600 XT 16GB for $330 when the 6700 XT 12GB is $320? The whole point is getting more than 8GB of VRAM, because modern games push beyond 8GB at 1080p. I would rather go for the RTX 3060 12GB, because at $285 that's the best deal right now, though as a Linux user I'd rather avoid dealing with Nvidia cards, since their drivers don't always play nice. There are so many 8GB GPUs on the market that to even consider one right now is a waste of money, unless you don't have money. If you don't have money, then it isn't going to be the 7600 XT or even the regular 8GB model; you're probably better off buying a plain RX 6600 for $200. AliExpress has RX 6600Ms for $159 brand new. It's a Soyo brand based on a laptop GPU, or whatever that is, but when you're broke, you're broke. You can forget eBay, because for some reason the difference between a new and a used GPU is $20-$30 at best. That isn't worth the headache and lack of warranty from what is likely a used mining GPU.
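
As a quick dollars-per-gigabyte-of-VRAM check on those prices (a throwaway sketch using the numbers quoted in this post, not live market data):

Code:
# $/GB of VRAM for the cards priced above (numbers from this post).
cards = {
    "A770 16GB":       (300, 16),
    "RX 7600 XT 16GB": (330, 16),
    "RTX 3060 12GB":   (285, 12),
    "RX 6700 XT 12GB": (320, 12),
    "RX 6600 XT 8GB":  (240, 8),
    "RX 7600 8GB":     (270, 8),
}
# Sort cheapest-per-GB first.
for name, (usd, gb) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:16} ${usd} / {gb} GB = ${usd / gb:5.2f} per GB")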
    Last edited by Dukenukemx; 08 January 2024, 09:09 PM.



  • Dukenukemx
    replied
    Originally posted by panikal View Post

This was true with my RX 6600, i.e. AMD SAM / ReBAR boosted my frame rates by 20%. I've read Nvidia is perhaps more spotty, but with my AMD stack it was definitely required. Generally any newer motherboard should support this too, I thought, like in the last 2-3 years-ish?
I have an MSI B350 Tomahawk that I thought would never get ReBAR, but after like 3-4 years they finally updated the BIOS and now it has it. I enabled it and switched over to UEFI, because Linux plus a Vega 56 should give some performance benefits. I know the Omega drivers for AMD pretty much do the same thing on Windows.
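
If you want to confirm ReBAR is actually active after a BIOS update like that, here's a minimal sketch (assumes Linux's standard sysfs PCI layout, where each device's "resource" file lists one "start end flags" hex triple per region):

Code:
# Rough ReBAR check: print BAR sizes for display-class PCI devices.
# With ReBAR active, one memory BAR should be roughly the full VRAM size
# (e.g. ~8192 MiB on an 8GB card) instead of the classic 256 MiB window.
import glob

for dev in glob.glob("/sys/bus/pci/devices/*"):
    with open(f"{dev}/class") as f:
        if not f.read().startswith("0x03"):  # 0x03xxxx = display controller
            continue
    with open(f"{dev}/resource") as f:
        for i, line in enumerate(f):
            if i > 5:                        # lines past BAR0-5 are ROM/bridge windows
                break
            start, end, _flags = (int(x, 16) for x in line.split())
            if end > start:                  # unused BARs read as all zeros
                size_mib = (end - start + 1) / 2**20
                print(f"{dev.rsplit('/', 1)[-1]} BAR{i}: {size_mib:.0f} MiB")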



  • Eudyptula
    replied
    Originally posted by RealNC View Post

    You are WRONG! It was $250

    (I know because I paid it.)
Damn, we were both wrong! G-Sync modules can cost up to $500 (the HDR variant)!
(According to a quick search returning articles claiming a $100-500 markup for the monitor OEM, depending on the module.)

    For a display input controller board. I bought a 1080p 280 Hz IPS-LCD monitor with VRR and "HDR"400 for around half of that.

    Nvidia is so modest.
    Last edited by Eudyptula; 08 January 2024, 05:18 PM.



  • pieman
    replied
    upscaling
    Originally posted by RealNC View Post
    Meh. Not really good enough for many modern games, but too expensive if you don't play games. And AMD is doing the Nvidia thing now where they use resolution upscaling and frame rate interpolation in their charts to show off how much FPS you're getting, completely hiding the actual performance of the GPU.

    I guess we can't have nice things anymore. You either get ripped off by AMD for mediocre performance, or you need to sell a kidney, an arm and a leg for a higher perf Nvidia card. Maybe Intel's future GPUs will bring something worth looking at.
I don't blame AMD, nor Nvidia for this. I blame the user base who demand upscaling and frame rate interpolation to such a degree that they are willing to pay an extra $100-$600 for a "reduce my resolution only to upscale it" button. Look at the Starfield drama over Bethesda only having FSR at release and not DLSS. I saw people with Nvidia 3080s and 3090s MAD that they couldn't enable DLSS at 1080p / 1440p... both of those cards are able to run Starfield at max settings at those resolutions, natively, with no need for upscaling. BUT PEOPLE WANTED IT, and it HAD TO BE DLSS!

It also doesn't help that you have reviewers, like Hardware Unboxed, stating "Nvidia is worth $200-$300 more than AMD for upscaling!" and praising $1,600-MSRP GPUs like the 4090 for having, AND NEEDING, upscaling. Paying $1,600 for a video card to get the $300-card experience of needing to reduce settings boggles my mind the most. The whole point of buying high-end cards, let alone flagships, was the no-compromise experience: you don't need to reduce anything; you can max out the game's settings and run it natively. And now you have people mad that they bought a $1,000 GPU and can't reduce their image settings by enabling upscaling... I don't get it, and no amount of mental gymnastics from people will ever make me get it. "But, but, my 4K!!!" Maybe it's too soon for 4K still. Maybe dropping $1,600 on a card that still can't do native 4K is a bad investment. If you are going to enable upscaling because, no matter what, you can't play native, then maybe just buy a cheaper card and get the same damn experience.

Upscaling, unfortunately, is here to stay, and with people clapping like seals for it, all it does is incentivize manufacturers to build their cards and set their tier classes around needing an upscaler enabled. I don't blame them; they don't have to worry about squeezing maximum performance out of the card anymore. Just reduce your image settings with this neat one-button click! Gamers Nexus approves, says it's better than native!!! And with 2020 showing that people are willing to sacrifice their firstborn for a video card, which enabled scalpers, people are more than willing to pay stupid prices for stupid features and stupid cards. As much as people whine about the prices, they still go out there and buy 4090s and 4080s. And 4070s.

