NVIDIA Announces The GeForce RTX 4060 Series


  • skeevy420
    replied
    Originally posted by user1 View Post
    Really absurd that the RTX 4060 will have 8GB of VRAM while the 3060 had 12GB. The energy efficiency improvements look nice, though.
    If I want to upgrade from my 4GB RX 580, it seems I'll have to wait longer, because even 8GB isn't enough in some cases these days and I'm not willing to pay more than $299 for a GPU.
    I upgraded from a 4GB RX 580 to a 12GB 6700 XT. The one I have is currently $319 new on Newegg; used for $260. I paid $345. Well, $400 if you include the three Noctua NF-P12s and cables I ended up getting because it's a huge fracking GPU that requires the airflow.

    Trust me on this -- splurge the extra $20. You won't regret it, especially if you have a well-cooled PC. The 6700 XT runs practically every game I own at 2K or 4K/60 with High/Ultra settings. I don't even bother with FSR or any of that anymore.

    I got tired of waiting for mid-range 7000 series GPUs from AMD and gaming at 1080p30-60. I'm very, very happy with my decision. I might be a bit salty when AMD finally does release their mid-range 7000 series, but I'll live.

    Also, the 6000 series is what's in the Steam Deck, PS5, and Xbox. That means the 6000 series will have very, very good LTS support because it's commercially necessary. RDNA2 is the next Polaris.



  • theriddick
    replied
    So much for their texture compression tech. Just slap more memory on them and call it a day, it seems.

    It's starting to look like the 50 series will be required to have better texture compression. Kind of getting silly, isn't it? Let's hope AMD can find a neutral solution that doesn't require everyone to buy new GPUs each generation to support some new critical feature or something...

    PS. I'm a 4090 owner.



  • KingKrouch
    replied
    Originally posted by avis View Post

    Better in raster, price, and having open source drivers; however, worse in:
    • Power efficiency (isn't that important in Germany?)
    • Insanely high idle power consumption for certain monitor configurations (likewise)
    • RTRT performance (likewise)
    • No DLSS (likewise)
    • No CUDA
    • No AI [features]
    Did you know that enabling DLSS and forcing VSync/a certain maximum refresh rate makes NVIDIA cards even more power efficient? I bet you didn't.

    It makes sense to sometimes think outside the cozy Open Source bubble/AMD fanboyism and admit that other companies can create better, though more expensive, products.
    The power consumption point compared to a 4080 is a good one, although you'd be stupid to run a game nowadays without VSync unless you're benchmarking or are a Counter-Strike player. Did you know that G-Sync goes completely haywire when you run one display at 144Hz and another at 60Hz? It makes all windows animate as if they're at 60Hz. Still not fixed by NVIDIA. Neither is their hilariously slow and bloated control panel, nor their godawful Linux drivers, which had me going back to Windows every single time I hit some nonsense that couldn't be worked around.

    Whether worse RT performance is a deal-breaker really depends. Plenty of PC games still don't support ray tracing, and the games that do come at a massive performance cost. If you are ultimately concerned about budget, you'd be fine with console-equivalent ray tracing specifications and faster rasterization for at least a few years. Unless it's Metro Exodus (where the ray tracing done each frame is very minimal, so a lot of light bounces are possible), most games with ray tracing come at enough of a performance impact that you may as well disable it. Most people I've talked to end up disabling RT anyway because they prefer a higher framerate.

    DLSS isn't really that big of a selling point unless you do close-ups to investigate image quality, and yet it's being used by certain AAA publishers as a crutch for bad optimization. It should only really be used in GPU-bound scenarios where you are pushing for 4K output or high framerates (120FPS or more). DLSS-to-FSR 2 wrappers exist for the games that don't support multiple options, and XeSS is also an option. DLSS Frame Generation still needs some work, and is essentially just sample-and-hold black frame insertion when you think about it: while the game itself may appear smoother (with some artifacts), it will still feel the same as playing at a lower framerate, which removes one of the reasons you'd play at a higher framerate to begin with. It's a neat thing to have, though, I'll at least say that, and I'm fearful of when AMD releases their own equivalent, because that's just going to encourage even lower-effort optimization on consoles.

    You do realize that you can run Stable Diffusion and image upscalers through ROCm and OpenCL now? Microsoft even offers its own DirectX-based API (DirectML) for compute and AI applications on Windows, and it's actually really easy to get Stable Diffusion running. You literally just made two talking points out of the same thing.
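    To illustrate, here is a minimal sketch of that workflow, assuming a ROCm build of PyTorch (which exposes the AMD GPU under the usual "cuda" device name) and Hugging Face's diffusers package; the model ID and prompt are only examples:

        # Minimal Stable Diffusion sketch; assumes PyTorch built for ROCm,
        # where the AMD GPU shows up as the usual "cuda" device.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",  # example model ID
            torch_dtype=torch.float16,
        )
        pipe = pipe.to("cuda")  # maps to the ROCm/HIP backend on AMD hardware

        image = pipe("a photo of an astronaut riding a horse").images[0]
        image.save("astronaut.png")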



  • Paradigm Shifter
    replied
    I'd actually be interested in a pair of 4060 Ti 16GB cards if a manufacturer releases a version using a standard 8-pin connector rather than the 12VHPWR. Why, just why, on a 140W card?

    Sure, the prices aren't great, but unfortunately I still need CUDA (because ROCm is still a joke), and I'd like a little more breathing room in the VRAM department than my 11GB 1080 Tis are giving me. As long as it's an appreciable increase in performance over those, I'm not going to complain, because everything since with more VRAM and appreciably higher performance has been... "affordable" (for a pair), but more than I want to spend out of my own pocket.



  • andyprough
    replied
    Originally posted by avis View Post
    Blaming NVIDIA for not releasing a complete open source stack for an obscure OS which has zero customers on the desktop is kinda irrational IMO.
    Slow down, birdie; I don't think nvidia and Microsoft could love you any more than they already do. You need to find a way to monetize all that big-tech proprietary love rather than spending your time here trolling a bunch of "zeroes".



  • blackiwid
    replied
    Originally posted by Melcar View Post
    Doubt AMD will release a 7600 with more than 8GB.
    I agree on that.

    Originally posted by Melcar View Post
    Despite AMD's grandstanding on the issue of VRAM, they, like nvidia, are more than happy to be stingy in this regard.
    Price will be the big factor here, and for these kinds of cards the target should really be $250-$300.
    Well, you don't seem to make the connection: the fact that they only ask $250-300 for an 8GB card is exactly what makes them less "stingy". Of course AMD will release 8GB graphics cards; you seem to get carried away by the names of the cards.
    AMD already admitted that the 7900 XTX only competes with an RTX 4080, so I'd say the 7900 XT competes with a 4070 Ti, and that is reflected in the pricing: Nvidia gives you a (what is it in the US, $700?) card with 12GB, and AMD one with 16GB. The same here: Nvidia releases a $400 RTX 4060 Ti, and AMD will give you an RX 7700 XT with 12GB for around $400.

    You understand that names are irrelevant? Your whole post would make no sense if they had named the 7900 XT(X) the 7800 XT(X) and shifted every other card down one number while changing nothing else.

    As you yourself say in the end, 8GB is not the problem; 8GB at what price is.



  • blackiwid
    replied
    Originally posted by avis View Post

    Almost everything in your post is untrue, partially untrue, or simply false, or you misrepresent your own links.

    I can give that right back to you. I understand that you don't want me to answer, because I can debunk you. I admitted that there is an advantage at Full HD (1080p); you strawmanned me as if I hadn't said that. I also explained, and people can judge for themselves, how many people want to buy a $700 or $800 graphics card (I don't know the exact US prices) to play at 1080p with it. Who invests that much money in a graphics card but still has a 10-year-old monitor? I'm sure such people exist, but they are a very, very small minority. I'm focusing on the other 99.99% of people with such a card, who own and want to use a higher-resolution display.

    But hey, if you want to play at 1080p and are willing to pay $700-800 for a graphics card, an RTX 4070 Ti is the greatest; in that case I'm sure I can also offer you a mouse I have that's worth $2000.

    Also, what does it matter how long DLSS 3.0 has already been out? For a person considering buying a graphics card now, what matters is how long they will have DLSS 3.0 but not FSR 3.0 if they choose an Nvidia card. If you can offer me a time machine so we can move back in time, that might be a great argument; until you can do that, this is no argument and no debunking of what I said.



  • Melcar
    replied
    Doubt AMD will release a 7600 with more than 8GB. The 7800 XT will have 16GB and the 7700 XT 12GB. Despite AMD's grandstanding on the issue of VRAM, they, like nvidia, are more than happy to be stingy in this regard. We are no longer in the days of Polaris GPUs, where the performance gap was substantial and they had to entice people with large VRAM pools.
    For a 1080p card, 8GB is enough these days and probably for the near future. If you want more, you go up a tier with more VRAM. 8GB should be the minimum, however, and standard on any card of this type. Price will be the big factor here, and for these kinds of cards the target should really be $250-$300.



  • tildearrow
    replied
    Originally posted by avis View Post
    The "problem" with NVIDIA is that their driver and user space are near perfect despite all the screams to the contrary here.
    Then tell me why x11grab does not work.
    Tell me why there is no API to record the screen.
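    (For context: x11grab is FFmpeg's X11 screen-capture input device. As a rough illustration of the kind of capture being asked about, here is a sketch via Python's subprocess module; the display name and geometry are placeholders:)

        # Hypothetical illustration: invoking FFmpeg's x11grab input device
        # to record an X11 display to a file.
        import subprocess

        subprocess.run([
            "ffmpeg",
            "-f", "x11grab",             # X11 screen-capture input
            "-video_size", "1920x1080",  # capture geometry (placeholder)
            "-framerate", "30",
            "-i", ":0.0",                # X display to record (placeholder)
            "capture.mp4",
        ], check=True)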



  • avis
    replied
    Originally posted by blackiwid View Post
    ....
    Almost everything in your post is untrue, partially untrue, or simply false, or you misrepresent your own links.

    Speaking of the computerbase.de article:

    [Attached image: rtrt.png, an RTRT benchmark chart from the computerbase.de article]

    Either the $800 4070 Ti beats the $900 7900 XT (cheaper with the same RTRT performance), or you're comparing a $1000 7900 XTX with an $800 4070 Ti, so choose your poison.

    I won't comment further, because the rest of the post is just more of the same, including "FSR 3.0 will be released soon and DLSS 3.0 has enjoyed just a month of exclusivity", neither of which is even remotely true. DLSS 3.0 has been out for at least half a year, and no one knows when FSR 3.0 is going to be released. According to the latest review from a pro-AMD YouTube channel, Hardware Unboxed, DLSS 2 handily beats FSR 2 at all resolutions and in all modes except 4K Quality, where they are roughly equal. And DLSS 3.0 adoption has been a smashing success.

    Buy AMD and be happy. I'm not a big fan of twisting the truth to fit an agenda of "AMD is better no matter what". I despise people who prefer companies over products. I despise people who have idols, no matter what or who they are.

    Be frank, say "I choose AMD because I believe in Open Source [drivers]", and leave the rest out. Thank you.

    Maybe I don't believe in Open Source drivers because I work closely with bugzilla.kernel.org and https://gitlab.freedesktop.org/drm/amd/-/issues, where I see how wonderful the AMD open source drivers are. Not a single person here except me touches these projects regularly.

    In anticipation of your further possible counter-arguments, please, don't. AMD is the greatest CPU/GPU company and NVIDIA and Intel both suck. I hope you're happy now. Have a nice evening.

    Last edited by avis; 18 May 2023, 06:27 PM.

