NVIDIA Announces The GeForce RTX 4060 Series


  • #41
    Originally posted by avis View Post
    Blaming NVIDIA for not releasing a complete open source stack for an obscure OS which has zero customers on the desktop is kinda irrational IMO.
    Slow down, birdie. I don't think NVIDIA and Microsoft could love you any more than they already do. You should be finding a way to monetize all that big-tech proprietary love rather than spending your time here trolling a bunch of "zeroes".



    • #42
      I'd actually be interested in a pair of 4060 Ti 16GB cards if a manufacturer releases a version using a standard 8-pin connector rather than the 12VHPWR. Why, just why, on a 140W card?

      Sure, the prices aren't great, but unfortunately I still need CUDA (because ROCm is still a joke) and I'd like a little more breathing room in the VRAM department than my 11GB 1080 Tis are giving me. As long as it's an appreciable increase in performance over those, I'm not going to complain, because everything released since with more VRAM and appreciably higher performance has been... "affordable" (for a pair), but still more than I want to spend out of my own pocket.
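      For what it's worth, stock PyTorch is enough to sanity-check what CUDA sees per card when you're juggling a pair like this; a minimal sketch using only the standard torch.cuda API:

      ```python
      import torch

      # List every CUDA device and its total VRAM; handy when comparing
      # a pair of 11GB 1080 Tis against prospective 16GB 4060 Tis.
      for i in range(torch.cuda.device_count()):
          props = torch.cuda.get_device_properties(i)
          print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB VRAM")
      ```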



      • #43
        Originally posted by avis View Post

        Better in raster, price and having open source drivers, however worse on several counts:
        • Worse power efficiency (isn't that important in Germany?)
        • Insanely high idle power consumption for certain monitor configurations (likewise)
        • Worse RTRT performance (likewise)
        • No DLSS (likewise)
        • No CUDA
        • No AI [features]
        Did you know that enabling DLSS and forcing VSync or a certain maximum refresh rate makes NVIDIA cards even more power efficient? I bet you didn't.

        It makes sense to sometimes think outside of the cozy open-source bubble/AMD fanboyism and admit that other companies can create better, though more expensive, products.
        The power consumption point compared to a 4080 is a good one. Although you'd be stupid to run a game nowadays without VSync unless you're benchmarking or are a Counter-Strike player. Did you know that GSync goes completely haywire when you run one display at 144Hz while the other runs at 60Hz? It makes all windows animate like they're at 60Hz. Still not fixed by NVIDIA. Neither is their hilariously slow and bloated control panel, or their godawful Linux drivers that had me going back to Windows every single time I hit some nonsense that couldn't be worked around.

        Whether worse RT performance is a deal-breaker really depends on your games. Plenty of PC games still don't support ray tracing, and the ones that do come at a massive performance cost. If you're ultimately concerned about budget, you'd be fine with console-equivalent ray-tracing capability and faster rasterization for at least a few years. Unless it's Metro Exodus (where ray tracing is done very minimally each frame, so a lot of light bounces are possible), most games with ray tracing take enough of a performance hit that you may as well disable it. Most people I've talked to end up disabling RT because they prefer a higher framerate.

        DLSS isn't really that big of a selling point unless you do close-ups to inspect image quality, and yet certain AAA publishers are using it as a crutch for bad optimization. It should really only be used in GPU-bound scenarios where you're pushing for 4K output or high framerates (120 FPS or more). DLSS-to-FSR 2 wrappers exist for games that don't support multiple options, and XeSS is also an option. DLSS Frame Generation still needs some work, and it's essentially just a sample-and-hold take on black frame insertion when you think about it: while the game may appear smoother (with some artifacts), it will still feel like playing at the lower framerate, which removes one of the reasons you'd play at a higher framerate to begin with. It's a neat thing to have, I'll at least say that, and I'm fearful of when AMD releases their own equivalent, because that's just going to encourage even lower-effort optimization on consoles.
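        To put rough numbers on that frame-generation point (a sketch; the 60/120 figures are made up for illustration, and I'm assuming one interpolated frame per rendered frame):

        ```python
        # Displayed smoothness improves, but input is only sampled per rendered frame.
        def frame_time_ms(fps: float) -> float:
            return 1000.0 / fps

        rendered_fps = 60     # what the GPU actually renders (assumed)
        displayed_fps = 120   # one generated frame inserted per rendered frame

        print(f"displayed frame time: {frame_time_ms(displayed_fps):.1f} ms")   # 8.3 ms
        print(f"input sampling cadence: {frame_time_ms(rendered_fps):.1f} ms")  # 16.7 ms, i.e. 60 FPS feel
        ```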

        You do realize that you can run Stable Diffusion and image upscalers through ROCm and OpenCL now? Microsoft even offers its own DirectX API for compute and AI applications on Windows, and it's actually really easy to get Stable Diffusion running. You literally just made two talking points out of the same thing.
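        For anyone curious, here's a minimal sketch of the Stable Diffusion route, assuming a ROCm build of PyTorch plus the Hugging Face diffusers package (the checkpoint name is just the common SD 1.5 one):

        ```python
        import torch
        from diffusers import StableDiffusionPipeline

        # On a ROCm build, PyTorch exposes the AMD GPU through the usual
        # "cuda" device string, so this is identical to the NVIDIA path.
        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",   # the common SD 1.5 checkpoint
            torch_dtype=torch.float16,          # roughly halves VRAM use
        )
        pipe = pipe.to("cuda")

        image = pipe("a lighthouse on a cliff at sunset").images[0]
        image.save("out.png")
        ```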



        • #44
          So much for their texture compression tech. Just slap more memory on them and call it a day, it seems.

          It's starting to look like the 50 series will be required to have better texture compression. It's getting kind of silly, isn't it? Let's hope AMD can find a neutral solution that doesn't require everyone to buy new GPUs each generation to support some new critical feature or something...

          PS. I'm a 4090 owner.



          • #45
            Originally posted by user1 View Post
            Really absurd that the RTX 4060 will have 8GB of VRAM while the 3060 had 12GB. The energy efficiency improvements look nice, though.
            If I want to upgrade from my 4GB RX 580, it seems I'll have to wait longer, because even 8GB isn't enough in some cases these days and I'm not willing to pay more than $299 for a GPU.
            I upgraded from a 4GB RX 580 to a 12GB 6700 XT. The one I have is currently $319 new on Newegg; used for $260. I paid $345. Well, $400 if you include the three Noctua NF-P12s and cables I ended up getting because it's a huge fracking GPU that requires the airflow.

            Trust me on this -- splurge the extra $20. You won't regret it. Especially so if you have a well-cooled PC. The 6700 XT runs practically every game I own on 2K or 4K 60 with High/Ultra settings. I don't even bother with FSR or any of that anymore.

            I got tired of waiting for mid-range 7000 series GPUs from AMD and gaming at 1080p30-60. I'm very, very happy with my decision. I might be a bit salty when AMD finally does release their mid-range 7000 series, but I'll live.

            Also, the 6000 series is what's in the Steam Deck, PS5, and Xbox. That means the 6000 series will have very, very good LTS support because it's commercially necessary. RDNA2 is the next Polaris.



            • #46
              Originally posted by skeevy420 View Post

              I upgraded from a 4GB RX 580 to a 12GB 6700 XT. The one I have is currently $319 new on Newegg; used for $260. I paid $345. Well, $400 if you include the three Noctua NF-P12s and cables I ended up getting because it's a huge fracking GPU that requires the airflow.

              Trust me on this -- splurge the extra $20. You won't regret it. Especially so if you have a well-cooled PC. The 6700 XT runs practically every game I own on 2K or 4K 60 with High/Ultra settings. I don't even bother with FSR or any of that anymore.

              I got tired of waiting for mid-range 7000 series GPUs from AMD and gaming at 1080p30-60. I'm very, very happy with my decision. I might be a bit salty when AMD finally does release their mid-range 7000 series, but I'll live.

              Also, the 6000 series is what's in the Steam Deck, PS5, and Xbox. That means the 6000 series will have very, very good LTS support because it's commercially necessary. RDNA2 is the next Polaris.
              Actually, I've already thought about upgrading to a 6700 XT, partly because it has a full PCIe x16 interface, so I won't be bottlenecked by the PCIe x8 link that unfortunately ships on every mid-range and lower GPU now (it's a problem for those of us with PCIe 3.0 motherboards).
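              The rough numbers behind that concern, using the standard per-lane rates (nothing card-specific assumed):

              ```python
              # PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding;
              # PCIe 4.0 doubles the rate. One byte moves per 8 transfers.
              def pcie_gbps(lanes: int, gts: float) -> float:
                  return lanes * gts * (128 / 130) / 8

              print(f"PCIe 3.0 x16: {pcie_gbps(16, 8.0):.1f} GB/s")  # ~15.8
              print(f"PCIe 3.0 x8:  {pcie_gbps(8, 8.0):.1f} GB/s")   # ~7.9, what an x8 card gets here
              print(f"PCIe 4.0 x8:  {pcie_gbps(8, 16.0):.1f} GB/s")  # ~15.8, why x8 is fine on 4.0 boards
              ```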
              Last edited by user1; 19 May 2023, 02:45 AM.



              • #47
                Originally posted by andyprough View Post

                I don't think NVIDIA and Microsoft could love you any more than they already do. You should be finding a way to monetize all that big-tech proprietary love rather than spending your time here trolling a bunch of "zeroes".
                You think NVIDIA and Microsoft care about Linux? An OS with no desktop presence/market, which doesn't even have a functional, featureful display manager in 2023? Tell me how to enable HDR in Linux. Tell me how to enable display scaling with a single command (one that works both for graphics output and the console). Tell me how to install any random application from the net without compiling/chrooting/virtualization. You know, something akin to downloading an exe and launching it.

                You really could slow down with your fanaticism and the belief that enemies and shills are all around. Otherwise, welcome to Iran/North Korea/Russia - very nice countries which talk and act exactly like you do. No one in the world cares about them, yet inside them you'll find that the entire world is your enemy.

                If you believe I'm shilling for proprietary companies, you need to reexamine your logical skills, or your lack of them. Windows is the desktop OS. NVIDIA has so far had the best graphics drivers (in terms of features, reliability and performance) for its products. If you hate facts, well, that's your choice. However, twisting facts to make me a shill is kinda insincere, dishonest and scummy.



                • #48
                  Originally posted by avis View Post
                  However, twisting facts to make me a shill is kinda insincere, dishonest and scummy.
                  The pot calling the kettle black ***


                  *** is a proverbial idiom, possibly of Spanish origin, of which English versions began to appear in the first half of the 17th century. It describes a situation in which somebody accuses someone else of a fault which the accuser shares, and is therefore an example of psychological projection, or hypocrisy.




                  • #49
                    Originally posted by avis View Post

                    And some people dare call me an NVIDIA/Intel shill. What a disgrace to intelligence/common sense.
                    A disgrace to intelligence/common sense is your implication that a single instance of critique can absolve someone of being generally biased. For example, I know several BMW fanboys who acknowledge specific problems with BMW products, but they are still biased fanboys nevertheless.

                    You are "defending" Intel, NVIDIA and Microsoft in this forum quite regularly. Furthermore, most of the time you don't even provide any decent argument, but thrown ad-homs/stereotypes regarding open source enthusiasts instead, focusing on how they do not comply with your personal standards of how people should treat companies and products. For one reason or another you are obviously biased in your comments, because of your internal need to defend and rationalize specific brands.



                    • #50
                      I work with genome data, and I could sure use this 4060 Ti 16GB version. I don't want to pay too much for the L4000 or L5000 versions since they don't offer more for what I need (NVIDIA Parabricks). vGPU isn't supported anyway, so any desktop GPU is more than enough as long as it has at least 16GB of VRAM. I might try picking up 2 or 4 of them, depending on the price, and sticking them on an X299 or TRX40 motherboard. 25 minutes per genome analysis is sure a good thing (it takes 24 hours on CPUs).
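                      Back-of-the-envelope throughput from those numbers, assuming each card can process a separate genome independently (which may not hold for every Parabricks pipeline):

                      ```python
                      GPU_MIN_PER_GENOME = 25        # claimed GPU runtime per genome
                      CPU_MIN_PER_GENOME = 24 * 60   # claimed CPU runtime per genome

                      print(f"speedup per genome: ~{CPU_MIN_PER_GENOME / GPU_MIN_PER_GENOME:.0f}x")  # ~58x

                      # Genomes per day if each GPU chews through its own sample in parallel.
                      for gpus in (2, 4):
                          per_day = gpus * (24 * 60) / GPU_MIN_PER_GENOME
                          print(f"{gpus} GPUs: ~{per_day:.0f} genomes/day")
                      ```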
