NVIDIA Announces The GeForce RTX 4060 Series


  • #51
    Originally posted by drakonas777 View Post
    You are "defending" Intel, NVIDIA and Microsoft in this forum quite regularly.
    Citation needed. I've been accused of that a hundred times already without a single shred of proof.

    Also "defending" is a very strong word. Being rational about companies and products is not "defending" them.



    • #52
      Originally posted by avis View Post

      For a 4050 Ti, or at most a 4060, maybe.

      For the 4060 Ti 16 GB at $500 they look atrocious. This card is just 15%(!) faster in raster than the 3060 Ti, released two years ago at $400. That's the worst/lowest generational uplift from NVIDIA in roughly two decades: for 25% more money you get 15% more performance, after a jump of two full process nodes. This is a 122 mm² chip that should never have been called an XX60 Ti. That's a spit in the face. That's a rip-off.

      The company is riding the DLSS 3.0 frame-generation horse, but all I can say is: fuck it, I don't need artificially generated frames with increased latency/input lag. And don't get me started on the fact that this is a 128-bit, PCIe 4.0 x8 GPU. Everything about it screams 4050/4050 Ti.

      And don't get me started on how, thanks to its limited bandwidth, it will suck at 1440p/4K.

      This is not a 4060 Ti; this is a 4050 Ti in disguise, sold at twice what its actual performance is worth.

      It's too sad that AMD doesn't want to intervene or compete. And some people dare to call me an NVIDIA/Intel shill. What a disgrace to intelligence/common sense.
      Anyone remember the Fermi days, when GF100 was the flagship (and asked only $500) and GF104 was the sweet-spot x60 Ti? Ever since GK104 became the GTX 680, that ship has sailed and never came back. Sigh.
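
      Quick sanity check on that price/performance math, as a minimal Python sketch (assuming the $400 and $500 MSRPs and the ~15% raster uplift quoted above; the uplift figure is the poster's claim, not a benchmark):

          # Perf-per-dollar for the quoted MSRPs and raster uplift.
          msrp_3060ti, msrp_4060ti = 400, 500
          raster_uplift = 1.15                              # 4060 Ti vs. 3060 Ti

          price_increase = msrp_4060ti / msrp_3060ti        # 1.25 -> +25%
          perf_per_dollar = raster_uplift / price_increase  # 1.15 / 1.25 = 0.92

          print(f"price: {price_increase - 1:+.0%}, perf: {raster_uplift - 1:+.0%}, "
                f"perf/$: {perf_per_dollar - 1:+.0%}")      # perf/$ comes out to -8%

      In other words, under those assumptions, performance per dollar actually drops about 8% generation over generation.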
      Last edited by lilunxm12; 19 May 2023, 05:56 AM.



      • #53
        Originally posted by lilunxm12 View Post

        Anyone remember the Fermi days, when GF100 was the flagship (and asked only $500) and GF104 was the sweet-spot x60 Ti? Ever since GK104 became the GTX 680, that ship has sailed and never came back. Sigh.
        You can "thank" the crypto bubble for that.



        • #54
          Well, I'm curious to see how the 780M will fare; with DDR5 there should be north of 100 GB/s of bandwidth.

          Some SFF system based on that could be interesting for emulators and older games (ones that don't use Steam DRM, of course). Discrete graphics aren't feasible outside professional use at these prices. For gaming, consoles are a far better deal than Wintel or Steam.
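
          For the bandwidth claim, a rough Python sketch (the DDR5-5600/6400 dual-channel configs are assumptions about typical systems; the 18 Gbps GDDR6 figure is the 4060 Ti's published spec):

              # Theoretical peak bandwidth in GB/s: (bus width in bits / 8) * GT/s.
              def bandwidth_gbs(bus_bits: int, gigatransfers: float) -> float:
                  return bus_bits / 8 * gigatransfers

              print(bandwidth_gbs(128, 6.4))   # dual-channel DDR5-6400: 102.4 GB/s
              print(bandwidth_gbs(128, 5.6))   # dual-channel DDR5-5600:  89.6 GB/s
              print(bandwidth_gbs(128, 18.0))  # 4060 Ti, 128-bit GDDR6 @ 18 Gbps: 288 GB/s

          So "north of 100 GB/s" only holds at the faster end of DDR5/LPDDR5 configurations.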



          • #55
            Originally posted by pinguinpc View Post

            If it launches at no more than $300 for 16 GB, that would be good, because the RX 6700/6700 XT now sit close to $300 and have more horsepower on paper than the RX 6600/7600.

            Personally, if AMD gave manufacturers some freedom, an RX 6700 (non-XT) with 20 GB of RAM at $400 (or an RX 6700 XT with 24 GB) would be very interesting.


            They could probably get away with a $350-400 MSRP for a 7600 XT 16 GB.

            People have been worried about the 7600/7600 XT 8 GB MSRP being as high as what the 6700/6700 XT are selling for now (new, in the USA), which is currently about $280-330 on sale, meaning there is no point in waiting for these new cards to launch. Just grab a 6700 10 GB or a 6700 XT 12 GB.

            Even if a 7600 XT does not match the 6700 XT in average performance, a 16 GB version would have more VRAM, better ray tracing, better efficiency, AV1 encode, etc.

            On top of the pricing of existing RDNA2 cards, Nvidia's pricing is now official, and it's pretty high. AMD is willing to undercut Nvidia, as they should, but by how much?



            • #56
              Originally posted by Melcar View Post
              For a 1080p card 8GB is enough these days and probably for the near future. If you want more then you go to a higher tier with more VRAM. 8GB should be the minimum however, and this should be standard on any card of this type. Price will be the big factor here and for these kind of cards the target should really be $250-$300.
              Unless you want to play the RE4 remake with all settings maxed out at 1080p, which eats up 12 GB of VRAM and still wants more:

              [Embedded video description: "In this video I want to prove that you can't max out the game's graphical settings even if you have a GPU with 12GB of VRAM. Even at 1080p this is not possib..."]


              The game is very impressive to see in action, but the high-quality textures and all the visual effects eat up VRAM like it's going out of style.

              To a lesser extent, RE Village is the same way, and there is one other game that also loves VRAM, but I can't remember the name.

              This may be a sign of things to come, with games demanding more and more VRAM, and honestly that may be a good thing.

              Cards have been focusing on using faster and faster RAM, and game engines on more aggressive AA and AF modes, but the best thing for visual quality is higher-quality textures, and those require a lot of VRAM.

              I would favor a shift in PC gaming where games simply focus on higher-quality textures and cards use slower RAM that is cheaper and less power-hungry, but add gobs of it.
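
              To put rough numbers on how quickly textures eat VRAM, a back-of-the-envelope Python sketch (the 4096x4096 size and the formats are illustrative assumptions, not taken from any particular game):

                  # VRAM cost of one texture with a full mip chain (~4/3 overhead).
                  def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
                      return width * height * bytes_per_texel * 4 / 3 / 2**20

                  print(texture_mib(4096, 4096, 4.0))  # uncompressed RGBA8: ~85 MiB
                  print(texture_mib(4096, 4096, 1.0))  # BC7, 1 byte/texel:  ~21 MiB

              A few hundred textures like that resident at once is already multiple gigabytes, which is how "high quality textures" translates into VRAM pressure.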



              • #57
                Originally posted by sophisticles View Post

                I would favor a shift in PC gaming where games simply focus on higher-quality textures and cards use slower RAM that is cheaper and less power-hungry, but add gobs of it.
                That might not work, since you still need fast RAM for the heavy effects: ray tracing, multiple rendering passes, etc. BUT finally using unified memory between GPU and CPU, with an on-board GPU "cache" (for the framebuffer and render targets), might work. Current games just underutilize system memory.
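
                As a rough feasibility check on that on-board "cache" idea, a Python sketch of the hot framebuffer/render-target footprint at 4K (the target counts and formats are illustrative, G-buffer-style assumptions, not from any real engine):

                    # Approximate size of the render targets that would stay on-board.
                    def target_mib(w: int, h: int, bytes_per_pixel: int) -> float:
                        return w * h * bytes_per_pixel / 2**20

                    w, h = 3840, 2160                      # 4K output
                    swapchain = 3 * target_mib(w, h, 4)    # triple-buffered RGBA8
                    gbuffer   = 4 * target_mib(w, h, 8)    # four RGBA16F render targets
                    depth     = target_mib(w, h, 4)        # 32-bit depth/stencil
                    print(swapchain + gbuffer + depth)     # ~380 MiB total

                Under those assumptions, a few hundred MiB of fast on-board memory would cover the hot targets, with textures and geometry living in unified system memory.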



                • #58
                  Wow, this is really garbage from Nvidia (or NGreedia, as I call it): such a small uplift, with a crippled 8 GB of VRAM, which is not enough in 2023 even on a 1080p card.

                  Just the other day I was playing Forza Horizon 5 on the highest Extreme graphics preset with my 5700 XT (paired with a 5950X) at 1080p, and while performance was great at a smooth 75 fps even in the most demanding scenes, the VRAM was full, with 8000+ MB in use. This is a 2019 card that I have enjoyed for many years and that still serves my 1080p gaming needs perfectly; however, it is clear that soon I will have to start turning settings down from the highest Ultra presets.

                  I would have to be some deranged lunatic to get a GPU now with only 8 GB: 16 GB at least, and preferably 24 if the monitor is upgraded to 4K.
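
                  If you want to reproduce that VRAM readout on Linux with the amdgpu driver, a minimal Python sketch (the card0 index is an assumption; it varies between systems):

                      # Read current VRAM usage from the amdgpu sysfs node (value is in bytes).
                      from pathlib import Path

                      node = Path("/sys/class/drm/card0/device/mem_info_vram_used")
                      used_mib = int(node.read_text()) / 2**20
                      print(f"VRAM in use: {used_mib:.0f} MiB")  # 8000+ MiB fills an 8 GB card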



                  • #59
                    Originally posted by avis View Post

                    You think NVIDIA and Microsoft care about Linux?
                    Actually, yes. Most of Nvidia's professional hardware runs on Linux systems. And Microsoft cares so much about Linux that they integrate it directly into their operating system.

                    Originally posted by avis View Post
                    Tell me how to enable HDR in Linux. Tell me how to enable display scaling with a single command (one that works for both graphical output and the console). Tell me how to install any random application from the net without compiling/chrooting/virtualization - you know, something akin to downloading an exe and launching it.
                    It is really amusing how clueless you are. It seems as if you last ran Linux a decade ago.
                    Last edited by oleid; 19 May 2023, 04:52 PM.



                    • #60
                      Originally posted by avis View Post
                      Citation needed. I've been accused of that a hundred times already without a single shred of proof.
                      For example, one of your latest posts among the recent comments:

                      Originally posted by avis
                      You cannot say good things about NVIDIA on Phoronix. It's an evil, scummy company, and only AMD should be mentioned here.


                      This kind of non-argument fanboy shit-posting is practically your calling card. Also, you do this very early in the comment thread and then complain that somebody dared to answer you with another unserious post, because, you know, people should meet these kinds of garbage takes with respect and seriousness. Fucking LOL

