NVIDIA Releases The GeForce GTX 1650 At $149 USD, Linux Benchmarks Incoming


    Phoronix: NVIDIA Releases The GeForce GTX 1650 At $149 USD, Linux Benchmarks Incoming

    Coming in a step below last month's GeForce GTX 1660, the GeForce GTX 1650 is an RTX-less Turing graphics card arriving at the $149 USD price point...

    http://www.phoronix.com/scan.php?pag...TX-1650-Launch

  • #2
    Some bizarro SKUs out there, like this *triple*-slot 1650. Ugh, why???

    https://www.newegg.com/Product/Produ...-443-_-Product



    • #3
      Originally posted by torsionbar28:
      Some bizarro SKUs out there, like this *triple*-slot 1650. Ugh, why???

      https://www.newegg.com/Product/Produ...-443-_-Product
      Some time ago I posted an image of an RX 550 with a large dual-fan cooler of the kind used on cards like the RTX 2080 Ti. Someone speculated that they may be trying to avoid developing a new cooling solution just for a low-end, slim-profit-margin card. It may also help market it to immature kids.

      BTW, those ITX triple-slot EVGA cards were designed by a monkey that has never seen a compact ITX case.



      • #4
        The amount of VRAM on that card looks like an intentional bottleneck, just like most of the Turing lineup in general. Nvidia has gotten pretty greedy, and it seems like they want their cards to become obsolete the moment the next generation comes out.



        • #5
          Originally posted by torsionbar28:
          Some bizarro SKUs out there, like this *triple*-slot 1650. Ugh, why???

          https://www.newegg.com/Product/Produ...-443-_-Product
          And too many dual-fan cards.



          • #6
            Originally posted by torsionbar28:
            Some bizarro SKUs out there, like this *triple*-slot 1650. Ugh, why???
            Yep, that's pretty stupid. This GPU could operate just fine with a single-slot cooler, and would actually be pretty appealing if they made such a thing.



            • #7
              Interesting. I'm eagerly waiting for the AMD response; this could maybe compete with the Navi 3060, given the price tag? Or maybe the 2650 will be the runner-up?

              The 1650 is already lagging a lot behind the RX 580, and the 3060 should have about the same performance. If the 2650 is just a refresh, AMD will win there, because the Navi 3060 will have a price tag of $130 USD.

              This is starting to look a lot like when the first-generation Ryzens launched and forced Intel, after a while, to push prices down. It took some months, but all of Intel's prices dropped considerably; they simply couldn't compete.
              Last edited by Kayote; 04-23-2019, 12:23 PM.



              • #8
                The AMD response to this is the cheaper 4GB RX 570 (I checked today: these are £125 now, while the 1650 is £140+), which outperforms the card by a bit.
                The 4GB RX 580 is in some cases price-competitive with some of the overclocked 1650 SKUs as well.

                The downside is the higher power consumption from the older design, and therefore probably the higher fan noise.

                The upside is excellent open source support from AMD, as well as the additional performance.



                • #9
                  Originally posted by DoMiNeLa10:
                  The amount of VRAM on that card looks like an intentional bottleneck, just like most of the Turing lineup in general. Nvidia has gotten pretty greedy, and it seems like they want their cards to become obsolete the moment the next generation comes out.
                  With the high-res textures that newer games have, even 1080p can require more than 4GB. Although for those games, you could always use "Medium" settings and get good performance.



                  • #10
                    Originally posted by DoMiNeLa10:
                    The amount of VRAM on that card looks like an intentional bottleneck, just like most of the Turing lineup in general. Nvidia has gotten pretty greedy, and it seems like they want their cards to become obsolete the moment the next generation comes out.
                    Nvidia has an inventory backlog problem at the moment, and that is what is pushing prices down.

                    They over-forecasted cryptocurrency demand, and when the mining boom started drying up, sales fell to the basement.

                    But they had already accelerated their product release schedule based on that demand cycle, which is why Turing seems to have come out so fast.

                    It's also why AMD looks like it's behind: they didn't throw in the proverbial marbles to keep up with Nvidia's irrational release cycle (some would call it an arms race).

                    In short, what goes up must come down.

