GeForce GTX 1660 Ti Launch Today - Supported By The NVIDIA Linux Driver, No Nouveau Yet

  • #11
    Originally posted by starshipeleven View Post
    I guess they all think they won't sell it unless it looks GAMING. Which means slapping a completely oversized heatsink even on low-end parts like an RX 550 and wasting as much board space as possible to make it look like a bigger card.

    Meanwhile, both the XFX single-slot card and the XFX fanless card (both AMD cards) sold out within a month.
    Look at it. Don't tell me that didn't make you want it more.




    • #12
      Originally posted by M@GOid View Post
      Yeah, I noticed that, even on low-end Nvidia cards.
      Yeah, but it's not as prevalent. For example, if I search for a GT 1030 I do find SOME fake gaming cards, but most are low-profile, slim, or fanless small cards (which is what a 1030 is supposed to be).

      Meanwhile, no fucking way for an RX 550 to get the same treatment. Only a couple of vendors (MSI and XFX) provide a low-profile card with a heatsink that makes sense for the card's actual power draw.



      • #13
        Originally posted by starshipeleven View Post
        Yeah, but it's not as prevalent. For example, if I search for a GT 1030 I do find SOME fake gaming cards, but most are low-profile, slim, or fanless small cards (which is what a 1030 is supposed to be).

        Meanwhile, no fucking way for an RX 550 to get the same treatment. Only a couple of vendors (MSI and XFX) provide a low-profile card with a heatsink that makes sense for the card's actual power draw.
        30W vs. 50W is a huge difference, proportionally like 300W vs. 500W.

        Not to mention the (in)famous GT 1030 DDR4 models, which are rated at 20W. One could passively cool 20W, but 50W is actually serious heat.

        Again, I will put it like 200W vs. 500W, just to make the difference easier to notice.
        Last edited by dungeon; 22 February 2019, 11:51 AM.



        • #14
          Originally posted by dungeon View Post

          30W vs. 50W is a huge difference, proportionally like 300W vs. 500W.

          Not to mention the (in)famous GT 1030 DDR4 models, which are rated at 20W. One could passively cool 20W, but 50W is actually serious heat.



          • #15
            The secret to Vega 56 performance was revealed with the Vega VII: just overclock your HBM a couple of notches or so without overvolting. An extra 200 GB/s of bandwidth will cover the problems that are not solved in software. Second, undervolt to 1040 mV max for 14nm and test what frequency you can get with that; 7nm can probably go about 100 mV lower. Set up that way, my Vega 56 gives 10.5 effective Teraflops at 160 watts and blasts the 1080-2070 range to pieces.
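
            For anyone wanting to try the same thing on Linux, a rough sketch using the amdgpu driver's pp_od_clk_voltage sysfs interface could look like the Python below. Treat it as an illustration only: OverDrive has to be enabled via amdgpu.ppfeaturemask first, and the state indices and clock/voltage numbers are placeholders that each chip needs tuned individually.

                # Minimal sketch, assuming Linux amdgpu with OverDrive enabled and root access.
                # The state indices and MHz/mV values are illustrative, not known-safe defaults.
                from pathlib import Path

                OD_PATH = Path("/sys/class/drm/card0/device/pp_od_clk_voltage")

                def od_write(cmd: str) -> None:
                    """Write one OverDrive command: 's'/'m' <state> <MHz> <mV>, or 'c' to commit."""
                    OD_PATH.write_text(cmd + "\n")

                print(OD_PATH.read_text())   # current sclk/mclk tables plus the allowed ranges

                od_write("s 7 1590 1040")    # top core state: keep the clock, cap voltage at 1040 mV
                od_write("m 3 945 1040")     # top HBM state: modest memory overclock (placeholder)
                od_write("c")                # commit the edited table to the GPU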

            Also, forget about Nvidia: this noisy ray garbage cannot be considered real-time interactive raytracing; they are only missing a Petaflop, after all. Same thing with DLSS, it is for semi-blind people who cannot distinguish an orange from a lemon.
            Last edited by artivision; 22 February 2019, 12:07 PM.



            • #16
              Originally posted by M@GOid View Post

              Look at it. Don't tell me that didn't make you want it more.

              That is what happens with mass production: it is cheaper to put on whatever is already available than to invent something else.



              • #17
                Proprietary graphics card drivers vs. huge power draw. Hrm.



                • #18
                  Originally posted by Apparition B5 View Post
                  huge power draw. Hrm.
                  AMD cards are very flexible in this regard. The factory voltages/clocks are simply set too high, and they often benefit from tuning.

                  Furthermore, did you consider idle power draw?
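
                  If you want to check that yourself on Linux, a quick sketch reading the hwmon sensor exposed by amdgpu could look like this (the card index and hwmon path are assumptions about the setup; power1_average is the driver's averaged reading in microwatts):

                      # Minimal sketch, assuming the amdgpu driver and the GPU at /sys/class/drm/card0.
                      import glob
                      import time

                      def gpu_power_watts() -> float:
                          """Average GPU power draw from amdgpu's hwmon interface, in watts."""
                          # power1_average is in microwatts; the hwmon index varies, so glob for it.
                          path = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")[0]
                          with open(path) as f:
                              return int(f.read()) / 1_000_000

                      # Sample for ten seconds while sitting at an idle desktop.
                      samples = []
                      for _ in range(10):
                          samples.append(gpu_power_watts())
                          time.sleep(1)
                      print(f"idle draw: ~{sum(samples) / len(samples):.1f} W")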



                  • #19
                    Originally posted by Apparition B5 View Post
                    Proprietary graphics card drivers vs. huge power draw. Hrm.
                    As others have pointed out, AMD seems to play it very safe with the voltages so it can sell as much as possible of the worst-performing silicon from a given wafer. That leaves a lot of potential on the table if you've got one of the better chips. At least in my own experience on Windows with an RX 570, an RX 580 and a Vega 56 (with FreeSync enabled up to 75 Hz), the Vega 56 was the most efficient of the three while playing my Battlefield 1 test level, each card at its maximum stable undervolting settings. Reviewers all test these cards without VSYNC, pushed to their limit, which is rarely representative of real-life day-to-day gaming anyway (it depends on your usage, though; 144 Hz displays and high resolutions do push these cards to their limits).



                    • #20
                      Originally posted by Apparition B5 View Post
                      Proprietary graphics card drivers vs. huge power draw. Hrm.
                      Yeah, that's what it came down to for me when I made my last GPU purchase. I wish Polaris had been around when I bought my GTX 950.
                      I also wish Nvidia would RELEASE THE FUCKING FIRMWARE they pledged, instead of hindering the nouveau devs. (You know the situation is bad when I go all caps and swear.)
