NVIDIA Launches GeForce GTX 470/480


  • #21
    Originally posted by deanjo View Post
    lol, 256 watts max at full crunching. Amazing how you can forget Charlie's "from reliable sources". Just go back and read his failed "out of his ass" predictions.

    oh so wrong, my little desperate fanboi.

    NVIDIA SuperSonic demo: 262.70 W
    3DMark06: 283.16 W
    Heaven 2: 242.60 W
    Stalker CoP: 239.86 W

    and 317 W in FurMark.

    Add insult to injury: 112 W for Blu-ray playback and 112 W with dual screens.

    Nvidia fucked up. Just like Charlie D predicted.

    Comment


    • #22
      oh, and those guys measured the card, not the system. So there is no magic "bad PSU" or "the CPU needs power too" argument.

      Comment


      • #23
        Originally posted by n0nsense View Post
        I'm not a fanboy of any particular brand, but wtf is your problem, people?
        The cards are out and, generally, both the 480 and the 470 perform better than the competition. The Fermi architecture is "revolutionary" versus the competition's "evolutionary".
        I fail to see what's so revolutionary about it. So they used more transistors and are drawing more watts to get higher performance. It doesn't even come close to being competitive with Evergreen in overall power consumption, or from a performance-per-watt perspective.

        I don't know whether this is due to incompetence or they just don't care about power consumption. But I hope one day they get their act together and are able to compete with ATI/AMD not just on raw performance but also on power consumption. Because I think competition is a good thing.
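        The performance-per-watt argument is just a ratio. A minimal sketch to make it concrete; every number below is a hypothetical placeholder, not a benchmark result:

```python
# Performance per watt is delivered performance divided by board power.
# All figures here are hypothetical placeholders, NOT measurements.

def perf_per_watt(fps: float, board_watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / board_watts

# Hypothetical cards: (average fps, board power in watts)
card_a = perf_per_watt(60.0, 250.0)   # bigger, hotter chip
card_b = perf_per_watt(55.0, 188.0)   # smaller, cooler chip

# Card B can be "slower" in raw fps yet clearly ahead per watt.
print(f"card A: {card_a:.3f} fps/W")  # 0.240 fps/W
print(f"card B: {card_b:.3f} fps/W")  # 0.293 fps/W
```

        A card that wins raw benchmarks can still lose this ratio badly, which is the whole point of the complaint.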

        Comment


        • #24
          Originally posted by rohcQaH View Post
          Any article about Fermi that doesn't even mention the huge power consumption looks fishy. Sure, it may be a few percent faster than the 5870, but that's hardly useful if it roars like a bear and melts my PC.

          There have been jokes about cooking eggs on your GPU for ages, but this is the first one that actually gets hot enough to boil water.
          Fermi's fan uses more power than my main machine's CPU. Even more than this P4 Northwood's CPU. When you use a 21-watt fan and still can't keep the card under 95°C, it's time to give up.

          Comment


          • #25
            Originally posted by energyman View Post
            oh, and those guys measured the card, not the system. So there is no magic "bad PSU" or "the CPU needs power too" argument.
            Besides, a bunch of reviewers have swapped out an HD5870 for a GTX480 in the same testbed and seen a ~100 W increase in power draw. The HD5870's official max board power is 188 W, and 188 + 100 != 250 by a long shot, so it seems someone is fudging numbers.

            Judging from the reviews I've seen so far, the GTX480 seems to be roughly level with the GTX295 in price, performance (tessellation and GPGPU enhancements notwithstanding), and power consumption.

            (Also, availability.)
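            That sanity check can be written out explicitly. A back-of-the-envelope sketch using only the figures quoted above (note the ~100 W delta is measured at system level and ignores PSU efficiency losses, which would shrink it somewhat):

```python
# Back-of-the-envelope check of the quoted figures: swapping an HD5870
# for a GTX480 in the same testbed reportedly adds ~100 W of draw.

HD5870_BOARD_POWER_W = 188.0   # AMD's official max board power
MEASURED_DELTA_W = 100.0       # reported system-level increase
GTX480_CLAIMED_TDP_W = 250.0   # Nvidia's announced TDP

implied_gtx480_w = HD5870_BOARD_POWER_W + MEASURED_DELTA_W
excess_w = implied_gtx480_w - GTX480_CLAIMED_TDP_W

print(f"implied GTX480 board power: {implied_gtx480_w:.0f} W")  # 288 W
print(f"excess over claimed TDP:    {excess_w:.0f} W")          # 38 W
```

            Even with generous allowances for measurement error, the implied board power sits well above the claimed 250 W TDP, which is exactly the discrepancy hardware.fr's answer from Nvidia later explained.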

            Comment


            • #26
              Originally posted by monraaf View Post
              I fail to see what's so revolutionary about it. So they used more transistors and are drawing more watts to get higher performance. It doesn't even come close to being competitive with Evergreen in overall power consumption, or from a performance-per-watt perspective.

              I don't know whether this is due to incompetence or they just don't care about power consumption. But I hope one day they get their act together and are able to compete with ATI/AMD not just on raw performance but also on power consumption. Because I think competition is a good thing.
              MIMD is the revolution. This is the next step, the one Intel attempted with Larrabee.
              They couldn't make a small enough, cool enough chip; Nvidia did.
              It's still a much more interesting card than anything from ATI right now.
              Anyway, just as with the G80 launch, everyone cried that it was huge, hot, unprofitable, and whatnot, but in the end that chip left ATI in the dust for a few years.

              Just for the record, I'm using an HD4850 right now. Since I use Linux, and prefer gaming on Linux when possible, Nvidia has proved the much better choice. ATI's OpenGL performance is far from good: the Heaven benchmark on my system shows a ~30% performance difference between DX11 and OpenGL on 64-bit Windows 7.

              The 5000 series uses the same architecture as the 4000, just improved, whereas Nvidia's Fermi is completely different from the previous 200 series. Actually, the 200 was to the G92 what the 5000 is to the 4000: improved, with double of everything.

              Comment


              • #27
                This might explain the power discrepancies:

                Originally posted by hardware.fr
                Annoncée avec un TDP record de 250W par Nvidia, la GeForce GTX 480 est bel et bien la carte mono-GPU la plus gourmande. Nous obtenons même des valeurs supérieures à ce TDP dans nos mesures. Nous avons bien entendu questionné Nvidia à ce niveau et le fabricant nous a répondu qu'en fait le TDP donné était la consommation maximale observée en jeu et non la consommation maximale de la carte.
                Rough translation:

                Announced by Nvidia with a record TDP of 250 W, the GeForce GTX 480 is indeed the most power-hungry single-GPU card. We even obtained values above this TDP in our measurements. We naturally asked Nvidia about this, and the manufacturer responded that the stated TDP is in fact the maximum consumption observed in games, not the maximum consumption of the card.

                Comment


                • #28
                  No, you just fell for their marketing speak. MIMD is not "the revolution". It is not new, and certainly not new in GPUs. Fermi is just the logical next step for Nvidia: nothing revolutionary about it, just another increase in shaders, size, and power usage.

                  The HD4k, now that was revolutionary. Fermi? Not.

                  Comment


                  • #29
                    Saying Fermi is a MIMD is like saying Sasha Grey is a virgin.

                    Comment


                    • #30
                      Originally posted by Alex_V View Post
                      Saying Fermi is a MIMD is like saying Sasha Grey is a virgin.
                      Sasha Grey IS a virgin! She never... uhm... said "no"

                      Back to topic:
                      After buying my 3870 shortly after AMD announced they would open up their specs, and after years of disappointments (stock price, Phenom I) and waiting (for bug-free drivers), it seems AMD has found their way!

                      They really hurt Nvidia with a design delivered six months earlier (I think thanks to the 4770), drivers are getting better (as far as one reads here), Intel paid (pocket) money, the future looks better with GlobalFoundries as a partner, and there are months of weak competition from Nvidia (cheap OEM cards with DirectX 11, anyone?). If Bulldozer starts to hurt Intel too, I have hope my stocks will return to the heights of the Athlon X2 days

                      GZ AMD!
                      (@ bridgeman: If you know the hardware engineers: give 'em a hug from a stock owner )

                      Btw:
                      For anyone interested, a nice story:

                      Comment
