NVIDIA Announces New TITAN X Card With 12 Billion Transistors, 11 TFLOPS Compute


  • #31
    Originally posted by devius View Post

    That seems to be where it's headed. More and more stuff is getting integrated into the CPU as time goes by, so it won't be that long before stand-alone graphics cards are a thing of the past.
    It doesn't even deserve to be called a CPU anymore, but Intel likes to keep calling it that. APU sounds more correct, as it is not just a CPU... in both cases it is a collection of whatever can possibly fit in it.
    Last edited by dungeon; 22 July 2016, 10:06 AM.

    Comment


    • #32
      Originally posted by dungeon View Post
      Ah, when I was a kid I had a C64, master of silence and sound... But then the PC with its fan came in, and I immediately knew that same day that it was nothing but capitalistic competition, that something had gone wrong, that the idiot is inside and the trolls outside, and that all those shits have fans
      Aging is automatic, growing up is not.

      Comment


      • #33
        Originally posted by torsionbar28 View Post
        And you're assuming that they aren't.
        No, I'm just saying that rich dumbfucks do exist and are pretty obvious to spot, while he seems to be saying that people who have money must have earned it through superior intelligence, so their choice to buy such overpriced cards must mean they somehow need them.

        Of course I'm not saying every rich person is a dumbfuck, nor that every Titan buyer is a dumbfuck.

        What I'm saying is that MOST people buying these cards (and ALL of those who SLI them) are rich dumbfucks.

        Remember that if you're the offspring of money (or married into it) and don't actually understand anything about it, that money doesn't tend to stick around very long.
        Yeah, because those kinds of people blow money on totally nonsensical things like a watercooled quad-SLI of Titan Xs.

        Not that it's bad for us either way,
        I already said so. Nvidia is selling to make money; this is just one of their target markets. Nothing out of the ordinary.
        Last edited by starshipeleven; 22 July 2016, 11:50 AM.

        Comment


        • #34
          Originally posted by starshipeleven View Post
          Aging is automatic, growing up is not.
          Aging is polimatic, growing down is hot but no cooler needed

          Comment


          • #35
            Originally posted by devius View Post
            That seems to be where it's headed. More and more stuff is getting integrated into the CPU as time goes by, so it won't be that long before stand-alone graphics cards are a thing of the past.
            Well, GPUs have different RAM needs than CPUs (GDDR is different from DDR for a reason), and there are also the pretty obvious thermal limits imposed by sharing the same silicon, which will constrain any integrated solution quite a lot.
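
            To put rough numbers on that memory gap, here is a back-of-the-envelope sketch in Python (peak bandwidth = bus width in bytes times transfer rate; the bus widths and rates are the published specs for the Titan X's GDDR5X and for dual-channel DDR4-2400, not measurements):

                # Theoretical peak memory bandwidth in GB/s:
                # (bus width in bits / 8 bits per byte) * transfer rate in GT/s
                def peak_gb_per_s(bus_width_bits, transfer_rate_gtps):
                    return bus_width_bits / 8 * transfer_rate_gtps

                print(peak_gb_per_s(384, 10.0))  # Titan X: 384-bit GDDR5X @ 10 Gbps -> 480.0
                print(peak_gb_per_s(128, 2.4))   # dual-channel DDR4-2400 -> 38.4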

            Plus there's the obvious fact that GPUs tend to become obsolete MUCH faster than CPUs, so no dedicated-GPU user would take it well if they had to buy a whole new system every 2 years or so, and there is no way in hell it's gonna be as cheap as a console, because this isn't crappy second-rate hardware.

            So yeah, that's the direction we are headed, but it's not gonna happen even in the mid term unless something truly wondrous happens.
            Last edited by starshipeleven; 22 July 2016, 10:44 AM.

            Comment


            • #36
              Originally posted by dungeon View Post
              Aging is polimatic, growing down is hot but no cooler needed
              asdasdasdasdasdasdasdasdasdasdasdasd

              Comment


              • #37
                I want a 1080 Ti or Vega; whichever comes first I will buy, assuming that Vega outperforms the 1080 by a noticeable amount.

                Comment


                • #38
                  Originally posted by dungeon View Post
                  It doesn't even deserve to be called a CPU anymore, but Intel likes to keep calling it that. APU sounds more correct, as it is not just a CPU... in both cases it is a collection of whatever can possibly fit in it.
                  Its name is SoC. System on a Chip.

                  Comment


                  • #39
                    Originally posted by TheOnlyJoey View Post
                    Titans are, and always will be, focused on the computing and scientific market.
                    Rendering, computing clusters, etc. are a huge market and still make up the biggest market for these cards, where extra precision and every last bit of performance are key.
                    Normally Titans (and Quadros) should also be more stable under constant full load.
                    People buying this for gaming rigs is silly, but fortunately that's just a small percentage.
                    No, they're not. That's just an excuse. The first Titan at least had 1/3-rate FP64 performance, making it interesting for hobby scientists, but still without the certified drivers that professional cards get. Today's Titans have no noticeable FP64 performance and still ship with only consumer drivers. Pro users go for Quadro or Tesla. Just look at the marketing: the Titan has always targeted gamers, and it does again.
                    Also, a Titan is not one bit more stable. These are the same plain components used on every gaming card; the clocks are not any lower ("safer"), nothing like that.
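
                    To put numbers on the FP64 point, a quick sketch (the 1/3 and 1/32 FP64:FP32 rates are the published ratios for the original GK110 Titan and the new Titan X; the FP32 figures are the advertised 4.5 and 11 TFLOPS):

                        # FP64 throughput implied by the published FP64:FP32 ratios
                        cards = {
                            "Titan (GK110)":    (4.5,  1 / 3),   # (FP32 TFLOPS, FP64 ratio)
                            "Titan X (Pascal)": (11.0, 1 / 32),
                        }
                        for name, (fp32, ratio) in cards.items():
                            print(f"{name}: {fp32 * ratio:.2f} TFLOPS FP64")
                        # Titan (GK110):    1.50 TFLOPS FP64 -- usable for hobby compute
                        # Titan X (Pascal): 0.34 TFLOPS FP64 -- negligible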

                    Originally posted by atomsymbol
                    (Utilizing all shaders implies that the boost clock is inactive, so I used base clock to compute the TFLOPs)
                    No, it doesn't. These are gaming cards, and in games there is headroom; Nvidia cards boost all the time in common games. By the way, they even include chip-to-chip variation in this specification: they call it the average boost clock, and under common load scenarios better chips go higher and worse ones lower.
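
                    For reference, the peak-FLOPS arithmetic behind the headline number (2 FLOPs per CUDA core per cycle from fused multiply-add, times core count, times clock; 3584 cores, 1417 MHz base and 1531 MHz average boost are the announced Titan X specs):

                        # Peak FP32 throughput: 2 FLOPs/core/cycle (FMA) * cores * clock
                        cores = 3584  # announced Titan X (Pascal) CUDA core count
                        for label, mhz in [("base", 1417), ("average boost", 1531)]:
                            tflops = 2 * cores * mhz * 1e6 / 1e12
                            print(f"{label}: {tflops:.2f} TFLOPS")
                        # base: 10.16, average boost: 10.97 -- the "11 TFLOPS" headline
                        # already assumes the boost clock, not the base clock.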

                    Originally posted by rabcor View Post
                    I want a 1080 Ti or Vega; whichever comes first I will buy, assuming that Vega outperforms the 1080 by a noticeable amount.
                    There are two Vega chips coming. I wouldn't expect the first (smaller) one to be much faster than GP104 (GTX 1080) in gaming.
                    Last edited by juno; 22 July 2016, 11:29 AM.

                    Comment


                    • #40
                      Most likely the Vega chip will shine in VRAM-hungry games at 4K thanks to faster HBM2. Pascal seems to focus more on color compression than on extra-high memory bandwidth for consumer cards. The Titan X (without the GTX prefix) has no competition; it is just there for marketing, and even the slower cards might sell better. Some people will buy them, maybe even two, and run benchmarks or use them for VR or Surround gaming. You can expect very limited availability, nothing that AMD needs to fear right now. I am pretty sure nobody needs it for Linux gaming ;-)
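
                      For a sense of scale, a rough sketch of the bandwidth argument (the 1024-bit-per-stack width and up-to-2 Gbps-per-pin rate are from the HBM2 spec; the two-stack configuration for Vega is an assumption, nothing confirmed):

                          # Peak bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps
                          def peak_gb_per_s(bus_width_bits, gbps_per_pin):
                              return bus_width_bits / 8 * gbps_per_pin

                          per_stack = peak_gb_per_s(1024, 2.0)  # HBM2 spec maximum: 256 GB/s per stack
                          print(per_stack * 2)                  # assumed two stacks -> 512.0 GB/s
                          print(peak_gb_per_s(384, 10.0))       # Titan X GDDR5X -> 480.0 GB/s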

                      Comment
