NVIDIA Introduces $400 GeForce GTX 770 GPU


  • NVIDIA Introduces $400 GeForce GTX 770 GPU

    Phoronix: NVIDIA Introduces $400 GeForce GTX 770 GPU

    To join the GeForce GTX TITAN and GTX 780 as the newest high-performance NVIDIA GPUs, rolled out this morning was the GeForce GTX 770. NVIDIA has introduced the GTX 770 as a new high-performance graphics card that's priced at $399 USD...


  • #2
    Features of this new graphics card include a 1046MHz base core frequency with a 1085MHz Boost frequency, 1536 CUDA cores, 2GB of GDDR5 memory, and other modern Kepler functionality like GPU Boost 2.0, 3D Vision, TXAA, SHIELD Ready, and OpenGL 4.3 support.
    Yeah, 1536 "crippled" CUDA cores, to force you to buy a $1000+ Tesla if you want to play with CUDA. Fuck you NVIDIA. AMD is going to crush you.
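    As a back-of-the-envelope check on the specs above, peak single-precision throughput for a Kepler card is usually estimated as CUDA cores × clock × 2 FLOPs per core per cycle (one FMA counts as two FLOPs). A quick sketch, using only the numbers quoted in the post:

    ```python
    # Rough peak-FP32 estimate from the GTX 770 specs quoted above.
    # Formula: cores * clock (GHz) * 2 FLOPs/cycle (one FMA = 2 FLOPs).
    def peak_gflops(cuda_cores, clock_ghz, flops_per_cycle=2):
        """Theoretical peak single-precision GFLOPS."""
        return cuda_cores * clock_ghz * flops_per_cycle

    base = peak_gflops(1536, 1.046)    # at the 1046MHz base clock
    boost = peak_gflops(1536, 1.085)   # at the 1085MHz boost clock

    print(f"base:  {base:.0f} GFLOPS")   # ~3213 GFLOPS
    print(f"boost: {boost:.0f} GFLOPS")  # ~3333 GFLOPS
    ```

    That puts the card at roughly 3.2-3.3 theoretical TFLOPS FP32, which is the headline figure usually quoted for this class of GPU; real-world throughput will of course be lower.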

    Comment


    • #3
      Originally posted by wargames View Post
      Yeah, 1536 "crippled" CUDA cores, to force you to buy a $1000+ Tesla if you want to play with CUDA. Fuck you NVIDIA. AMD is going to crush you.
      AMD isn't a good option for Linux: slow drivers, eternal bugs, OpenGL bugs, etc.

      Comment


      • #4
        Originally posted by wargames View Post
        Yeah, 1536 "crippled" CUDA cores, to force you to buy a $1000+ Tesla if you want to play with CUDA.
        Put down the crack pipe.

        Comment


        • #5
          GPU prices are through the roof, both from AMD and NVIDIA. Remember when the flagship cards cost around 400 dollars and the high-end models went for about 300? Now we get an upper-end card for 400, the high-end one for 700, and the flagship card for a thousand.

          No, thank you.

          And there goes the theory that competition helps with prices. Unless of course this means that there actually is no competition.
          Last edited by RealNC; 30 May 2013, 11:26 AM.

          Comment


          • #6
            They will charge what the market will pay.

            At the end of the day they still have to make money. If they charge a million dollars and have no buyers, they lose money big-time. So they can't just set the prices arbitrarily.

            Comment


            • #7
              Nice! Now waiting for a decent 760 or 760 Ti with a nice price-tag, so that it makes sense for me to make the change.

              Comment


              • #8
                Originally posted by johnc View Post
                Put down the crack pipe.
                I don't smoke, thanks. They "crippled" their cards when they realized some people were using them for CUDA, like Blender graphics artists. They even sent a couple of Teslas to the guys who made "Tears of Steel" in order to promote those overly expensive cards, made by Chinese workers who earn $5/day. But then they realized the GTX 580 was "too" powerful for its price; that's why the 500 series is MUCH more powerful than the 600 series. This is a fact, not a conspiracy theory. So again, FUCK YOU NVIDIA.

                Comment


                • #9
                  Originally posted by wargames View Post
                  I don't smoke, thanks. They "crippled" their cards when they realized some people were using them for CUDA, like Blender graphics artists. They even sent a couple of Teslas to the guys who made "Tears of Steel" in order to promote those overly expensive cards, made by Chinese workers who earn $5/day. But then they realized the GTX 580 was "too" powerful for its price; that's why the 500 series is MUCH more powerful than the 600 series. This is a fact, not a conspiracy theory. So again, FUCK YOU NVIDIA.
                  That "crippling" was a necessary compromise in their new SMX architecture to get the power efficiency they wanted, and it was a wise decision because 99% of GTX buyers are interested in games, not CUDA and certainly not FP64 performance. And "crippled" is a bit of a stretch anyway.

                  And your initial statement implied that you couldn't "play with CUDA" on Kepler cards, which is a joke.
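                  The 500-vs-600-series compute argument above can be sanity-checked with the FP64:FP32 rates commonly cited for these chips: 1/8 on GeForce Fermi (GTX 580, GF110) versus 1/24 on GeForce Kepler GK104 (GTX 770). The specific clocks and ratios below are assumptions for illustration, not vendor-confirmed figures:

                  ```python
                  # Hedged sketch: compare double-precision throughput of a GTX 580
                  # (Fermi) vs a GTX 770 (Kepler GK104), using the commonly cited
                  # FP64:FP32 ratios (1/8 for GeForce Fermi, 1/24 for GeForce GK104).
                  def fp64_gflops(cores, clock_ghz, fp64_ratio):
                      fp32 = cores * clock_ghz * 2       # peak FP32 GFLOPS (FMA = 2 FLOPs)
                      return fp32 * fp64_ratio

                  gtx580 = fp64_gflops(512, 1.544, 1 / 8)    # Fermi shader clock, 1/8 rate
                  gtx770 = fp64_gflops(1536, 1.085, 1 / 24)  # Kepler boost clock, 1/24 rate

                  print(f"GTX 580 FP64: ~{gtx580:.0f} GFLOPS")
                  print(f"GTX 770 FP64: ~{gtx770:.0f} GFLOPS")
                  ```

                  Under these assumptions the older GTX 580 does come out ahead in peak double precision (~198 vs ~139 GFLOPS), which is the kernel of truth in the "500 series beats 600 series" claim, even though the 770 is far faster at single precision and in games.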

                  Comment


                  • #10
                    Originally posted by johnc View Post
                    That "crippling" was a necessary compromise in their new SMX architecture to get the power efficiency they wanted, and it was a wise decision because 99% of GTX buyers are interested in games, not CUDA and certainly not FP64 performance.
                    No, games need that now. Look at how much better DirectCompute runs on AMD cards. New games use that for stuff like global illumination.

                    Crippling the compute capabilities of the GPU can now hurt games severely.

                    Comment
