GPU for Game Developer


  • #11
    Originally posted by knuke View Post
    Thank you all very much. (y)

    I think I will go with an Nvidia card.

    So, on to the next question: do you think a GTX 780 would do well in a multi-monitor setup, especially when it comes to gaming?
    If you're going for a lot of monitors, then you'll want plenty of GDDR5 on the card, which pushes the price higher. I'll need to check to see how much exactly.
    OR
    If you're going for a lot of monitors, get two $250 GPUs that support CrossFire or SLI.
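
    For what it's worth, a rough way to check how much video memory the driver reports is sketched below. It assumes an Nvidia binary driver that exposes the GL_NVX_gpu_memory_info extension (AMD has GL_ATI_meminfo instead) and that GLFW 3 is installed; vram_check.c is just a placeholder name.
    Code:
    /* Rough VRAM report via the Nvidia-only GL_NVX_gpu_memory_info
     * extension (values are in KiB; on drivers without the extension
     * the queries just set GL_INVALID_ENUM and the counters stay 0).
     * Build (assuming GLFW 3): cc vram_check.c -lglfw -lGL
     */
    #include <stdio.h>
    #include <GLFW/glfw3.h>

    #define GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX         0x9047
    #define GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049

    int main(void)
    {
        if (!glfwInit())
            return 1;
        glfwWindowHint(GLFW_VISIBLE, GL_FALSE);  /* we only need a GL context */
        GLFWwindow *win = glfwCreateWindow(64, 64, "vram", NULL, NULL);
        if (!win)
            return 1;
        glfwMakeContextCurrent(win);

        GLint dedicated_kb = 0, free_kb = 0;
        glGetIntegerv(GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX, &dedicated_kb);
        glGetIntegerv(GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &free_kb);
        printf("dedicated: %d MiB, currently free: %d MiB\n",
               dedicated_kb / 1024, free_kb / 1024);

        glfwTerminate();
        return 0;
    }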

    • #12
      Originally posted by profoundWHALE View Post
      If you're going for a lot of monitors, then you'll want plenty of GDDR5 on the card, which pushes the price higher. I'll need to check to see how much exactly.
      OR
      If you're going for a lot of monitors, get two $250 GPUs that support CrossFire or SLI.
      I see. The GTX 780 has 3 GB of GDDR5 memory. I also want to use my rig for as long as possible, so this card will most probably be good for me. And if it gets too slow in the next two years, I'll buy another one and SLI them together.

      • #13
        Originally posted by nadro View Post
        Please just use tools like GPU ShaderAnalyzer for GLSL programs.
        Well, if it's only up to OpenGL 3.x, a quick run on llvmpipe or swrast will probably give a better indication too: not just whether it's syntactically correct, but whether it also looks like what you intended.
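
        For example, a minimal compile check along these lines prints the driver's GLSL info log; running it with LIBGL_ALWAYS_SOFTWARE=1 forces Mesa's software rasterizer instead of the hardware driver. This is only a sketch: it assumes GLFW 3 and GLEW are installed, and the embedded shader is a stand-in for whatever you actually want to test.
        Code:
        /* Minimal GLSL compile check: compile one shader and print the
         * driver's info log.  Build (assuming GLFW 3 and GLEW):
         *     cc check_shader.c -lGLEW -lglfw -lGL
         * Run with LIBGL_ALWAYS_SOFTWARE=1 to test against llvmpipe/swrast.
         */
        #include <stdio.h>
        #include <GL/glew.h>
        #include <GLFW/glfw3.h>

        static const char *frag_src =
            "#version 130\n"
            "out vec4 color;\n"
            "void main() { color = vec4(1.0); }\n";  /* stand-in shader */

        int main(void)
        {
            if (!glfwInit())
                return 1;
            glfwWindowHint(GLFW_VISIBLE, GL_FALSE);  /* context only, no window shown */
            GLFWwindow *win = glfwCreateWindow(64, 64, "glsl-check", NULL, NULL);
            if (!win)
                return 1;
            glfwMakeContextCurrent(win);
            glewInit();

            GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
            glShaderSource(sh, 1, &frag_src, NULL);
            glCompileShader(sh);

            GLint ok = GL_FALSE;
            char log[4096] = "";
            glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
            glGetShaderInfoLog(sh, sizeof log, NULL, log);

            printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
            printf("compile %s\n%s", ok ? "OK" : "FAILED", log);

            glfwTerminate();
            return ok ? 0 : 1;
        }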

        • #14
          Originally posted by knuke View Post
          I see. The GTX 780 has 3 GB of GDDR5 memory. I also want to use my rig for as long as possible, so this card will most probably be good for me. And if it gets too slow in the next two years, I'll buy another one and SLI them together.
          Nice, and probably the best choice. If it can't handle enough monitors, just get another. Heh.

          • #15
            Intel

            Go Intel; their open-source drivers are good, and it'll encourage you to write efficient code that works well on most people's computers.

            • #16
              Intel IGP if you don't need very high performance (HD 4600, found on Haswell CPUs).

              GTX 770 for an NVIDIA card; the GTX 780 has a much worse performance/price ratio.

              R9 270(X) or 280X for an AMD card. The 290(X) is too noisy at the moment, and will be too expensive with custom coolers.

              • #17
                Originally posted by profoundWHALE View Post
                Nice, and probably the best choice. If it can't handle enough monitors, just get another. Heh.
                Yeah, it's the best thing I can do. I plan on using the computer for many years, and after a few I will have to upgrade the GPU, so the 780 should last longer without an upgrade.

                • #18
                  True

                  Having more than one monitor breaks your focus. I say just go with the largest thing you can afford that will still fit on your desk.

                  AMD/ATI Quality:
                  I started out coding OpenGL in Linux on a 3dfx Voodoo 2, then moved up to a TNT2 Ultra. When I finally upgraded again I went to a Radeon 9200 (basically a 9-something-hundred to me, since I had no idea of the performance bands) and things went downhill fast. Who knew a 9200 was weaker than an 8500? With Nvidia, the next number up meant better performance. Of course, I know Nvidia plays the numbers game now too, but you know that GT, and then GTX, marks the top of the line. Point is, Radeon is the biggest waste of money if you're looking into Linux OpenGL development. You'll always be waiting for the 80% solution, just around that proverbial corner.

                  You want to know why OpenGL fails on Radeon? Because you're missing 20% of the specification. I can write something against Mesa's software rendering that works perfectly, then try to run it on a Radeon and watch it fail miserably.

                  Awesome!

                  Nvidia just put out a driver update for their 6xxx series. Radeon deprecated their 9xxx and all their other legacy series three years ago.
                  For a company that just won two of the biggest console contracts, they sure are stingy with their driver support.

                  Who loves you? Nvidia does baby!!!

                  • #19
                    Originally posted by ChrisXY View Post
                    Well, if it's only up to OpenGL 3.x, a quick run on llvmpipe or swrast will probably give a better indication too: not just whether it's syntactically correct, but whether it also looks like what you intended.
                    llvmpipe is broken for many advanced shaders; the best way to test them is on Intel hardware, imho.
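
                    Either way, it's worth confirming which driver a test context actually lands on before trusting the results; a minimal sketch, again assuming GLFW 3 for context creation:
                    Code:
                    /* Print which driver/renderer a GL context actually lands
                     * on, plus the GLSL version it advertises.
                     * Build (assuming GLFW 3): cc glinfo.c -lglfw -lGL
                     */
                    #include <stdio.h>
                    #include <GLFW/glfw3.h>

                    #ifndef GL_SHADING_LANGUAGE_VERSION
                    #define GL_SHADING_LANGUAGE_VERSION 0x8B8C
                    #endif

                    int main(void)
                    {
                        if (!glfwInit())
                            return 1;
                        glfwWindowHint(GLFW_VISIBLE, GL_FALSE);
                        GLFWwindow *win = glfwCreateWindow(64, 64, "glinfo", NULL, NULL);
                        if (!win)
                            return 1;
                        glfwMakeContextCurrent(win);

                        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
                        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
                        printf("GLSL:        %s\n",
                               (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));

                        glfwTerminate();
                        return 0;
                    }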

                    • #20
                      Originally posted by squirrl View Post
                      Having more than one monitor breaks your focus. I say just go with the largest thing you can afford that will still fit on your desk.

                      AMD/ATI Quality:
                      I started out coding OpenGL in Linux on a 3dfx Voodoo 2, then moved up to a TNT2 Ultra. When I finally upgraded again I went to a Radeon 9200 (basically a 9-something-hundred to me, since I had no idea of the performance bands) and things went downhill fast. Who knew a 9200 was weaker than an 8500? With Nvidia, the next number up meant better performance.
                      Nvidia has the exact same numbering system. For example, a 680 would be faster than a 760. How could you not know that the first number is the generation and the number after it is the model/tier?
