Page 2 of 3
Results 11 to 20 of 25

Thread: GPU for Game Developer

  1. #11
    Join Date
    Oct 2013
    Location
    Canada
    Posts
    312

    Default

    Quote Originally Posted by knuke View Post
    Thank you all very much. (y)

    I think I will go with an NVidia Card.

    So on to the next question: Do you think that a GTX 780 would do well in a multimonitor setup? Especially when it comes to gaming.
If you're going for a lot of monitors, then you'll want plenty of GDDR5 on the card, which pushes the price higher. I'll need to check exactly how much.
OR
If you're going for a lot of monitors, get two $250 GPUs that support CrossFire or SLI.
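For a rough sense of scale, here's a back-of-envelope framebuffer calculation (a sketch only; it assumes three 1080p monitors side by side and 32-bit RGBA buffers):

```shell
# Three 1920x1080 monitors side by side = 5760x1080; RGBA8 = 4 bytes per pixel
width=5760; height=1080; bpp=4
bytes=$(( width * height * bpp ))
echo "$bytes bytes per framebuffer"            # prints "24883200 bytes per framebuffer"
echo "roughly $(( bytes / 1024 / 1024 )) MB"   # prints "roughly 23 MB"
```

The framebuffers themselves are tiny; what actually eats VRAM at multi-monitor resolutions is textures and render targets, which is why people steer you toward cards with 3 GB or more.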

  2. #12
    Join Date
    Jan 2014
    Posts
    7

    Default

    Quote Originally Posted by profoundWHALE View Post
If you're going for a lot of monitors, then you'll want plenty of GDDR5 on the card, which pushes the price higher. I'll need to check exactly how much.
OR
If you're going for a lot of monitors, get two $250 GPUs that support CrossFire or SLI.
I see. The GTX 780 has 3 GB of GDDR5 memory. I also want to use my rig for as long as possible, so this card will most probably be good for me. And if it's too slow in the next 2 years, I'll buy another one and run them in SLI.

  3. #13
    Join Date
    Jun 2010
    Location
    ฿ 16LDJ6Hrd1oN3nCoFL7BypHSEYL84ca1JR
    Posts
    1,020

    Default

    Quote Originally Posted by nadro View Post
please just use tools like GPU ShaderAnalyzer for GLSL programs.
Well, if it's only up to OpenGL 3.x, a quick run on llvmpipe or swrast will probably give a better indication too: not just whether it's syntactically correct, but whether it actually looks like what you were trying to do.
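In case it helps anyone reading along: Mesa lets you force the software rasterizer with an environment variable, so you can sanity-check a shader on llvmpipe without swapping hardware. A sketch; `./mygame` stands in for your own OpenGL binary:

```shell
# LIBGL_ALWAYS_SOFTWARE=1 makes Mesa pick a software renderer for this run only;
# ./mygame is a placeholder for your own OpenGL program.
LIBGL_ALWAYS_SOFTWARE=1 ./mygame

# You can also select the gallium software driver explicitly:
LIBGL_ALWAYS_SOFTWARE=1 GALLIUM_DRIVER=llvmpipe ./mygame

# Confirm which renderer is actually active (glxinfo comes from mesa-utils):
LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep "OpenGL renderer"
```

If a shader behaves on llvmpipe but fails on a vendor driver (or the other way around), that's usually a sign you're relying on driver-specific leniency rather than the spec.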

  4. #14
    Join Date
    Oct 2013
    Location
    Canada
    Posts
    312

    Default

    Quote Originally Posted by knuke View Post
I see. The GTX 780 has 3 GB of GDDR5 memory. I also want to use my rig for as long as possible, so this card will most probably be good for me. And if it's too slow in the next 2 years, I'll buy another one and run them in SLI.
    Nice, and probably the best choice. If it can't handle enough monitors, just get another. Heh.

  5. #15
    Join Date
    Oct 2010
    Posts
    91

    Default intel

Go Intel. Their open-source drivers are good, and it'll encourage you to write efficient code that works well on most people's computers.

  6. #16
    Join Date
    Nov 2012
    Location
    France
    Posts
    538

    Default

Intel IGP if you don't need very high performance (HD 4600, found on Haswell CPUs).

GTX 770 for an NVIDIA card; the GTX 780 has much worse performance per dollar.

R9 270(X) or 280X for an AMD card. The 290(X) is too noisy at the moment, and will be too expensive once custom coolers arrive.

  7. #17
    Join Date
    Jan 2014
    Posts
    7

    Default

    Quote Originally Posted by profoundWHALE View Post
    Nice, and probably the best choice. If it can't handle enough monitors, just get another. Heh.
Yeah, it's the best thing I can do. I plan on using the computer for many years, and after a few I'll have to upgrade the GPU anyway. So the 780 should last longer without an upgrade.

  8. #18
    Join Date
    Aug 2009
    Location
    south east
    Posts
    341

    Default True

    Having more than one monitor breaks your focus. I say just go with the largest thing you can afford that will still fit on your desk.

    AMD/ATI Quality>
I started out coding OpenGL in Linux on a 3DFX Voodoo 2, then moved up to a TNT2 Ultra. When I finally upgraded again I went to a Radeon 9200 (basically a 90SUMHundred, because I had no idea of the performance band) and shit went downhill fast. Who knew a 9200 was weaker than an 8500? With Nvidia, the next number up meant better performance. Of course, I know Nvidia now plays the numbers game too, but you know that GT, then GTX, is top of the line. Point is, Radeon is the biggest waste of money if you're looking into Linux OpenGL development. You'll always be waiting for the 80% solution, just around that proverbial corner.

You want to know why OpenGL fails on Radeon? Because you're missing 20% of the specification. I can code shit in Mesa with software rendering that works perfectly, then try to run it on a Radeon and watch it fail miserably.

    Awesome!

Nvidia just put out an update for their 6xxx series. Radeon deprecated their 9xxx and every other legacy series 3 years ago.
For a company that just won two of the biggest console contracts, they sure are greedy with their driver support.

    Who loves you? Nvidia does baby!!!

  9. #19
    Join Date
    Sep 2013
    Posts
    171

    Default

    Quote Originally Posted by ChrisXY View Post
Well, if it's only up to OpenGL 3.x, a quick run on llvmpipe or swrast will probably give a better indication too: not just whether it's syntactically correct, but whether it actually looks like what you were trying to do.
llvmpipe is broken for many advanced shaders, and the best way to test them is on Intel hardware, imho.

  10. #20
    Join Date
    Oct 2013
    Location
    Canada
    Posts
    312

    Cool

    Quote Originally Posted by squirrl View Post
    Having more than one monitor breaks your focus. I say just go with the largest thing you can afford that will still fit on your desk.

    AMD/ATI Quality>
I started out coding OpenGL in Linux on a 3DFX Voodoo 2, then moved up to a TNT2 Ultra. When I finally upgraded again I went to a Radeon 9200 (basically a 90SUMHundred, because I had no idea of the performance band) and shit went downhill fast. Who knew a 9200 was weaker than an 8500? With Nvidia, the next number up meant better performance.
Nvidia has the exact same numbering system. For example, a 680 would be faster than a 760. How could you not know that the first number is the generation, and the numbers after are the model/tier?
