25-Way Open-Source Linux Graphics Card Comparison

  • #11
    Originally posted by Dukenukemx View Post
    Am I the only one who doesn't see the value of resolutions past 720p, let alone 1080p? What I really don't understand is 1600p, which is just barely above 1080p. Graphics cards have gotten so powerful that the only way to stress them is to just turn up the resolution. My monitor maxes out at 1680x1050, but most games I run at 1368x768. It's a 5-6 year old monitor that still works just fine today.

    As for dBm measurements of video cards, that's really stupid. If noise is an issue, you can just replace the cooler yourself. None of my graphics cards keep their original coolers for long, because the card usually runs too hot and loud from the factory anyway. I put a water block on my HD 6850, and that makes no noise.
    It all has to do with pixels per inch. If it's too low you get aliasing; yes, you can use anti-aliasing to hide it, but that muddies up the textures, which you then have to use anisotropic filtering to fix. The proper solution is really just to increase the resolution. And that's just 3D games (2D games don't really benefit, other than avoiding possible upscaling issues). For desktop use, fonts and pictures can hold more detail in the same area, which makes them look nicer, assuming the fonts and icons are vectors as opposed to pixmaps. The problem with pixmaps is that you may have to upscale them, and that can degrade image quality. Likewise, say you took a picture with a 20 MP camera: the higher resolution lets you see more of the detail in the picture.
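
    The pixels-per-inch figure that argument hinges on is just the diagonal resolution divided by the diagonal size. Here's a minimal sketch, assuming the usual definition; the panel sizes in the examples are illustrative, not taken from any post here:
    Code:
    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixel density of a panel from its resolution and diagonal size.
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1680, 1050, 22.0), 1))  # ~90 PPI for a typical 22" 1680x1050 monitor
    print(round(ppi(3840, 2160, 39.0), 1))  # ~113 PPI for a 39" 4K TV used as a monitor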



    • #12
      Originally posted by macemoneta View Post
      Do you have a smartphone?



      4K TVs are starting to be used as monitors, and with 39" 4K TVs available as low as $400, they are relevant.

      http://dissociatedpress.net/brief-re...-39-tvmonitor/
      They're 30 Hz...



      • #13
        Originally posted by n3wu53r View Post
        They're 30 Hz...
        If you are using Nouveau, that might be overkill =P



        • #14
          Originally posted by Dukenukemx View Post
          Am I the only one who doesn't see the value of resolutions past 720p, let alone 1080p? What I really don't understand is 1600p, which is just barely above 1080p.
          Lol, I can still see pixels and font smoothing on my 2013 Nexus 7, which is 1920x1200 at 7", when holding it at about 20 centimeters. So yes, I think you're alone.



          • #15
            On the dB issue...

            There are tables on the internet showing how loud the various coolers are.



            • #16
              Originally posted by zanny View Post
              Higher pixel density actually has an added bonus not many people consider - once the pixels are small enough, you don't have to perform expensive font antialiasing or alignment to subpixels, and your games don't need AA either. Having to output to much larger screens has more performance overhead than AA on its own, but you get all the benefits of higher clarity and resolution and also get to reduce your AA effects. I think it is almost always a win - 4k is probably the first time you could use no AA in most games given good pixel density (25 - 30" panels) and not see any jagged edges.
              This is wrong. You always need AA.



              • #17
                Originally posted by macemoneta View Post
                At this point, I'd like to see 4K performance comparisons and sound level dBm measurements added to the benchmarks for video cards.
                Not worth it yet, since all of the screens are expensive and can only handle 30 Hz at full resolution.

                Now, multi-monitor scaling would be more interesting. For pure gaming, this is a much better setup than a 16:9 monitor, even if it's 4K. https://www.youtube.com/watch?v=IyQoFPO8RyA



                • #18
                  Originally posted by Calinou View Post
                  This is wrong. You always need AA.
                  Yes, the remarks puzzled me as well. If anything, the lower your pixel density, the less AA you need. The need for anti-aliasing is created by displaying objects at a higher resolution than they are processed at. It's true that lower pixel densities mean pixel alignment issues when trying to display non-native resolutions (that is, lower resolutions that the native one is not a multiple of), but there is much more to anti-aliasing than compensating for pixel alignment issues. (Incidentally, it's not that pixel alignment issues don't exist with high pixel density displays; it's just that if the density is high enough, the alignment issues are not noticeable.)
                  Last edited by CFWhitman; 27 January 2014, 12:23 PM.
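
                  For what it's worth, the simplest kind of AA, supersampling, is literally just rendering at a higher resolution than the display and averaging back down, which is why high-DPI output and AA trade off against each other in cost. A minimal sketch with NumPy, treating a plain array as the framebuffer; the function name and the 2x factor are illustrative, not anything from this thread:
                  Code:
                  import numpy as np

                  def downsample_2x(render):
                      # Box-filter a frame rendered at 2x the target size down to the
                      # target size; averaging the 2x2 blocks is what smooths jagged edges.
                      h, w = render.shape[0] // 2, render.shape[1] // 2
                      return render.reshape(h, 2, w, 2, 3).mean(axis=(1, 3))

                  # Toy usage: a hard diagonal edge rendered at 2x, then resolved to 1x.
                  big = np.zeros((8, 8, 3))
                  for y in range(8):
                      big[y, :y] = 1.0        # white below the diagonal, black above
                  small = downsample_2x(big)  # edge pixels end up between 0 and 1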



                  • #19
                    Originally posted by Dukenukemx View Post
                    Am I the only one who doesn't see the value of resolutions past 720p, let alone 1080p? What I really don't understand is 1600p, which is just barely above 1080p. Graphics cards have gotten so powerful that the only way to stress them is to just turn up the resolution. My monitor maxes out at 1680x1050, but most games I run at 1368x768. It's a 5-6 year old monitor that still works just fine today.
                    I think that you're one of the few who at least sees it that way for desktop usage. I've got dual 23" 1080p screens at work, and a 1920x1200 24" at home. In both cases, I wish I had more vertical resolution to put code on. Dual 1600p screens might finally get me into a situation that I am happy with at work.

                    For a 13" laptop screen, 1366x768 is acceptable. Anything larger than that should be higher resolution if you want to get work done.

                    For games, it's a different story... depending on the game, and also depending on whether you're using AA.



                    • #20
                      Originally posted by Veerappan View Post
                      I think that you're one of the few who at least sees it that way for desktop usage. I've got dual 23" 1080p screens at work, and a 1920x1200 24" at home. In both cases, I wish I had more vertical resolution to put code on. Dual 1600p screens might finally get me into a situation that I am happy with at work.

                      For a 13" laptop screen, 1366x768 is acceptable. Anything larger than that should be higher resolution if you want to get work done.

                      For games, it's a different story... depending on the game, and also depending on whether you're using AA.
                      None of these resolutions are acceptable anymore. AMD GPUs have been able to output up to 16000x16000 since the HD 6000 series. Intel supports 4096x4096. I don't know what Nvidia supports, but it's got to be somewhere in between.

                      You can now get a 5.5" cellphone screen at 2560x1440, which works out to 534.04 DPI. If regular screens had kept pace with cellphones, anything less than a 15" 3840x2400 WQUXGA panel at 301.89 DPI would be considered old garbage.

                      Why you can't get a screen with a DPI anywhere even close to that on a desktop or laptop is an incomprehensible mystery to me.
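
                      For what it's worth, those DPI numbers check out; a quick sanity check, assuming the usual definition of DPI as diagonal pixels over diagonal inches:
                      Code:
                      import math

                      def dpi(width_px, height_px, diagonal_in):
                          # Diagonal resolution in pixels divided by diagonal size in inches.
                          return math.hypot(width_px, height_px) / diagonal_in

                      print(round(dpi(2560, 1440, 5.5), 2))   # 534.04 -- the 5.5" 1440p phone panel
                      print(round(dpi(3840, 2400, 15.0), 2))  # 301.89 -- a 15" WQUXGA laptop panel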

