
Are Open-Source GPU Drivers Sufficient For 4K Linux Gaming?


  • #11
    Originally posted by Espionage724 View Post
    It all depends on what game you're playing, along with personal preference.

    If I got 35fps on a game like Xonotic or StepMania for example, that's unplayable to me. 35fps on something like Guild Wars 2 or Skyrim though is fine for me. Generally speaking, anything competitive and/or requiring fast reaction times, I need the FPS above my monitor's refresh rate (60)
    Also this. Try 50 fps in Audiosurf or Guitar Hero and you're f...ed. The faster the game, the higher the frame rate you need for it to look smooth.

    Comment


    • #12
      Originally posted by joh??n View Post
      I do not agree with the statement in the article that 35 fps is not playable and 60 fps is "roughly" playable. In my experience 35 fps is just fine and I can play any game without problems. And when you get 60 fps, then it's smooth as hell.

      Found the guy who never had a 120 Hz monitor.

      Comment


      • #13
        Originally posted by joh??n View Post
        I do not agree with the statement in the article that 35 fps is not playable and 60 fps is "roughly" playable. In my experience 35 fps is just fine and I can play any game without problems. And when you get 60 fps, then it's smooth as hell.

        That depends entirely on whether frames are being repeated multiple times due to the 16.67 ms refresh window on fixed-refresh monitors. That's why frame-latency benchmarking is a thing now: it's possible to have a silky-smooth 30 FPS but a stuttery-as-hell 60 FPS. It all depends on how consistently you can deliver frames within a given time window.

        Point being, it's quite possible to internally render 60 frames per second but only display half of them on screen. Which, again, is why FPS isn't the be-all and end-all anymore.
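
        A quick way to see the point above: take a list of frame-completion timestamps and count how many 16.67 ms vsync slots actually receive a fresh frame; any empty slot repeats the previous frame. The timestamps below are made up for illustration (a bursty "high FPS" stream), not measurements from any real game.

        ```shell
        # Hypothetical frame-completion times in ms: 10 frames over ~100 ms
        # (~100 FPS rendered), but delivered in bursts rather than evenly.
        printf '%s\n' 0 5 8 33 38 41 67 72 75 99 |
        awk 'BEGIN { tick = 1000 / 60 }          # one 60 Hz vsync slot = 16.67 ms
             {
               slot = int($1 / tick)             # which vsync slot this frame lands in
               if (!(slot in seen)) { seen[slot] = 1; shown++ }
               count++
             }
             END { printf "frames rendered: %d, vsync slots filled: %d\n", count, shown }'
        # → frames rendered: 10, vsync slots filled: 5
        ```

        Here slot 3 (50-67 ms) gets no new frame at all, so the display repeats one: despite the high render rate, the output stutters. Evenly paced frames at a lower FPS would fill every slot.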

        Comment


        • #14
          And this is partly why I'm still using a Dell Professional 19" screen at 1280x1024. Even my "old" GTX 560 Ti 448 card can push very good frame rates in modern games.

          I don't really understand the need for megapixel screens like 4K. You need a 60+ inch display to even see the difference between 1080p and 4K at a normal viewing distance.

          Comment


          • #15
            Originally posted by Delgarde View Post
            I saw no mention of the Intel graphics you've been raving about a bunch lately. Is that hardware not capable of running a 4K screen?
            I can run a 4K 30Hz screen using an Intel Braswell and open source drivers. I suppose gamers want 60Hz though.

            Comment


            • #16
              Originally posted by lostdistance View Post

              I can run a 4K 30Hz screen using an Intel Braswell and open source drivers. I suppose gamers want 60Hz though.
              Still would have been nice to try it, and include it in the benchmarks to see how Intel's best compares. Certainly, I wouldn't expect much, but the comparison would be interesting.

              Comment


              • #17
                Originally posted by torsionbar28 View Post
                You need a 60+ inch display to even see the difference between 1080p and 4k at a normal viewing distance.
                At a normal TV-viewing distance, yes. People are generally much closer to their computer monitors.

                Comment


                • #18
                  If the Nouveau driver does not detect the 4K resolutions, can they be added manually by generating a modeline with gtf/cvt and applying it as a custom resolution with xrandr? Just curious.
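
                  For what it's worth, the usual sequence looks like this (a sketch; the output name HDMI-1 and the 30 Hz refresh are examples, so check your own `xrandr` output first, and note the driver may still reject modes beyond its limits):

                  ```shell
                  # 1. Generate a modeline for 3840x2160 at 30 Hz (gtf works similarly)
                  cvt 3840 2160 30
                  # 2. Register the mode with X, copying the timings cvt printed:
                  #    xrandr --newmode "3840x2160_30.00" <pixel-clock and timing values from cvt>
                  # 3. Attach it to the output and switch to it:
                  #    xrandr --addmode HDMI-1 "3840x2160_30.00"
                  #    xrandr --output HDMI-1 --mode "3840x2160_30.00"
                  ```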

                  Comment


                  • #19
                    Originally posted by Espionage724 View Post
                    It all depends on what game you're playing, along with personal preference.

                    If I got 35fps on a game like Xonotic or StepMania for example, that's unplayable to me. 35fps on something like Guild Wars 2 or Skyrim though is fine for me. Generally speaking, anything competitive and/or requiring fast reaction times, I need the FPS above my monitor's refresh rate (60)
                    There is indeed a huge difference between 30 and 60fps for certain types of games.

                    On my PlayStation 3, playing Project Diva at the console's locked frame rate (30-ish?) is not very satisfying, and the icons appear to move faster (less time to hit them).
                    On the arcade machine, which runs at 60fps, the game is silky smooth and the icons even appear to move a fair bit slower.

                    Comment


                    • #20
                      Originally posted by torsionbar28 View Post
                      I don't really understand the need for mega-pixel screens like 4k. You need a 60+ inch display to even see the difference between 1080p and 4k at a normal viewing distance.
                      - Stereo 3D: depending on how stereo is implemented, it can eat half of your resolution (passive glasses) or half of your frame rate (shutter glasses).

                      - Virtual reality: that's the extreme case where the viewing distance is short and the display covers a very big field of view (as close as possible to a full 180°). To avoid a "pixelated" look (the VR guys usually call it the "screen-door effect"), the higher the resolution, the better.

                      Thanks to VR, handheld-sized 4K screens are very likely coming soon; they will be used in future VR hardware, and that will require support from the GFX cards.
                      The next-gen Oculus Rift will very likely be 4K, and the ability to run it smoothly will very likely be the main point on the feature checklist of next-gen Nvidia and AMD cards.
                      That's not based on any actual product announcement; it's my own personal prediction of where things are heading in the near future.

                      Comment
