Open-Source Linux Driver Support For 4K Monitors


  • Open-Source Linux Driver Support For 4K Monitors

    Phoronix: Open-Source Linux Driver Support For 4K Monitors

    While 4K resolution monitors are still extremely expensive, there's growing curiosity over support for 4K monitors by the open-source Linux graphics drivers...

    http://www.phoronix.com/vr.php?view=MTQ1NzE

  • #2
    Extremely expensive?

    $1,118.08 & FREE Shipping.
    Seiki Digital SE50UY04 50-Inch 4K UHD 120Hz LED HDTV:
    http://www.amazon.com/Seiki-Digital-...dp/B00BXF7I9M/

    $699.00 & FREE Shipping.
    Seiki Digital SE39UY04 39-Inch 4K Ultra HD 120Hz LED TV:
    http://www.amazon.com/Seiki-Digital-...dp/B00DOPGO2G/

    • #3
Why do 4K monitors/outputs have to be treated specially? I would have thought it simply came down to mode setting in the driver.

      • #4
        "but for Intel hardware I believe there is also 4K monitor support in their latest DRM code."

        Did you forget your own article from August 07, 2013? - http://www.phoronix.com/scan.php?pag...tem&px=MTQzMDg
You've missed an advertising opportunity there.

Note that HDMI 1.4a/b only supports 4K resolution at 24/25/30 frames per second. This is fine for watching movies, but not for gaming. 4K resolution at 60 frames per second requires the yet-to-be-released HDMI 2.0. Or you could just use a DisplayPort 1.2 connection.
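A quick back-of-the-envelope sketch in Python of why the frame-rate ceiling falls where it does. The 340 MHz figure is HDMI 1.4's maximum TMDS clock; blanking intervals (roughly another 10-20% on top of the active pixels) are ignored here, so these numbers are illustrative, not exact link budgets:

```python
# Approximate pixel clock needed for a mode, counting active pixels only.
def pixel_clock_mhz(width, height, fps):
    """Rough pixel clock in MHz for a given resolution and frame rate."""
    return width * height * fps / 1e6

uhd_30 = pixel_clock_mhz(3840, 2160, 30)  # ~248.8 MHz
uhd_60 = pixel_clock_mhz(3840, 2160, 60)  # ~497.7 MHz

# HDMI 1.4 tops out at a 340 MHz TMDS clock: enough for 4K at 30 fps,
# nowhere near enough for 4K at 60 fps. DisplayPort 1.2 (HBR2) carries
# roughly double the payload, which is why it's the workaround.
print(f"4K@30: {uhd_30:.1f} MHz, 4K@60: {uhd_60:.1f} MHz")
```

Even this crude estimate shows 4K@60 blowing past the HDMI 1.4 clock limit while 4K@30 fits under it.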

        • #5
          Originally posted by FourDMusic View Post
Why do 4K monitors/outputs have to be treated specially? I would have thought it simply came down to mode setting in the driver.
4K resolutions need support for the newest standards (DisplayPort 1.2), so you need to know that the hardware supports them, or you may be running an out-of-spec configuration.

          • #6
            Originally posted by macemoneta View Post
            $1,118.08 & FREE Shipping.
            Seiki Digital SE50UY04 50-Inch 4K UHD 120Hz LED HDTV:
            http://www.amazon.com/Seiki-Digital-...dp/B00BXF7I9M/

            $699.00 & FREE Shipping.
            Seiki Digital SE39UY04 39-Inch 4K Ultra HD 120Hz LED TV:
            http://www.amazon.com/Seiki-Digital-...dp/B00DOPGO2G/
The ASUS and Sharp ones are around $3,500, which IS steep. And I believe the Samsung and Dell models that are coming will be similarly priced.

At least I know that my card can support this in case they become cheap.

            • #7
              Originally posted by FourDMusic View Post
              Why do 4K monitors/outputs have to be treated specially? I would have thought it simple came down to mode setting in the driver.
Because they demand more bandwidth than your average video link can carry. Even driving two lower-resolution monitors will raise your GPU clocks whether you're rendering anything or not.

              • #8
                Originally posted by schmidtbag View Post
Because they demand more bandwidth than your average video link can carry. Even driving two lower-resolution monitors will raise your GPU clocks whether you're rendering anything or not.
So? That's hardware, not software. The software should support any resolution setting under the sun. It's the drivers/hardware that need to be concerned about bandwidth, connectors, and whatnot.

                • #9
                  Originally posted by gamerk2 View Post
So? That's hardware, not software. The software should support any resolution setting under the sun. It's the drivers/hardware that need to be concerned about bandwidth, connectors, and whatnot.
And how exactly do you expect the software to support something the hardware doesn't support? If every software group thought that way, the computer industry would be destroyed. It doesn't matter what the software is capable of if the hardware can't do it, so it's better not to let the drivers say "hey look, I can do 4K screens!" only for someone to attach one and find out it doesn't work due to a hardware limitation. This isn't the same thing as having a CPU too slow to play a game, because the game will still run. If you buy a 4K screen because your drivers SAY they can support it, you're going to be pretty unhappy to find out the screen won't even leave standby.

                  Setting screen resolutions is a lot more complicated than most people are aware of - there's a lot more than width, height, color depth, and refresh rate. It isn't as simple as just flicking a switch and suddenly getting 4K resolutions.
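To illustrate the point that a mode is more than width, height, and refresh rate: the refresh rate actually falls out of the pixel clock divided by the *total* raster size, blanking included. A small Python sketch, using timing numbers for the 3840x2160@30 mode as listed in the CEA-861 tables (quoted here from memory, so treat them as illustrative):

```python
# The visible resolution is only part of a mode. Horizontal and vertical
# blanking make the scanned raster larger than the displayed picture,
# and the refresh rate is clock / (htotal * vtotal).
def refresh_hz(clock_mhz, htotal, vtotal):
    """Refresh rate implied by a pixel clock and total raster size."""
    return clock_mhz * 1e6 / (htotal * vtotal)

# 3840x2160@30 uses a 297 MHz pixel clock with a 4400x2250 total raster:
# 560 horizontal and 90 vertical blanking pixels on top of the picture.
print(refresh_hz(297.0, 4400, 2250))  # 30.0
```

Get any one of those numbers wrong and the monitor simply refuses the signal, which is why "just flick a switch" doesn't work.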

                  • #10
Now that I think about it: Michael, you should buy a 4K monitor to test what the devs claim to work.

                    • #11
Somehow, I suspect that most of this "4K support" stuff involves two considerations:
1) That the necessary protocols (e.g. HDMI 2.0, DP 1.2) are actually implemented in software,
2) Making sure that things aren't *broken* in ways that just weren't discovered with conventional displays. E.g., a display that divides the screen into 4 quadrants and switches them around would be considered "broken".

                      • #12
I can tell you this: once those 4K monitors are available next year with the new HDMI 2.0, I will be doing some testing of Ubuntu (maybe Debian too, depending on how I can set things up) gaming and media with the latest GPUs I can get my grubby little paws on, and I will share the results with you wonderful penguin-loving oddballs.

                        • #13
                          Originally posted by schmidtbag View Post
                          And how exactly do you expect the software to support something the hardware doesn't support?
Forward thinking specs. Why shouldn't the kernel support 10K resolutions? Maybe someone will announce a 10K TV tomorrow. There's no technical reason to limit screen resolution output within the kernel. Sure, no device on the planet may support them yet, but they are there for the day they will be.

                          Now, if you want to limit what outputs you can use within the graphic cards driver layer, that's fine. But there's NO reason the kernel should be arbitrarily limited in this way. The Kernel should be able to support any possible screen resolution.

                          It doesn't matter what the software is capable of if the hardware can't do it, so it's better to not let the drivers say "hey look, I can do 4k screens!" and someone attaches one only to find out it doesn't work due to a hardware limitation.
                          Driver != Kernel. If the GPU can handle 4k output, if the device can handle 4k output, and the transport layer can handle 4k output, then it should be allowed as an output. Even if this isn't the case, the Kernel should have the ability to output 4k should these conditions become true.

                          This isn't the same thing as having a CPU too slow to play a game, because the game will still run. If you buy a 4K screen because your drivers SAY they can support it, you're going to be pretty unhappy to find out the screen won't even leave standby.
                          Again, Driver != Kernel. If the screen doesn't support 4k, the drivers should disable it as an option. The Kernel, however, should be fully capable of handling 4k. And 8k. And 200k. Its the driver layer which specifies what the actual output is going to be for a given device.

                          Setting screen resolutions is a lot more complicated than most people are aware of - there's a lot more than width, height, color depth, and refresh rate. It isn't as simple as just flicking a switch and suddenly getting 4K resolutions.
Which you can also handle. There is no technical reason why the Kernel shouldn't have support for every possible screen resolution, color depth, and refresh rate under the sun. If you want to limit what is output for a connected device in driver land, fine, but the kernel itself shouldn't have any limitations because "it's not supported by hardware yet".

                          • #14
While someone could design hardware to potentially work with 10K modes or whatever, it's probably not a good use of resources. Support for arbitrary modes comes down to a few things:
- Link support. You need to make sure the spec supports high enough clocks over the link to support the mode. E.g., HDMI uses single-link TMDS, and the original TMDS spec only supported clocks <=165 MHz; newer versions of HDMI have increased this.
- Clock support. The clock hardware on the ASIC has to be validated against higher clock rates. If there is no equipment that uses the higher clocks, it's harder to validate, and that increases validation costs to support something that may or may not come to be during the useful lifetime of the card.
- Line buffer support. Once again, it takes die space for the line buffers. Extra die space costs money. You don't want to add and validate more line buffer than you need to cover the reasonable range of monitors likely to appear during the card's useful lifetime.

                            Also, the current kernel and even older kernels can handle 4k timings just fine using the standard mode structs. What got added in 3.12 was support for fetching 4K modes from the extended tables in the monitor's EDID.
                            Last edited by agd5f; 09-10-2013, 10:21 AM.
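For the curious, a rough Python sketch of the EDID detail agd5f mentions. This is not the kernel's actual parser, just the shape of the data: a base EDID block is 128 bytes, and byte 126 says how many extension blocks follow; 4K modes are typically advertised in those extension blocks, which is what the 3.12 kernel learned to fetch:

```python
# The base EDID block is a fixed 128 bytes; byte 126 holds the count of
# extension blocks that follow it. Extended modes such as 4K usually
# live in those extension blocks rather than the base block.
EDID_BLOCK_SIZE = 128

def extension_block_count(edid: bytes) -> int:
    """Number of extension blocks advertised by a base EDID block."""
    if len(edid) < EDID_BLOCK_SIZE:
        raise ValueError("truncated EDID block")
    return edid[126]

# A fake base block advertising one extension block (illustrative only):
fake_edid = bytearray(EDID_BLOCK_SIZE)
fake_edid[126] = 1
print(extension_block_count(bytes(fake_edid)))  # 1
```

A monitor whose base block reports zero extensions simply never advertises those extra modes, no matter what the driver could do.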

                            • #15
                              Originally posted by gamerk2
                              Forward thinking specs. Why shouldn't the kerne..
                              Originally posted by agd5f View Post
                              While someone could design hardware....
Thank you for clearing this up!
