Open-Source Linux Driver Support For 4K Monitors


  • FourDMusic
    replied
    Originally posted by gamerk2
    Forward thinking specs. Why shouldn't the kerne..
    Originally posted by agd5f View Post
    While someone could design hardware....
    thank you for clearing this up!

  • agd5f
    replied
    While someone could design hardware to potentially work with 10k modes or whatever, it's probably not a good use of resources. Support for arbitrary modes comes down to a few things:
    - link support. You need to make sure the spec supports high enough clocks over the link to support the mode. E.g., HDMI uses single-link TMDS. The original TMDS spec only supported clocks <= 165 MHz. Newer versions of HDMI have increased this.
    - clock support. The clock hardware on the ASIC has to be validated against higher clock rates. If there is no equipment that uses the higher clocks, it's harder to validate, and it increases validation costs to support something that may or may not come to be during the useful lifetime of the card.
    - line buffer support. Once again, it takes die space for the line buffers. Extra die space costs money. You don't want to add and validate more line buffer than you need to cover a reasonable range of monitors that will likely come to be during the card's useful lifetime.

    Also, the current kernel and even older kernels can handle 4K timings just fine using the standard mode structs. What got added in 3.12 was support for fetching 4K modes from the extended tables in the monitor's EDID.
    Last edited by agd5f; 10 September 2013, 10:21 AM.
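A rough sketch of the link-clock arithmetic described above. The timing totals are the well-known CEA-861 figures for these modes and the 165 MHz limit is the original single-link TMDS cap mentioned in the post; treat the numbers as illustrative:

```python
# A mode's pixel clock is total (active + blanking) pixels per frame times
# the refresh rate. The mode only fits a link if the link's maximum symbol
# clock can carry that pixel clock.

def pixel_clock_mhz(htotal, vtotal, refresh_hz):
    """Pixel clock in MHz for a mode with the given total timings."""
    return htotal * vtotal * refresh_hz / 1e6

ORIGINAL_TMDS_LIMIT_MHZ = 165  # original single-link TMDS spec limit

# 1920x1080@60: CEA totals 2200x1125 -> 148.5 MHz, fits under 165 MHz
clock_1080p60 = pixel_clock_mhz(2200, 1125, 60)   # 148.5

# 3840x2160@30: CEA totals 4400x2250 -> 297 MHz, far over the old limit
clock_uhd30 = pixel_clock_mhz(4400, 2250, 30)     # 297.0

print(clock_1080p60 <= ORIGINAL_TMDS_LIMIT_MHZ)   # True
print(clock_uhd30 <= ORIGINAL_TMDS_LIMIT_MHZ)     # False
```

This is why newer HDMI revisions had to raise the TMDS clock before 4K over a single link was possible at all.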

  • gamerk2
    replied
    Originally posted by schmidtbag View Post
    And how exactly do you expect the software to support something the hardware doesn't support?
    Forward thinking specs. Why shouldn't the kernel support 10k resolutions? Maybe someone will announce a 10k TV tomorrow. There's no technical reason to limit screen resolution output within the kernel. Sure, no device on the planet may support them yet, but the modes are there for the day one does.

    Now, if you want to limit what outputs you can use within the graphics card's driver layer, that's fine. But there's NO reason the kernel should be arbitrarily limited in this way. The Kernel should be able to support any possible screen resolution.

    It doesn't matter what the software is capable of if the hardware can't do it, so it's better to not let the drivers say "hey look, I can do 4k screens!" and someone attaches one only to find out it doesn't work due to a hardware limitation.
    Driver != Kernel. If the GPU, the display device, and the transport layer can all handle 4K output, then it should be allowed as an output. Even if this isn't the case, the Kernel should still have the ability to output 4K should those conditions become true.

    This isn't the same thing as having a CPU too slow to play a game, because the game will still run. If you buy a 4K screen because your drivers SAY they can support it, you're going to be pretty unhappy to find out the screen won't even leave standby.
    Again, Driver != Kernel. If the screen doesn't support 4K, the drivers should disable it as an option. The Kernel, however, should be fully capable of handling 4K. And 8K. And 200K. It's the driver layer which specifies what the actual output is going to be for a given device.

    Setting screen resolutions is a lot more complicated than most people are aware of - there's a lot more than width, height, color depth, and refresh rate. It isn't as simple as just flicking a switch and suddenly getting 4K resolutions.
    Which you can also handle. There is no technical reason why the Kernel shouldn't have support for every possible screen resolution, color depth, and refresh rate under the sun. If you want to limit what is output for a connected device in driver land, fine, but the kernel itself shouldn't have any limitations because "it's not supported by hardware yet".

  • TheLexMachine
    replied
    I can tell you this: once those 4K monitors are available next year with the new HDMI 2.0, I will be doing some testing with Ubuntu (maybe Debian too, depending on how I can set things up) for gaming and media with the latest GPUs I can get my grubby little paws on, and I will share the results with you wonderful penguin-loving oddballs.

  • droidhacker
    replied
    Somehow, I suspect that most of this "4k support" stuff involves two considerations:
    1) That the necessary protocols (e.g., HDMI 2.0, DP 1.2) are actually implemented in software, and
    2) Making sure that things aren't *broken* in ways that just weren't discovered with conventional displays. I.e., things that divide the screen into 4 quadrants and switch them around would be considered "broken".

  • 89c51
    replied
    Now that I think about it, Michael, you should buy a 4K monitor to test what the devs claim works.

  • schmidtbag
    replied
    Originally posted by gamerk2 View Post
    So? That's hardware, not software. The software should support any resolution setting under the sun. It's the drivers/hardware which need to be concerned about bandwidth, connectors, and whatnot.
    And how exactly do you expect the software to support something the hardware doesn't support? If every software group thought that way, the computer industry would be destroyed.

    It doesn't matter what the software is capable of if the hardware can't do it, so it's better to not let the drivers say "hey look, I can do 4k screens!" and someone attaches one only to find out it doesn't work due to a hardware limitation.

    This isn't the same thing as having a CPU too slow to play a game, because the game will still run. If you buy a 4K screen because your drivers SAY they can support it, you're going to be pretty unhappy to find out the screen won't even leave standby.

    Setting screen resolutions is a lot more complicated than most people are aware of - there's a lot more than width, height, color depth, and refresh rate. It isn't as simple as just flicking a switch and suddenly getting 4K resolutions.
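To illustrate the point, a display mode really does carry far more than width, height, and refresh rate: the sync pulse positions and blanking intervals are part of the timing, and the refresh rate falls out of the full totals rather than being set directly. A minimal sketch, with field layout loosely modeled on a kernel-style mode struct and populated with the standard CEA-861 1080p60 timings:

```python
from dataclasses import dataclass

@dataclass
class DisplayMode:
    # Horizontal timing, in pixels: active width, sync start, sync end, total
    hdisplay: int
    hsync_start: int
    hsync_end: int
    htotal: int
    # Vertical timing, in lines: active height, sync start, sync end, total
    vdisplay: int
    vsync_start: int
    vsync_end: int
    vtotal: int
    clock_khz: int  # pixel clock in kHz

    def refresh_hz(self) -> float:
        # Refresh rate is derived from the full timings, including blanking
        return self.clock_khz * 1000 / (self.htotal * self.vtotal)

# CEA-861 1920x1080@60 timings: note the blanking (2200x1125 total vs
# 1920x1080 active) that a naive "width x height x refresh" view misses.
mode_1080p60 = DisplayMode(1920, 2008, 2052, 2200,
                           1080, 1084, 1089, 1125,
                           clock_khz=148500)
print(mode_1080p60.refresh_hz())  # 60.0
```

Get any one of those timing fields wrong and the monitor may refuse the signal entirely, which is part of why new resolutions need real validation.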

  • gamerk2
    replied
    Originally posted by schmidtbag View Post
    Because they demand more bandwidth than your average video signal is capable of. Even driving two lower-resolution monitors will increase your GPU clocks whether you're rendering anything or not.
    So? That's hardware, not software. The software should support any resolution setting under the sun. It's the drivers/hardware which need to be concerned about bandwidth, connectors, and whatnot.

  • schmidtbag
    replied
    Originally posted by FourDMusic View Post
    Why do 4K monitors/outputs have to be treated specially? I would have thought it simply came down to mode setting in the driver.
    Because they demand more bandwidth than your average video signal is capable of. Even driving two lower-resolution monitors will increase your GPU clocks whether you're rendering anything or not.
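Back-of-the-envelope figures for that bandwidth claim. The link capacities below are the commonly cited ones (HDMI 1.4's 340 MHz maximum TMDS clock, DP 1.2's 17.28 Gbit/s effective payload rate) and the pixel clocks are the standard CEA totals; the comparison is illustrative, not a spec quote:

```python
# Required payload rate = pixel clock x bits per pixel.
# CEA pixel clocks: 3840x2160@30 -> 297 MHz, 3840x2160@60 -> 594 MHz.

BITS_PER_PIXEL = 24  # 8 bits per channel, RGB

def payload_gbps(pixel_clock_mhz):
    """Uncompressed video payload rate in Gbit/s."""
    return pixel_clock_mhz * 1e6 * BITS_PER_PIXEL / 1e9

uhd30 = payload_gbps(297)   # ~7.1 Gbit/s
uhd60 = payload_gbps(594)   # ~14.3 Gbit/s

HDMI_1_4_MAX_TMDS_MHZ = 340   # 4K@30 fits under this clock, 4K@60 does not
DP_1_2_PAYLOAD_GBPS = 17.28   # 4 lanes x 5.4 Gbit/s x 8b/10b encoding

print(f"4K@30 needs {uhd30:.2f} Gbit/s, 4K@60 needs {uhd60:.2f} Gbit/s")
print("4K@60 fits DP 1.2:", uhd60 <= DP_1_2_PAYLOAD_GBPS)    # True
print("4K@60 fits HDMI 1.4:", 594 <= HDMI_1_4_MAX_TMDS_MHZ)  # False
```

Which is roughly why the thread keeps circling back to HDMI 2.0 and DP 1.2: at 4K@60 the link, not the renderer, is the first bottleneck.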

  • 89c51
    replied
    Originally posted by macemoneta View Post
    Seiki Digital SE50UY04 50-Inch 4K UHD 120Hz LED HDTV: $1,118.08 & FREE Shipping.

    Seiki Digital SE39UY04 39-Inch 4K Ultra HD 120Hz LED TV: $699.00 & FREE Shipping.
    http://www.amazon.com/Seiki-Digital-...dp/B00DOPGO2G/
    The ASUS and Sharp ones are around $3,500, which IS steep. And I believe the Samsung and Dell that are coming will be similarly priced.

    At least I know that my card can support this in case they become cheap.
