While someone could design hardware to potentially work with 10k modes or whatever, it's probably not a good use of resources. Support for arbitrary modes comes down to a few things:
- Link support. You need to make sure the spec supports high enough clocks over the link to support the mode. E.g., HDMI uses single-link TMDS, and the original TMDS spec only supported clocks <= 165 MHz; newer versions of HDMI have increased this.
- Clock support. The clock hardware on the ASIC has to be validated against higher clock rates. If there is no equipment that uses the higher clocks, validation is harder and more expensive, all to support something that may or may not come to be during the useful lifetime of the card.
- Line buffer support. Once again, line buffers take die space, and extra die space costs money. You don't want to add and validate more line buffer than you need to cover the range of monitors that will likely appear during the card's useful lifetime.
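The link-clock point above can be made concrete with a rough pixel-clock calculation. This is only a sketch: the htotal/vtotal figures are the standard CEA-861 timings (active pixels plus blanking) for these modes, not anything from this thread.

```python
# Rough pixel-clock check against the single-link TMDS limit mentioned above.
# htotal/vtotal include blanking intervals; the values used here are the
# CEA-861 timings for 1080p and 2160p.
SINGLE_LINK_TMDS_MHZ = 165  # original TMDS spec ceiling

def pixel_clock_mhz(htotal, vtotal, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) * frames/sec."""
    return htotal * vtotal * refresh_hz / 1e6

modes = {
    "1920x1080@60": (2200, 1125, 60),
    "3840x2160@30": (4400, 2250, 30),
    "3840x2160@60": (4400, 2250, 60),
}

for name, (ht, vt, hz) in modes.items():
    clk = pixel_clock_mhz(ht, vt, hz)
    verdict = "fits" if clk <= SINGLE_LINK_TMDS_MHZ else "exceeds"
    print(f"{name}: {clk:.1f} MHz -> {verdict} 165 MHz single-link TMDS")
```

1080p60 comes out at 148.5 MHz, just under the original 165 MHz ceiling; 4K at 60 Hz needs 594 MHz, which is why the link spec, not just the software, has to move first.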
Also, the current kernel and even older kernels can handle 4k timings just fine using the standard mode structs. What got added in 3.12 was support for fetching 4K modes from the extended tables in the monitor's EDID.
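For reference, the "extended tables" are the HDMI Vendor-Specific Data Block carried in the EDID's CEA extension: HDMI 1.4 defines four extended 4K modes there, keyed by an HDMI_VIC code. A minimal sketch of that mapping (the codes and timings come from the HDMI 1.4 spec; the dict and function names are hypothetical, not kernel code):

```python
# HDMI 1.4 extended 4K modes, keyed by HDMI_VIC as carried in the
# HDMI Vendor-Specific Data Block (VSDB) of the EDID's CEA extension.
HDMI14_VIC_MODES = {
    1: (3840, 2160, 30),  # 3840x2160 @ 30 Hz
    2: (3840, 2160, 25),  # 3840x2160 @ 25 Hz
    3: (3840, 2160, 24),  # 3840x2160 @ 24 Hz
    4: (4096, 2160, 24),  # 4096x2160 @ 24 Hz (SMPTE/cinema 4K)
}

def mode_from_hdmi_vic(vic):
    """Return (width, height, refresh) for an HDMI_VIC code, or None."""
    return HDMI14_VIC_MODES.get(vic)

print(mode_from_hdmi_vic(1))  # (3840, 2160, 30)
```

What landed in 3.12 was the parsing of this block so these modes show up in the mode list; the mode structs themselves could already describe such timings.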
Last edited by agd5f; 10 September 2013, 10:21 AM.
And how exactly do you expect the software to support something the hardware doesn't support?
Forward-thinking specs. Why shouldn't the kernel support 10k resolutions? Maybe someone will announce a 10k TV tomorrow. There's no technical reason to limit screen resolution output within the kernel. Sure, no device on the planet may support them yet, but the support will be there for the day one does.
Now, if you want to limit what outputs you can use within the graphics card's driver layer, that's fine. But there's NO reason the kernel should be arbitrarily limited in this way. The kernel should be able to support any possible screen resolution.
It doesn't matter what the software is capable of if the hardware can't do it, so it's better to not let the drivers say "hey look, I can do 4k screens!" and someone attaches one only to find out it doesn't work due to a hardware limitation.
Driver != Kernel. If the GPU can handle 4k output, if the device can handle 4k output, and the transport layer can handle 4k output, then it should be allowed as an output. Even if this isn't the case, the Kernel should have the ability to output 4k should these conditions become true.
This isn't the same thing as having a CPU too slow to play a game, because the game will still run. If you buy a 4K screen because your drivers SAY they can support it, you're going to be pretty unhappy to find out the screen won't even leave standby.
Again, Driver != Kernel. If the screen doesn't support 4k, the drivers should disable it as an option. The kernel, however, should be fully capable of handling 4k. And 8k. And 200k. It's the driver layer that specifies what the actual output is going to be for a given device.
Setting screen resolutions is a lot more complicated than most people are aware of - there's a lot more than width, height, color depth, and refresh rate. It isn't as simple as just flicking a switch and suddenly getting 4K resolutions.
Which you can also handle. There is no technical reason why the kernel shouldn't have support for every possible screen resolution, color depth, and refresh rate under the sun. If you want to limit what is output for a connected device in driver land, fine, but the kernel itself shouldn't have any limitations because "it's not supported by hardware yet".
I can tell you this: once those 4K monitors are available next year with the new HDMI 2.0, I will be doing some gaming and media testing with Ubuntu (maybe Debian too, depending on how I can set things up) on the latest GPUs I can get my grubby little paws on, and I will share the results with you wonderful penguin-loving oddballs.
Somehow, I suspect that most of this "4k support" stuff involves two considerations:
1) That the necessary protocols (i.e. HDMI 2.0, DP 1.2) are actually implemented in software,
2) Making sure that things aren't *broken* in ways that just weren't discovered with conventional displays. E.g., something that divides the screen into 4 quadrants and swaps them around would be considered "broken".
So? That's hardware, not software. The software should support any resolution setting under the sun. It's the drivers/hardware which need to be concerned about bandwidth, connectors, and whatnot.
Because they demand more bandwidth than your average video link can carry. Even driving 2 lower-resolution monitors will increase your GPU clocks whether you're rendering anything or not.
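One concrete cost behind this: even an idle desktop has to scan the framebuffer out to the display every refresh, so resolution directly consumes memory bandwidth whether anything is being rendered or not. A back-of-the-envelope sketch (32-bit pixels assumed; real hardware adds blanking, overlays, and other details):

```python
def scanout_gbps(width, height, refresh_hz, bytes_per_pixel=4):
    """Bytes read from the framebuffer per second just to drive the display."""
    return width * height * bytes_per_pixel * refresh_hz / 1e9

# Display scanout costs memory bandwidth even with nothing rendering.
print(f"1080p60:    {scanout_gbps(1920, 1080, 60):.2f} GB/s")
print(f"2x 1080p60: {2 * scanout_gbps(1920, 1080, 60):.2f} GB/s")
print(f"4K60:       {scanout_gbps(3840, 2160, 60):.2f} GB/s")
```

A single 4K60 head needs roughly 2 GB/s of sustained framebuffer reads, about four times 1080p60, which is one reason GPUs raise their memory clocks when bigger or additional displays are attached.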
Why do 4K monitors/outputs have to be treated specially? I would have thought it simply came down to mode setting in the driver.