How to disable underscan when using KMS?
-
Originally posted by pingufunkybeat: Nope, it only offers "Wide", "Zoom" and "4:3". No "Screen Fit"; it was probably added in later models.
-
Originally posted by droidhacker: No offence, Alex, we all really appreciate all the work you've done and continue to do on the radeon driver, but this policy of "always underscan" is just plain broken. As I've mentioned, it isn't just the black borders that are the problem but, more importantly, the fact that the output is being scaled down.
Originally posted by droidhacker: I guess the main question is whether the specific panel has a full 1920x1080 visible area. I believe that all new panels do (regardless of whether it is possible to map a 1080p input signal directly to the native resolution); possibly some older panels do not, and overscan by cropping the edges off an image that is too large for the native resolution of the panel (which, I believe, is essentially how CRTs work...). In these cases, I believe a better way to underscan would be to apply some voodoo in the X server... by voodoo, I mean that the underscan mode should not scale the image; rather, the desktop resolution should be smaller than the output resolution (i.e. 1856x1016), and the desktop image should be extended to the output resolution by adding a black border around the outside.
This would have the desired underscan effect without the side effect of scaling the image (which is extremely ugly). This would probably be best implemented as part of the X server rather than the GPU driver, which would also make it simpler for the user to adjust the degree of underscan and other properties of the image, such as alignment... or, I suppose, it could be implemented in the GPU driver, but then it might be a little more awkward to configure.
If you set a cropped mode like 1856x1016 as you suggested, we'd get a ton of bug reports about the driver not detecting the proper mode: "my TV is 1920x1080, but the driver detects 1856x1016", etc.
As I've said before, there is no good solution to this problem that works for every case.
-
Originally posted by agd5f: Just because it's broken for you doesn't mean it's broken for everyone. The average user sees an image with the edges cut off on their TV and thinks it's a driver bug. If you don't scale, you lose pixels.
I.e., you are breaking the driver to sort-of (hackishly) fix hardware that really isn't designed to do this, whereas there IS a lot of hardware that IS designed to be able to do this.
I'm really not certain how you can justify breaking support for COMPATIBLE hardware in favor of INCOMPATIBLE hardware, because that is what is happening.
Panels have fixed timing and a fixed number of pixels in the panel. Unless the vendor decided to hide some of the pixels behind the bezel, they are all visible on the screen.
For every mode other than the native mode, TVs use a scaler to scale the image so that the panel always displays the native timing. For overscan, the TV scales the image up slightly; if the panel is 1920x1080 and you feed it 1280x720, it scales the image up to 1920x1080, etc.
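To put rough numbers on the compensation involved (a hypothetical sketch, not anything the driver actually does: the function name, the 3% overscan figure, and the round-down-to-a-multiple-of-8 step are all illustrative assumptions):

```python
def underscan_resolution(width, height, overscan_pct):
    """Return a reduced desktop size that, once the TV scales it back up
    by the given overscan percentage, roughly fills the visible panel."""
    scale = 1.0 - overscan_pct / 100.0
    # Round down to multiples of 8, as display modes commonly require.
    w = int(width * scale) // 8 * 8
    h = int(height * scale) // 8 * 8
    return w, h

# In the same ballpark as the 1856x1016 figure mentioned earlier in the thread:
print(underscan_resolution(1920, 1080, 3.0))  # → (1856, 1040)
```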
Here's the thing though... if the panel has a native mode of 1920x1080 and it ALWAYS overscans, then there is no input in the world that will actually be displayed without some kind of scaling. Feeding it 1920x1080, the input would be scaled up to OVER 1920x1080 and cropped (or cropped first and then scaled)... which appears to be what most newer TVs actually do in their default state (mine does). The ONLY way for the TV to map its input to the panel pixel-to-pixel without any kind of scaling AND STILL achieve overscan is either to have a native panel resolution LOWER than 1920x1080 and crop, OR to hide pixels around the outside.
So... do ALL TVs scale up and crop for overscan? Or do some have a smaller native resolution and JUST crop?
The point is that you don't necessarily KNOW the actual native mode of the panel based on what you get out of the EDID, since the TV can do its own scaling without telling you.
The second point is obviously that it is BETTER to feed an unscaled image to devices that *are* able to map input to the panel 1:1. That includes all TVs where you have the option to display the native mode and a REAL resolution of 1920x1080, as well as all devices that achieve overscan either with a smaller panel resolution or by hiding pixels... and these two options must surely cover the majority of digital TVs, since it would be just braindead to build a TV that CAN'T operate without scaling the image.
Originally posted by agd5f: If you set a cropped mode like 1856x1016 as you suggested, we'd get a ton of bug reports about the driver not detecting the proper mode: "my TV is 1920x1080, but the driver detects 1856x1016", etc. As I've said before, there is no good solution to this problem that works for every case.
The only people it won't satisfy are those who have two TVs with differing input requirements, or those whose TVs have a visible resolution lower than the *reported* native mode but who COULD have 1:1 output. And these people, of course, are already unsatisfied.
At the very least, there really should be support for people whose hardware is actually compatible....
-
Scaling is how both the TV and the driver deal with overscan; it just depends on who does the scaling. It's not some crazy hack; that's how you compensate for overscan if you want to retain the entire image. Before I added the underscan option, users with TVs that overscan were unhappy; now users with TVs that don't overscan are unhappy. You can disable underscan if you don't like it; that's why I added a knob. Now everyone can be happy.
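For anyone hunting for that knob: with the radeon KMS driver it should be exposed as xrandr output properties (a hedged sketch; HDMI-0 is an example output name, and the exact property names on your setup can be checked with `xrandr --prop`):

```shell
# Turn the driver's underscan off entirely (values: off / on / auto):
xrandr --output HDMI-0 --set underscan off

# Or keep underscan but tune the border sizes (in pixels) instead:
xrandr --output HDMI-0 --set "underscan hborder" 32
xrandr --output HDMI-0 --set "underscan vborder" 18
```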
-
Over here in Japan, I'd say 75% of the monitors in the computer shops are 16:9 1080p. All of the ones I have experience with are underscanned for no reason, and not just with the Radeon driver, but with the Catalyst driver as well. With Catalyst you can disable underscan once and forget about it. I support the notion that the Radeon driver should be able to do the same, via one of the methods mentioned above. Doing it every time via xrandr is just a pain. Newer panels simply don't need it.
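One way to avoid redoing the xrandr call by hand every session (a sketch under assumptions: HDMI-0 is an example output name, and your display manager must source ~/.xprofile or an equivalent X startup file):

```shell
# Append the xrandr call to a file run at X session startup,
# so underscan is disabled automatically on every login:
echo 'xrandr --output HDMI-0 --set underscan off' >> ~/.xprofile
```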