How to disable underscan when using KMS?


  • #21
    Originally posted by Pallokala
    I would be happy with a (temporary) module parameter "underscan" which, with e.g. "underscan=0", would disable underscan.
    ... radeon.underscan=off would be a nice option on the kernel command line.
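
    For illustration, that hypothetical parameter (the driver doesn't actually expose it today; this is just what the proposal would look like) could go on the boot line:

        radeon.underscan=off

    or, persistently, in modprobe configuration:

        # /etc/modprobe.d/radeon.conf -- hypothetical option, per the proposal above
        options radeon underscan=off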

    • #22
      Originally posted by pingufunkybeat
      Nope, it only offers "Wide", "Zoom" and "4:3". No "Screen Fit"; it was probably added in later models.
      On my Samsung LE40A656 the correct option to disable over-/underscan is to set picture size to "Just scan", and it's only available on HDMI2.

      • #23
        Originally posted by droidhacker
        No offence Alex, we all really appreciate all the work you've done and continue to do on the radeon driver, but this policy of "always underscan" is just plain broken. As I've mentioned, it isn't just the problem of the black borders, but more importantly, the fact that the output is being scaled down.
        Just because it's broken for you doesn't mean it's broken for everyone. The average user sees an image with the edges cut off on their TV and thinks it's a driver bug. If you don't scale, you lose pixels.

        Originally posted by droidhacker
        I guess the main question is whether the specific panel has a full 1920x1080 visible area. I believe that all new panels do (regardless of whether it is possible to map a 1080p input signal directly to native resolution); possibly some older panels do not, and overscan by cropping the edges off an image that is too large for the native resolution of the panel (which, I believe, is essentially how CRTs work...). In these cases, I believe that a better method to underscan would be to apply some voodoo to the xserver... by voodoo, I mean that the underscan mode should not scale the image, but rather the desktop resolution should be smaller than the output resolution (e.g. 1856x1016), and the desktop image should be extended to the output resolution by adding a black border around the outside.

        This would have the desired underscan effect without the side effect of scaling the image (which is extremely ugly). Now this would probably be best implemented as part of the xserver rather than the GPU driver, and would make it simpler for the user to adjust the degree of underscan and other properties of the image, such as alignment... or, I suppose that it could be implemented in the GPU driver, but then it might be a little more awkward to configure.
        Panels have fixed timing and a fixed number of pixels in the panel. Unless the vendor decided to hide some of the pixels behind the bezel, they are all visible on the screen. For every mode other than the native mode, TVs use a scaler to scale the image so that the panel always displays the native timing. For overscan, the TV scales the image up slightly; if the panel is 1920x1080 and you feed it 1280x720, it scales the image up to 1920x1080, etc.

        If you set some cropped mode like 1856x1016 like you suggested, we'd get a ton of bug reports about the driver not detecting the proper mode, "my tv is 1920x1080, but the driver detects 1856x1016", etc.

        As I've said before, there's no good solution to this problem that works for every case.
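
        (For what it's worth, the knob is a per-output xrandr property, and if I remember the property names right the border size is adjustable too -- "HDMI-0" is just an example output name, and 32-pixel borders give exactly the 1856x1016 active area mentioned above:)

            xrandr --output HDMI-0 --set underscan on
            xrandr --output HDMI-0 --set "underscan hborder" 32   # 1920 - 2*32 = 1856
            xrandr --output HDMI-0 --set "underscan vborder" 32   # 1080 - 2*32 = 1016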

        • #24
          Originally posted by agd5f
          Just because it's broken for you doesn't mean it's broken for everyone. The average user sees an image with the edges cut off on their TV and thinks it's a driver bug. If you don't scale, you lose pixels.
          Saving pixels at the expense of having to scale... not a good tradeoff since then NOBODY can get a nice clear image... and that *is* broken.

          I.e., you are breaking the driver to sort-of (hackishly) fix hardware that really isn't designed to do this, whereas there IS a lot of hardware that IS designed to be able to do this.

          I'm really not certain how you can justify breaking support for COMPATIBLE hardware in favor of INCOMPATIBLE hardware, because that is what is happening.

          Panels have fixed timing and a fixed number of pixels in the panel. Unless the vendor decided to hide some of the pixels behind the bezel, they are all visible on the screen.
          That fact is not in dispute. The issue is regarding what the panel *actually* is, which is not necessarily related to the input you feed into it. If the panel itself is ACTUALLY 1856x1016 (or whatever), it doesn't mean that you can't feed it 1920x1080. The TV will do its voodoo and output whatever it feels like -- which is actually a SMARTER way to do overscan since it doesn't require scaling to begin with. It results in better picture quality than you would get by taking a 1920x1080 input, cropping the edges, and scaling it back up to 1920x1080.

          For every mode other than the native mode, TVs use a scaler to scale the image so that the panel always displays the native timing. For overscan, the TV scales the image up slightly; if the panel is 1920x1080 and you feed it 1280x720, it scales the image up to 1920x1080, etc.
          Right...

          Here's the thing though... if the panel has a native mode of 1920x1080 and it ALWAYS overscans, then there is no input in the world that will actually be displayed without some kind of scaling. Feeding it 1920x1080, the input would be scaled up to OVER 1920x1080 and cropped (or cropped first and then scaled)... which appears to be what most newer TVs actually do in their default state (mine does). The ONLY way for the TV to map its input to the panel pixel-to-pixel without any kind of scaling AND STILL achieve overscan is to either have a native panel resolution LOWER than 1920x1080 and crop, OR to hide pixels around the outside.

          So... do ALL TVs scale up and crop for overscan? Or do some have a smaller native resolution and JUST crop?

          The point is that you don't necessarily KNOW the actual native mode of the panel based on what you get out of the EDID, since the TV can do its own scaling without telling you.

          The second point is obviously that it is BETTER to feed in some form of unscaled image for devices that *are* able to map input to panel 1:1, which includes all TVs where you have the option to display the native mode and have a REAL resolution of 1920x1080, as well as all devices that achieve overscan with either a smaller panel resolution or hidden pixels... and these two options certainly must cover the majority of digital TVs, since it would be just braindead to develop a TV that CAN'T operate without scaling the image.

          If you set some cropped mode like 1856x1016 like you suggested,
          That's not quite what I suggested. I suggested sending the TV an actual 1080p that IS underscanned, but withOUT scaling anything, and there are two ways to accomplish this that I can see; either by the driver presenting a FAKE SMALLER MODE and performing voodoo, *OR* by getting the xserver to deal with it.

          we'd get a ton of bug reports about the driver not detecting the proper mode, "my tv is 1920x1080, but the driver detects 1856x1016", etc.
          Invalid bug reports are part of the business, aren't they? I figure that as long as it is documented clearly, it should be obvious to anyone but the most complete morons.

          As I've said before, there's no good solution to this problem that works for every case.
          True, you can never totally satisfy everyone, but the way it is now vs. 2.6.35 and older is probably 45/45/10 (with that last 10 not liking either option). You can get that up to 90/10 simply by giving the user the ability to choose one or the other. And it is EASY... and can allow the DEFAULT behavior to remain as it is right now... just a simple module parameter to globally disable underscan. That would satisfy everyone with a proper display that can operate without any form of scaling or cropping, and everyone who doesn't mind scaled input.

          The only people it won't satisfy are those who have two TVs with differing input requirements, or those whose TVs have a visible resolution lower than the *reported* native mode but who COULD have 1:1 output. And these people, of course, are already unsatisfied.

          At the very least, there really should be support for people whose hardware is actually compatible...

          • #25
            Scaling is how both the TV and the driver deal with overscan; it just depends on who does the scaling. It's not some crazy hack. That's how you compensate for it if you want to retain the entire image. Before I added the underscan option, users with TVs with overscan were unhappy, and now users with TVs without overscan are unhappy. You can disable underscan if you don't like it; that's why I added a knob. Now everyone can be happy.
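
            (Something like this should show and flip the knob -- HDMI-0 is an assumed output name:)

                xrandr --prop | grep -i underscan           # list the property and its current value
                xrandr --output HDMI-0 --set underscan off  # values: off, on, auto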

            • #26
              I would be happy if I could disable it in some permanent way in xorg.conf, not with xrandr every time I start X.
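
              (One workaround in the meantime: put the xrandr call in a file that runs once at session start, such as ~/.xprofile or ~/.xinitrc -- output name HDMI-0 assumed:)

                  # ~/.xprofile -- runs once when the X session starts
                  xrandr --output HDMI-0 --set underscan off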

              • #27
                +1 for a kernel parameter; I often don't go into X at all...

                • #28
                  Over here in Japan, I'd say 75% of the monitors in the computer shops are 16:9 1080p. All of the ones I have experience with have issues with being underscanned for no reason. This is not just on the Radeon driver, but on the Catalyst driver as well. With Catalyst you can disable underscan once and forget about it. I support the notion that the Radeon driver should be able to do the same, via one of the methods mentioned above. Doing it every time via xrandr is just a pain. Newer panels just don't need it.

                  • #29
                    At least you can recompile your kernel with all instances of "UNDERSCAN_AUTO" replaced with "UNDERSCAN_OFF"... but that IS a REAL PAIN.
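
                    (Roughly, assuming a 2.6.36-era tree at /usr/src/linux where the defaults are assigned in radeon_connectors.c. A blind global replace would probably also mangle the enum definition and the tests for AUTO, so it's safer to touch only the default assignment:)

                        cd /usr/src/linux
                        # flip the driver's default from auto-underscan to off, then rebuild as usual
                        sed -i 's/underscan_type = UNDERSCAN_AUTO/underscan_type = UNDERSCAN_OFF/' \
                            drivers/gpu/drm/radeon/radeon_connectors.c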


                    Broken, broken, broken...

                    • #30
                      Note: Everybody who has this problem should file a bug report somewhere it will actually be read, like the kernel.org or freedesktop.org bug trackers.
