How to disable underscan when using kms?

  • #11
    Wouldn't a kernel option that sets the *default* behavior (but still allows overriding with xrandr) catch most of these problems?
    I can't imagine there being many users with multiple HDMI screens that require different underscan settings, much less users in that group that drive these monitors with framebuffer.

    Of course sysfs sounds like a cleaner solution, but it's also more complicated to implement and use.

    • #12
      Originally posted by agd5f View Post
      Most TVs, if not all, underscan by default, since cable and satellite assume overscan and there's often garbage at the edges of the blanking areas due to VBI data. On the good ones you can turn it off, but most average users don't even know what underscan is, and before I added the feature I got tons of bugs along the lines of "radeon crops image" or "radeon chooses wrong mode", etc. Unfortunately, there is no way for the driver to know whether the TV is overscanning or not.

      The problem with a module parameter is that it's global: it would either apply to all digital outputs, or you'd need some hacky way to tell it which connectors it should apply to. I'm sure there are users out there who would want it disabled on one connector but enabled on another, etc. As I've mentioned before, the best solution would be to expose the drm connector properties (tv standard, underscan, scaling, etc.) generically through sysfs. It shouldn't be too hard a task; we already have code to expose the connectors and dpms info via sysfs, so it would just need to be extended. Unfortunately, I won't have time to tackle it for a while.
      Giving the option to apply to ALL connectors is better than not giving any option to apply to ANY connectors... especially since underscan applies mainly (only?) to TVs, and *most* users would only have ONE TV connected at a time.
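      In the meantime, the per-connector drm properties mentioned above can already be adjusted from a running X session via xrandr. A rough sketch (the output name HDMI-0 is a placeholder; check `xrandr --props` for the actual output and property names your driver exposes):

```shell
# List the connector properties the driver exposes (look for "underscan")
xrandr --props

# Disable underscan on one digital output only; other outputs keep
# their own settings. HDMI-0 is a placeholder output name.
xrandr --output HDMI-0 --set underscan off
```

      Because the property is per-connector, this sidesteps the "global module parameter" problem entirely.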


      @pingufunkybeat: Is there a button on your remote that says "P.SIZE"? It should have an option to set to "Screen Fit" when you feed it a 720p or higher signal. I took a random sampling of Samsung TV manuals, and they all showed this option.

      • #13
        Originally posted by egon2003 View Post
        How old is your LCD TV?
        About 3 years, roughly.

        And are you 100% sure you don't have that setting? It can sometimes have a very confusing name in the TV's menus.
        Pretty sure. All the Samsung-related menu settings I found on the internet either didn't exist or were disabled.

        And most internet threads revolve around "It's a TV, not a monitor, and it's not supposed to be exact to a pixel".

        • #14
          Originally posted by droidhacker View Post
          @pingufunkybeat: Is there a button on your remote that says "P.SIZE"? It should have an option to set to "Screen Fit" when you feed it a 720p or higher signal. I took a random sampling of Samsung TV manuals, and they all showed this option.
          Nope, it only offers "Wide", "Zoom" and "4:3". No "Screen Fit"; it was probably added in later models.

          • #15
            I have a Samsung TV as well, and there is no such button on the remote; the option to disable underscan was well hidden and poorly named, deep in the display configuration menus.

            • #16
              I have an LG plasma screen here, and the default setting depends on the naming of the inputs. An HDMI input labelled "PC" shows things differently, and has different options enabled, than one labelled "Game" or "DVD".

              • #17
                _ALL_ new TVs I have seen have a PC mode or something similar that gives an unscaled picture. THAT is a fact. This automatic underscan is just stupid now.

                • #18
                  Originally posted by tiltkoko View Post
                  _ALL_ new TVs I have seen have a PC mode or something similar that gives an unscaled picture. THAT is a fact.
                  I'll believe it when I see it. Is anyone enforcing it? Probably not. Also, it doesn't change the fact that they still default to overscan, and it's often a complex maze of menus to disable it.

                  • #19
                    I would be happy with a (temporary) module parameter "underscan", which with e.g. "underscan=0" would disable underscan.
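                    To be clear, no such parameter exists in the radeon driver today; this is only what usage might look like if the proposal above were implemented (the parameter name is hypothetical):

```shell
# HYPOTHETICAL -- the radeon driver does not currently accept this option.
# As a module option at load time:
modprobe radeon underscan=0

# Or, for a driver built into the kernel, on the kernel command line:
#   radeon.underscan=0
```

                    As discussed earlier in the thread, such a parameter would be global, i.e. it would apply the same default to every digital output.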

                    • #20
                      No offence, Alex; we all really appreciate all the work you've done and continue to do on the radeon driver, but this policy of "always underscan" is just plain broken. As I've mentioned, it isn't just the problem of the black borders but, more importantly, the fact that the output is being scaled down.

                      I guess the main question is whether the specific panel has a full 1920x1080 visible area. I believe that all new panels do (regardless of whether it is possible to map a 1080p input signal directly to native resolution); possibly some older panels do not, and overscan by cropping the edges off an image that is too large for the native resolution of the panel (which, I believe, is essentially how CRTs work). In these cases, I believe a better method to underscan would be to apply some voodoo in the X server. By voodoo, I mean that the underscan mode should not scale the image; rather, the desktop resolution should be smaller than the output resolution (e.g. 1856x1016), and the desktop image should be extended to the output resolution by adding a black border around the outside.

                      This would have the desired underscan effect without the side effect of scaling the image (which is extremely ugly). Now this would probably be best implemented as part of the xserver rather than the GPU driver, and would make it simpler for the user to adjust the degree of underscan and other properties of the image, such as alignment... or, I suppose that it could be implemented in the GPU driver, but then it might be a little more awkward to configure.
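                      For comparison, the driver's existing underscan implementation does let you tune the border size through its connector properties, although, unlike the scheme proposed above, the image is scaled down to fit inside the border rather than left at 1:1. A sketch, with HDMI-0 as a placeholder output name and property names as exposed by the radeon KMS driver (verify with `xrandr --props`):

```shell
# Force underscan on and pick the border thickness in pixels.
# The image is scaled into roughly
# (width - 2*hborder) x (height - 2*vborder), so this does NOT
# implement the unscaled-border proposal above.
xrandr --output HDMI-0 --set underscan on \
       --set "underscan hborder" 32 \
       --set "underscan vborder" 32
```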
