System: MSI Megabook, Intel 945GM / GMA950 integrated graphics.
The dream: tear-free video playback.
For those who've not seen tearing / don't know what it is, perhaps you'd best stop reading, lest you find out your system suffers from it as well.
As with some other problems, once you notice this one you'll always notice it, and it will really annoy you where perhaps it didn't before.
First question: if you have a laptop with Intel GMA950 graphics and you know tearing when you see it, do you see it on your laptop also?
In short: I've had this laptop for over 2 years and have messed with linux since 10, so I know more about modelines than all the youngsters who shout "you should switch to Ubuntu" do (I'm using Mandriva, thanks for asking), but apparently not enough...
The problem:
Video playback shows tearing, which as usual is only clearly visible during sideways panning camera shots or when large objects move sideways.
This happens with mplayer and xine, with both the opengl and xv output drivers; naturally the problem is alleviated when I go to slow-motion playback, and at pause it is of course absent.
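For concreteness, these are the kinds of invocations I've been testing with (the file name is just a placeholder):
Code:
# Xv output
mplayer -vo xv somevideo.avi

# OpenGL output, with vsync on buffer swaps requested
mplayer -vo gl:swapinterval=1 somevideo.avi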
The things I've tried:
- changing display refresh rates
- playing with the relatively new LVDSFixedMode option - see below for the reasoning and manpage excerpt
- trying to select only the external VGA connection (didn't manage; a rough sketch of what I tried is just below)
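For the VGA-only attempt I went through xrandr at runtime (output names are whatever xrandr reports on your system; LVDS/VGA are the usual names with this driver, but check yours):
Code:
# list the outputs the driver exposes and their current modes
xrandr -q

# try to drive only the external VGA connector and shut off the internal panel
xrandr --output VGA --auto --output LVDS --off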
First indication of LVDSFixedMode in the man page (man intel):
Code:
Option "LVDSFixedMode" "boolean" Use a fixed set of timings for the LVDS output, independent of normal xorg specified timings. The default value if left unspecified is true, which is what you want for a normal LVDS- connected LCD type of panel. If you are not sure about this, leave it at its default, which allows the driver to automati‐ cally figure out the correct fixed panel timings. See further in the section about LVDS fixed timing for more information.
Seems like that's what I may need; later there is more info on this:
Code:
HARDWARE LVDS FIXED TIMINGS AND SCALING

Following here is a discussion that should shed some light on the nature and reasoning behind the LVDSFixedMode option.

Unlike a CRT display, an LCD has a "native" resolution corresponding to the actual pixel geometry. A graphics controller under all normal circumstances should always output that resolution (and timings) to the display. Anything else and the image might not fill the display, it might not be centered, or it might have information missing - any manner of strange effects can happen if an LCD panel is not fed with the expected resolution and timings.

However there are cases where one might want to run an LCD panel at an effective resolution other than the native one. And for this reason, GPUs which drive LCD panels typically include a hardware scaler to match the user-configured frame buffer size to the actual size of the panel. Thus when one "sets" his/her 1280x1024 panel to only 1024x768, the GPU happily configures a 1024x768 frame buffer, but it scans the buffer out in such a way that the image is scaled to 1280x1024 and in fact sends 1280x1024 to the panel. This is normally invisible to the user; when a "fuzzy" LCD image is seen, scaling like this is why this happens.

In order to make this magic work, this driver logically has to be configured with two sets of monitor timings - the set specified (or otherwise determined) as the normal xorg "mode", and the "fixed" timings that are actually sent to the monitor. But with xorg, it's only possible to specify the first user-driven set, and not the second fixed set.

So how does the driver figure out the correct fixed panel timings? Normally it will attempt to detect the fixed timings, and it uses a number of strategies to figure this out. First it attempts to read EDID data from whatever is connected to the LVDS port. Failing that, it will check if the LVDS output is already configured (perhaps previously by the video BIOS) and will adopt those settings if found. Failing that, it will scan the video BIOS ROM, looking for an embedded mode table from which it can infer the proper timings. If even that fails, then the driver gives up, prints the message "Couldn't detect panel mode. Disabling panel" to the X server log, and shuts down the LVDS output.

Under most circumstances, the detection scheme works. However there are cases when it can go awry. For example, if you have a panel without EDID support and it isn't integral to the motherboard (i.e. not a laptop), then odds are the driver is either not going to find something suitable to use or it is going to find something flat-out wrong, leaving a messed up display. Remember that this is about the fixed timings being discussed here and not the user-specified timings which can always be set in xorg.conf in the worst case. So when this process goes awry there seems to be little recourse. This sort of scenario can happen in some embedded applications.

The LVDSFixedMode option is present to deal with this. This option normally enables the above-described detection strategy. And since it defaults to true, this is in fact what normally happens. However if the detection fails to do the right thing, the LVDSFixedMode option can instead be set to false, which disables all the magic. With LVDSFixedMode set to false, the detection steps are skipped and the driver proceeds without a specified fixed mode timing. This then causes the hardware scaler to be disabled, and the actual timings then used fall back to those normally configured via the usual xorg mechanisms.

Having LVDSFixedMode set to false means that whatever is used for the monitor's mode (e.g. a modeline setting) is precisely what is sent to the device connected to the LVDS port. This also means that the user now has to determine the correct mode to use - but it's really no different than the work for correctly configuring an old-school CRT anyway, and the alternative if detection fails will be a useless display.

In short, leave LVDSFixedMode alone (thus set to true) and normal fixed mode detection will take place, which in most cases is exactly what is needed. Set LVDSFixedMode to false and then the user has full control over the resolution and timings sent to the LVDS-connected device, through the usual means in xorg.
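So with LVDSFixedMode off, I'm supposed to hand the driver the panel's native timings myself. This is the sort of thing I've been feeding it - assuming a 1280x800 native panel purely for illustration, with the modeline being generic cvt output rather than the panel's real timings:
Code:
# generate a candidate modeline for the (assumed) native resolution
$ cvt 1280 800 60
Modeline "1280x800_60.00"   83.50  1280 1352 1480 1680  800 803 809 831 -hsync +vsync

Section "Monitor"
    Identifier "LVDS panel"
    # cvt-generated modeline from above; the real panel may want the exact
    # timings from its datasheet / EDID instead
    Modeline   "1280x800_60.00" 83.50 1280 1352 1480 1680 800 803 809 831 -hsync +vsync
    Option     "PreferredMode" "1280x800_60.00"
EndSection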
Well, seems like that could explain what goes wrong for me.
Now the problem is: if I turn off LVDSFixedMode, I can't seem to get any modeline accepted by the driver. The xorg log file just reports whatever DDC-found modelines there are and uses those, yet the resulting X still shows tearing; and without DDC, xorg runs into the 'no screens found' situation...
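For what it's worth, the direction I'm poking at now is binding an explicit Monitor section to the LVDS output and giving it sync ranges, so the server has something to go on when DDC yields nothing. Identifiers and ranges below are placeholders rather than known-good values for this panel:
Code:
Section "Device"
    Identifier "Intel GMA950"
    Driver     "intel"
    Option     "LVDSFixedMode" "false"
    # bind the Monitor section below to the LVDS output
    # (RandR 1.2 per-output option; the output may be named LVDS1 on other driver versions)
    Option     "Monitor-LVDS" "LVDS panel"
EndSection

Section "Monitor"
    Identifier  "LVDS panel"
    # explicit sync ranges so X has something to work with when DDC fails;
    # generic LCD-ish guesses, not values from the panel's datasheet
    HorizSync   30.0 - 65.0
    VertRefresh 50.0 - 75.0
    # plus the explicit cvt modeline from the earlier example
    Modeline    "1280x800_60.00" 83.50 1280 1352 1480 1680 800 803 809 831 -hsync +vsync
    Option      "PreferredMode" "1280x800_60.00"
EndSection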