INI-style configuration files have the advantage of being both easy to parse and easy to edit by hand. If Wayland doesn't see widespread adoption, it won't be because of an INI file in the reference compositor, which the entire thing can function without.
I was hoping that Wayland/Weston would make better use of KMS and require less configuration than X.
Weston will happily run with default settings (native resolution for the monitors, etc.) without being configured. The problem comes when you want something that isn't the default. For example, X auto-configures my monitors so that the one physically on the left ends up on the right. That is slightly less than desirable, so there is a configuration file that sets it up properly. The Weston config file does something similar.
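For the curious, such a file is plain INI with one [output] section per connector. A sketch along those lines; the keys shown (name, mode, transform) appear in the weston.ini man page, but the connector names and values here are made up for illustration, so check the man page for your Weston version:

```ini
# Illustrative weston.ini fragment: one [output] section per connector.
# Connector names and modes are examples, not a real setup.
[output]
name=HDMI-A-1
mode=1920x1080

[output]
name=LVDS-1
mode=1366x768
transform=normal
```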
Yes, I read the comments, and I have a perfectly valid point here; if you don't like it, that's your problem. But I think the people who comment here are actually the trolls, not me.
In the time you've spent here you could easily have installed wayland + weston and tried it yourself. When I start Weston I get a perfectly fine left/right screen setup, with both displays at their preferred/native resolution.
What do you do instead? asdx just rages for a while.
I don't have a problem with using an INI file. My understanding is that it will be manipulated by software, but of course the settings have to be kept somewhere. My problem is with the rather simplistic view of multi-monitor support that the example INI file shows. Multi-monitor is all about, well, multiple monitors. Each multi-monitor configuration is different from the others. Say I have a notebook. I use it as a single screen most of the time. But when I get to work I plug in an external monitor. When I get back home I plug in a different monitor. The system has to keep track of the different combinations of monitors I have used and reuse the last setup for each particular monitor combination until I change the settings (and yes, the settings include cloning and the placement of the displays relative to each other)! That's what Mac OS X was doing back in 2006 when I first tried it, and I guess it had been doing it for much longer. I'm not saying that we should copy Mac OS X because it's Mac OS X. We should copy the behaviour because it's the best usability-wise.
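The bookkeeping described above is straightforward to sketch: key the saved layout on the *set* of connected monitors, so the same combination always gets its last-used setup back. A minimal illustration in Python; the file name, monitor IDs, and layout schema are all hypothetical (a real compositor would key on EDIDs), not anything Weston actually does:

```python
# Sketch: remember one layout per combination of connected monitors.
import json
from pathlib import Path

CONFIG = Path("monitor-layouts.json")  # hypothetical on-disk store

def combo_key(monitor_ids):
    # Order-independent key: "these monitors are plugged in together".
    return "+".join(sorted(monitor_ids))

def save_layout(monitor_ids, layout):
    # Merge the new layout into whatever combinations we already know.
    store = json.loads(CONFIG.read_text()) if CONFIG.exists() else {}
    store[combo_key(monitor_ids)] = layout
    CONFIG.write_text(json.dumps(store, indent=2))

def load_layout(monitor_ids):
    # Return the last-used layout for this combination, or None.
    if not CONFIG.exists():
        return None
    return json.loads(CONFIG.read_text()).get(combo_key(monitor_ids))
```

Because the key is order-independent, plugging the same two monitors in again (in any order) retrieves the same saved layout, while a new combination simply misses and falls back to defaults.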
Multi-monitor support is a pain in the ass on Linux these days. It seems the basic support is there: you can set up resolutions, although X does not remember them. But the window managers' support is a mess. KWin shoots up to 100% CPU when I plug the external display into my notebook. Same with Gnome 3. There were problems with XFCE and LXDE (although I don't remember their exact nature). The only one that worked reasonably well is Unity. But plug in a different display and the settings for your old display are gone. And, in any case, the display manager has a different idea of what the monitor layout should look like.
I think the multi-monitor layout configuration should be handled by X/Wayland. When the graphics system is started it should be configured the same way as the last time for that particular combination of monitors. Anything that runs on top, including display managers, should just take notice and configure itself accordingly. It really is a pity that Linux multi-monitor support is such a usability nightmare considering that technically the problem is already solved. If only someone would implement the sane behaviour in the right place. And use INI files if he wants to.
Will Wayland/Weston "just work" if my screens are already detected by KMS?
Yes, it will just work.
Originally posted by asdx
Why can't multiple monitors "just work" too when connected? Say I plug in a new monitor: can't Wayland/Weston configure itself and just clone/enlarge the screen in that case?
Yes, "just work" is the very goal here. However, there is no way to know which side of an existing monitor a new monitor sits on, or whether you want a clone or an extended desktop, other than having the user tell us.
Originally posted by asdx
Why do we need an INI file for that?
It is just the first way to let the user tell us what we simply cannot detect.
The defining philosophy of Unix is that "everything is a file." In the context of configurations, this philosophy means that configurable settings should be stored in a file somewhere (as opposed to a Windows Registry, say). This file can be both readable to a person using a text editor or parsable by a GUI (Gnome or KDE Display Settings, for example). Stored in the filesystem, these settings persist between reboots, saving a lot of hassle.
The display settings of monitors attached to a computer certainly warrant configurable options. While KMS might be able to detect the default resolution of a monitor, the user might not want this default, for reasons that shouldn't need to be mentioned. A few:
Personal aesthetic preference
Visual impairment (simply making the icon / text size bigger is often not the solution granny wants)
Spatial (left, right, up, down) ordering of monitors; I trust I don't need to explain why this can't be detected by KMS
Alternatively, whether a certain subset of monitors should be clones of each other
Note that all of these settings are typically controlled from a Display Settings GUI, and I'm sure that a Wayland version of Gnome, KDE, or whatever would rewire its Display Settings GUI to the Wayland configuration file.
... it would also be nice if things just worked for multiple monitors and for most things (by default). Say I attach a new monitor: the system detects it automatically and Weston "adapts" to it on the fly; then I could add that option to the INI file if I wanted to, or on the next reboot KMS already detects the new monitor automatically...
That would be nice, and I assume it will work that way. My guess would be that the default way to handle multiple monitors would be something like this: on boot-up, make a naive spatial ordering of the connected monitors, e.g. from left to right. The leftmost monitor would be the "main" screen, and the others would extend off to the right. Then any other monitors connected after boot-up would be stuck on the far right. This is a simple and perfectly usable default behavior. If you already had some configuration in the config file, this could be changed to
(Defined in .ini) -> Undefined LVDS1 -> Undefined LVDS2 -> etc.
Any extra undefined monitors are stuck to the right of the layout specified by the user's saved configuration. I guess if the saved configuration is 2D, there would have to be some arbitrary choice about the vertical location of these extra screens, but that's trivial.
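That fallback rule fits in a few lines. A toy Python sketch of the idea, assuming the rest of the post; the function name and data shapes are made up for illustration, not Weston code:

```python
# Sketch of the default above: outputs with a saved x-position keep it;
# any unknown outputs are appended off the right edge at y = 0.
def layout_outputs(connected, saved):
    """connected: list of (name, width_px); saved: {name: x_position}."""
    placed = {name: (saved[name], 0) for name, _ in connected if name in saved}
    # Start the undefined monitors after the rightmost defined edge.
    cursor = max((saved[n] + w for n, w in connected if n in saved), default=0)
    for name, width in connected:
        if name not in saved:
            placed[name] = (cursor, 0)
            cursor += width
    return placed
```

For example, with a saved position only for LVDS1, every other connected output is stacked to its right in connection order, which matches the "stuck on the far right" behavior described above.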
EDIT: I think asking the user whether he/she wants to clone or reorient the layout when a new monitor is added would best be a job for the desktop environment. Wayland (Weston?) would pick the default, something like the schematic I made above, but then Gnome/KDE/others could, if they wanted, pop up their Display Settings GUI to let the user change the settings and save them (to the config file) right away.
Can you answer this question from the OS' point of view?
I have 3 video cards and 6 screens. Since that is all the OS knows, please determine the primary screen as well as the correct rotation, orientation, and position of each screen, to prove that the OS can do this auto-magically.
Too hard? How about this one?
I have one video card and 2 screens. Where do you put the second screen in relation to the primary, and at what rotation?
I ABSOLUTELY agree with you: INI is NOT THE SOLUTION to this problem. Something computer parse-able, like XML (written as human friendly as possible) is. Next step is to improve Weston to be able to write this file, and provide some kind of GUI configuration application for Weston. Config files = a plus. Those config files being written by humans (during normal use cases) instead of the application that reads them = fail.