Ah, I don't think you understand. Libinput doesn't interpret the input at all; it just collects events and reports them with very minimal abstraction. The abstractions are on the level of "user 1 pressed the A key" or "user 2 swiped right." Libinput assigns no meaning to the input beyond that, and it knows nothing about your applications or your desktop environment. It's up to GNOME to say "swipe right means switch apps," or for Tinder to say "swipe right means I'm DTF."
This isn't like Wayland, which is a protocol that each compositor has to implement in its own code. An API is actual code that does things; libinput is just lower in the stack than you think it is. I suppose it's likely they'll implement gestures in Weston without any configuration, but then Weston is only intended to be a proof of concept.
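To make the layering concrete, here's a rough sketch of what the compositor side of that API looks like. The entry points (libinput_udev_create_context, libinput_dispatch, the event types) are the real libinput C API, but error handling, polling, and the privilege setup a real compositor needs are elided, so treat this as an illustration of where the meaning gets assigned, not a drop-in program:

```c
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <libinput.h>
#include <libudev.h>

/* libinput delegates opening device nodes to the caller, since
 * that's the part that needs privileges, not libinput itself. */
static int open_restricted(const char *path, int flags, void *data) {
    int fd = open(path, flags);
    return fd < 0 ? -errno : fd;
}
static void close_restricted(int fd, void *data) { close(fd); }

static const struct libinput_interface iface = {
    .open_restricted = open_restricted,
    .close_restricted = close_restricted,
};

int main(void) {
    struct udev *udev = udev_new();
    struct libinput *li = libinput_udev_create_context(&iface, NULL, udev);
    libinput_udev_assign_seat(li, "seat0");

    for (;;) {  /* a real compositor would poll libinput_get_fd() */
        libinput_dispatch(li);
        struct libinput_event *ev;
        while ((ev = libinput_get_event(li)) != NULL) {
            switch (libinput_event_get_type(ev)) {
            case LIBINPUT_EVENT_KEYBOARD_KEY:
                /* libinput only says "this key changed state";
                 * what the key *does* is the compositor's call */
                break;
            case LIBINPUT_EVENT_GESTURE_SWIPE_BEGIN:
                /* "a swipe started" -- whether that switches
                 * workspaces is GNOME's decision, not libinput's */
                break;
            default:
                break;
            }
            libinput_event_destroy(ev);
        }
    }
    /* not reached in this sketch */
    udev_unref(udev);
    return 0;
}
```

Note that nothing in the loop configures gestures or bindings: libinput hands over raw-ish events, and everything above that line is the compositor's problem.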