Relative Pointer Motion Patches Published For Wayland's Weston


  • #21
    Same with pointer lock. This is specific to the desktop, and actually specific to desktop games.
    Pointer lock (assuming we're talking about the same thing) is actually a poor workaround, precisely because on some platforms there was no relative mouse motion vector available. This method resets the pointer to the center of the screen every frame so that 1) it doesn't go off-screen and 2) a motion vector can be computed from the last known position in the previous frame. It never was a 'feature', but a lame hack around incredibly outdated input handling (so severe it should have been considered a bug, IMO).

    The funny thing is that many OpenGL Windows games did the same, because developers didn't want to link against DirectInput (which on Windows was the only way to get the raw vector). In the end this hack worked relatively well, though, and I'm sure there are plenty of games still shipping that use it.
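
    A minimal sketch of that warp-to-center hack, assuming an Xlib window whose center is at (cx, cy); the function name and parameters here are made up for illustration:

    Code:
    #include <X11/Xlib.h>
    
    /* Return the motion since the last frame by re-centering the pointer. */
    static void relative_motion(Display *dpy, Window win, int cx, int cy,
                                int *dx, int *dy) {
            Window root_ret, child_ret;
            int root_x, root_y, win_x, win_y;
            unsigned int mask;
    
            XQueryPointer(dpy, win, &root_ret, &child_ret,
                          &root_x, &root_y, &win_x, &win_y, &mask);
            *dx = win_x - cx; /* offset from the center = motion this frame */
            *dy = win_y - cy;
    
            /* Warp back to the center so the pointer never leaves the window.
             * Real code must ignore the synthetic motion event the warp generates. */
            XWarpPointer(dpy, None, win, 0, 0, 0, 0, cx, cy);
            XFlush(dpy);
    }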



    • #22
      Originally posted by carewolf View Post
      That is nonsense. On touch-based devices, minimize is not invoked on screen, and it is a very central concept in all major touch-based operating systems.
      And I repeat: the minimize action in these UIs is always managed by the compositor itself, not the app, so there is no need for a protocol between the two. See a minimize button on your phone apps? No? There you are.



      • #23
        Originally posted by erendorn View Post
        And I repeat: the minimize action in these UIs is always managed by the compositor itself, not the app, so there is no need for a protocol between the two. See a minimize button on your phone apps? No? There you are.
        You are wrong. Going off screen, just like minimize, is very much an application action. It is requested by the "compositor" (sic), but the compositor still informs the application and often makes specific demands on what the application needs to do. You can't have an off-screen application on a phone running pointless animations in the background, for instance, for exactly the same reason minimize was invented on desktops in the 80s and 90s.



        • #24
          Not sure if it's on purpose, but I think you are mixing two separate issues (the app telling the compositor "minimize me", and the compositor telling the app "you're minimized").

          In one case, the app already knows that it will be minimized; it just needs to know when it can consider itself actually minimized, so that it can stop rendering (if it wants).
          In the other case, the compositor is telling the app "Hey, you're no longer on screen." After that there is no need for the app and the compositor to communicate further. Whether the app stops rendering or not is of no concern to the compositor.

          They are similar; in both cases the app eventually stops rendering. However, only in one case does the app not know implicitly that it is off screen: the desktop scenario. I don't know what additional information needs to be conveyed, or how it would be conveyed, but the two scenarios are not "the same". It may be that it is simply unimportant to implement them in Weston, or it may be that it is so compositor-specific that they want to avoid implementing it in Weston. Either way, it's not for you or me to say "they're stupid for taking so long writing such a simple feature," especially when it's clearly possible (as mentioned before, Enlightenment supports "minimize") and yet neither you nor I have done it ourselves.



          • #25
            I don't know of any applications that minimize themselves. It is usually a button in the window decoration or in the taskbar menu; in both cases, something the compositor would tell the application to do.



            • #26
              Actually, any application using Client Side Decorations would have to request to be minimized, otherwise that "button in the window decoration" would do nothing. And Wayland encourages the use of CSD.
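
              For illustration, here's roughly what such a CSD minimize button has to do under xdg-shell (the names follow the stable protocol; this sketch assumes a wayland-scanner-generated header and an already-created toplevel, and the handler name is made up):

              Code:
              #include "xdg-shell-client-protocol.h" /* generated by wayland-scanner */
              
              struct app {
                      struct xdg_toplevel *toplevel;
              };
              
              /* CSD minimize-button handler: the client can only *request*
               * minimization; the compositor decides what actually happens. */
              static void on_minimize_clicked(struct app *app) {
                      xdg_toplevel_set_minimized(app->toplevel);
              }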



              • #27
                Originally posted by carewolf View Post
                I don't know any applications that minimize themselves. It is usually a button in window decoration or in the taskbar menu, in both cases something the compositor would tell the application to do.
                Oh, there definitely are, and such use is actually very widespread. It is not often a button, that is true. But most messaging applications have an option to start minimized. Or applications that minimize when you click the close button, such as download managers or chat applications. It is stuff like that.

                The same issue exists under X11 for maximizing windows. It is not possible in X to programmatically maximize a window. I've tried countless suggested code samples (some posted by X developers), but seemingly no one knows how to do this *. That's why a lot of Linux applications (such as Blender) just set the window size to the desktop resolution ('faux maximized') and call it a day. In Win32, this functionality is one or two lines of code; the X11 code suggestions I've tried are usually ~100 lines of C and still don't work.

                * If you can post a proof of concept code, it is very welcome. I've never seen a working sample though.
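
                For comparison, the Win32 one-liner alluded to above (hwnd is assumed to be a valid handle to an existing top-level window):

                Code:
                #include <windows.h>
                
                /* Ask the system to maximize an existing window. */
                void maximize(HWND hwnd) {
                        ShowWindow(hwnd, SW_MAXIMIZE);
                }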
                Last edited by Remdul; 04 December 2014, 12:17 PM.



                • #28
                  Sure, here is an example of a basic text editor using Qt4 that will maximize itself:

                  Code:
                  #include <QtGui/QApplication>
                  #include <QtGui/QTextEdit>
                  
                  int main(int argc, char **argv) {
                          QApplication app(argc, argv);
                  
                          // Mark the window as maximized before showing it.
                          QTextEdit textEdit;
                          textEdit.setWindowState(Qt::WindowMaximized);
                          textEdit.show();
                  
                          return app.exec();
                  }
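
                  (If you want to try it with Qt4, something like g++ main.cpp $(pkg-config --cflags --libs QtGui) should build it, assuming pkg-config knows about QtGui.)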



                  • #29
                    And just for fun, here's how it's done with Xlib directly, slightly messier:

                    Code:
                    #include <X11/Xlib.h>
                    #include <X11/Xutil.h>
                    
                    int main() {
                            Display *dpy = XOpenDisplay(NULL);
                            Window w = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy), 0, 0, 200, 100, 0, WhitePixel(dpy, DefaultScreen(dpy)), WhitePixel(dpy, DefaultScreen(dpy)));
                    
                            /* Opt in to the close button so the loop below can exit cleanly. */
                            Atom wm_delete = XInternAtom(dpy, "WM_DELETE_WINDOW", False);
                            XSetWMProtocols(dpy, w, &wm_delete, 1);
                            XMapWindow(dpy, w);
                    
                            /* EWMH: ask the window manager to add both maximized states. */
                            XEvent xev = {0};
                            xev.type = ClientMessage;
                            xev.xclient.window = w;
                            xev.xclient.message_type = XInternAtom(dpy, "_NET_WM_STATE", False);
                            xev.xclient.format = 32;
                            xev.xclient.data.l[0] = 1; /* _NET_WM_STATE_ADD */
                            xev.xclient.data.l[1] = XInternAtom(dpy, "_NET_WM_STATE_MAXIMIZED_VERT", False);
                            xev.xclient.data.l[2] = XInternAtom(dpy, "_NET_WM_STATE_MAXIMIZED_HORZ", False);
                            XSendEvent(dpy, DefaultRootWindow(dpy), False, SubstructureRedirectMask | SubstructureNotifyMask, &xev);
                    
                            /* Run until the user closes the window. */
                            for(;;) {
                                    XEvent e;
                                    XNextEvent(dpy, &e);
                                    if (e.type == ClientMessage && (Atom)e.xclient.data.l[0] == wm_delete)
                                            break;
                            }
                            XDestroyWindow(dpy, w);
                            XCloseDisplay(dpy);
                            return 0;
                    }
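
                    (Builds with gcc main.c -o maximize-demo -lX11.)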



                    • #30
                      Yeah, that's one of the bits of code that doesn't work in my application on XFCE (I copied your code verbatim just to be sure). I think it fails because 'X11/XFCE doesn't guarantee a window to ever become visible, even after forcing it to become visible' (paraphrasing a bizarre comment in the XFCE source). This code may work in the ideal (simple) case, right as the application starts, but not in many others, presumably due to race conditions: the window 'show' command has been pushed but not yet processed by X11, so the 'maximize' message is processed before the window has become visible and is silently dropped. Maybe in my application the window's first paint is delayed because the graphics stack is sluggish due to OpenGL usage (SDL boilerplate code, nothing fancy). It's a deep mess and I gave up on this a long time ago.
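
                      One way around that race, at least in theory (a sketch only, not tested against XFCE): per EWMH, a window that has not been mapped yet can have _NET_WM_STATE set as a plain property before XMapWindow, so the window manager applies the state at map time and there is no ClientMessage to lose:

                      Code:
                      #include <X11/Xlib.h>
                      #include <X11/Xatom.h>
                      
                      /* Set the maximized states on a not-yet-mapped window, then map it. */
                      static void map_maximized(Display *dpy, Window w) {
                              Atom state = XInternAtom(dpy, "_NET_WM_STATE", False);
                              Atom max[2] = {
                                      XInternAtom(dpy, "_NET_WM_STATE_MAXIMIZED_VERT", False),
                                      XInternAtom(dpy, "_NET_WM_STATE_MAXIMIZED_HORZ", False),
                              };
                              XChangeProperty(dpy, w, state, XA_ATOM, 32, PropModeReplace,
                                              (unsigned char *)max, 2);
                              XMapWindow(dpy, w); /* a compliant WM reads the state when mapping */
                      }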
                      Last edited by Remdul; 04 December 2014, 02:15 PM.

