Surface Suspension Protocol Proposed For Wayland


  • #21
    Originally posted by murraytony View Post
    Wayland was designed to be extended to add desired features to be implemented by compositors. This way extensions can be deprecated and removed when they are no longer needed and are only added when needed. The reason we are still seeing extensions added regularly is because Wayland compositors are still maturing. Baking functionality into the base system was a mistake X11 made, the designers chose not to repeat it.
    You have never programmed an X11 application, have you?
    Nothing is baked into the base system. There is this function called XQueryExtension

    and the software can adapt to using the extension or not. If you type
    xdpyinfo
    you get the list of extensions that the X11 server supports. It is quite normal for some extensions to be missing on older X11 servers. Software should work without them, providing its own functionality or suffering some performance/functionality penalty. But it should still work.
    If that is not the case, it is the fault of the application programmer, not bad design on X11's part.

    Extensions become "baked in" simply by being used, so people come to expect them for a complete implementation. The same thing that happened on X11 will also happen on Wayland, if it ever becomes as popular as X11.
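
    As a sketch of how a client-side tool might consume that list, here is a minimal parser for the extension section that xdpyinfo prints. The sample output below is abridged and its exact field spacing is an assumption; real output varies by server.

```python
def parse_extensions(xdpyinfo_output: str) -> list[str]:
    """Extract extension names from xdpyinfo-style output.

    xdpyinfo prints a "number of extensions:" line followed by
    one indented extension name per line.
    """
    extensions = []
    in_section = False
    for line in xdpyinfo_output.splitlines():
        if line.startswith("number of extensions:"):
            in_section = True
        elif in_section:
            if line.startswith("    "):
                extensions.append(line.strip())
            else:
                break  # end of the indented extension list
    return extensions

# Abridged, hypothetical sample of xdpyinfo output:
sample = """\
name of display:    :0
number of extensions:    3
    Composite
    MIT-SHM
    XTEST
default screen number:    0
"""

print(parse_extensions(sample))              # → ['Composite', 'MIT-SHM', 'XTEST']
print("MIT-SHM" in parse_extensions(sample)) # → True
```

    An application would branch on membership in that list, exactly as the XQueryExtension pattern describes: use the fast path if the extension is present, fall back otherwise.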



    • #22
      Originally posted by kiffmet View Post
      I.e. the protocol mandates that information going from one wl client to another has to go through the wl server and use a defined protocol. Just hacking around it by using dbus or something else is violating the spec. Hence you need a wayland clipboard copy and paste protocol extension… You want to place a tooltip in a non-awkward position on the screen? new protocol it is! Previewing minimized windows for things like alt+tab? New protocol! Non GLES OpenGL or Vulkan on Wayland? New protocol… Taking screenshots and screen capture… new protocol…
      Screenshots/screen capture being a new protocol is required to fix security without implementing a full authentication system in Wayland, which would basically duplicate what dbus and polkit already do. That duplication would risk more faults.

      Originally posted by kiffmet View Post
      It is so anemic that It can't even properly account for physical monitor sizes (DPI) in single and multi monitor setups. Got a 1080p and a 1440p monitor, both at 24''? The 1440p is "larger" because it has more pixels. Good luck having the user hack around the protocol for fixing scaling while avoiding a net loss of usable screen surface and blurry Xwayland instances, which only fixes single monitor HiDPI use btw.
      Just mandate from GUI toolkits that window sizes have to be specified in mm or in multiples of some predefined physical length and let users set a per-screen DPI. Boom, problem solved, perfect multimonitor DPI scaling, w/o artifacts and it wouldn't even need a new protocol or refactoring, had it been thought of in the beginning.
      This shows you have not read the Wayland documentation. Wayland supports the compositor asking the application to change the scale of what it provides: https://wayland-book.com/surfaces-in-depth/hidpi.html

      With Xwayland, because it is X11, you are stuck: you cannot ask an X11 application to change the scale of its output on the fly at all. Even if you specified the window size in mm, you would still need the application to be able to provide differently scaled versions of its output to match the DPI of the screen, or it will be blurry, because like it or not screens are made up of pixels.

      A DPI/scale factor means there is already a form of window sizes in physical units in place; it is inches rather than mm, but it is in place.

      The real issue with Wayland is how to handle a window split between two or more screens. It was decided that the compositor would not ask an application for two or more copies of its output at the same time, mostly because a complex game could run straight out of GPU time doing that; so one monitor will be right and the other monitors will get scaled output with some blur. This is a trade-off forced by the limited processing power of GPUs. The only way to correctly solve the 1080p and 1440p same-size monitors at the same time with zero blur requires the application to generate its output twice, and we just don't have the GPU power to do that reliably. No matter how well the compositor scales, a non-integer scale will always add more blur than asking the application to re-render the frame at the required scale, and the fact that you cannot do that at all with X11 is a major weakness of X11. Yes, a Wayland application can refuse to scale when asked, which could be claimed as a weakness of Wayland, but that can be because the application's rendering already exceeds the available GPU processing power at its current scale. Wayland's limitations here align with how limited GPUs really are; we cannot have everything, but Wayland is less limiting than X11 in this department.
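
      To put numbers on the 24'' 1080p-vs-1440p example from the quoted post, here is a back-of-the-envelope calculation; `dpi` is a hypothetical helper for illustration, not part of any Wayland or X11 API.

```python
import math

def dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

d1080 = dpi(1920, 1080, 24)   # ≈ 91.8 DPI
d1440 = dpi(2560, 1440, 24)   # ≈ 122.4 DPI

# To draw UI elements at the same physical size on both screens, the
# 1440p panel needs roughly a 4/3 scale factor relative to the 1080p one.
print(round(d1080, 1))            # → 91.8
print(round(d1440, 1))            # → 122.4
print(round(d1440 / d1080, 3))    # → 1.333
```

      A non-integer factor like 4/3 is exactly the case where compositor-side scaling blurs and application-side re-rendering (which Wayland can request and X11 cannot) wins.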



      • #23
        Originally posted by dpeterc View Post
        You have never programmed an X11 program, did you?
        Nothing is baked in the base system. There is this function called XQueryExtension

        and the software can adapt to using the extension or not.
        There are a lot of things baked into the base system, like "a visible window is painted directly into the buffer". You can have the Composite and Damage extensions that add a new way of painting windows, but you can't remove the original painting system because it is baked inside. Also, by default, you work without a transparency/alpha channel. You can have XRender to add alpha channel support, but you can't remove the old non-alpha painting. You can have the SHM extension to choose to pass a picture using shared memory, but you can't remove passing a picture over the socket, even if today it's rarely used. You can use your own font renderer to paint letters from the client side, but if you claim to support X11 you can't remove the server-side font support. You can work with Cairo or AGG or OpenGL to paint elements and paste pictures from the client side, but you can't remove from the server the '80s-style paint primitives like Bresenham lines and circles...

        All those elements are barely, if ever, used today, but if you claim to speak X11, you must implement them.



        • #24
          Originally posted by rastersoft View Post
          You can have SHM extension to choose passing a picture using shared memory, but you can't remove passing a picture over the socket, even if today it's rarely used.
          Maybe you quickly googled some stuff, but since I specifically use those in my software, I know what I am talking about.
          Suppose you have rendered your image buffer in XImage, which you want to display in drawable (Window or Pixmap).
          If you don't have the MIT-SHM extension you use XPutImage() instead of XShmPutImage(). So it goes over the socket (if the X server and client are on different machines) or over a pipe if they are on the same computer. That is the whole point of network transparency: the application does not deal with it. Low overhead and efficient.
          The application only uses MIT-SHM for faster display, when the X client and server are on the same computer and the extension is present.
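
          The selection logic described above can be sketched as follows; `choose_put_image` is a hypothetical helper name used purely for illustration, not an Xlib API.

```python
def choose_put_image(shm_available: bool, same_machine: bool) -> str:
    # MIT-SHM only helps when the client can actually share memory with
    # the server, i.e. both run on the same machine and the extension
    # was reported as present (e.g. via XQueryExtension).
    if shm_available and same_machine:
        return "XShmPutImage"   # zero-copy through a shared-memory segment
    return "XPutImage"          # pixel data travels over the socket or pipe

print(choose_put_image(True, True))    # → XShmPutImage
print(choose_put_image(True, False))   # → XPutImage  (remote display)
print(choose_put_image(False, True))   # → XPutImage  (server lacks MIT-SHM)
```

          Either branch produces the same picture on screen; the extension only changes the transport, which is the adaptation pattern being argued for.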
          Originally posted by rastersoft View Post
          You can work with Cairo or AAG or OpenGL to paint elements and paste pictures from client side, but you can't remove from the server the 80's style paint primitives like bresenham lines and circles...
          You should not remove the '80s-style Bresenham paint primitives; they are part of the core protocol, they are not extensions. So again, no toolkit is baked in, it is just the base protocol. And why would you want to remove them? They are small and efficiently coded. They come from the Donald Knuth era, when programmers could actually do the math and develop a sane algorithm.
          Modern toolkits have deteriorated to the level where Line(x1,y1,x2,y2) is no longer equal to Line(x2,y2,x1,y1), or drawing a line will miss the first point in some quadrants, but that is considered OK and not a bug worth fixing.

          https://forum.qt.io/topic/101825/lin...aw-correctly/2
          All the effort goes into antialiased line drawing, but basic Bresenham from 1965 is beyond the reach of today's programmers.
          https://en.wikipedia.org/wiki/Bresen...line_algorithm
          So I am supposed to use a fancy multiplatform Qt toolkit, so my software will easily port to Windows, Mac, and Linux with X11 or Wayland backend rendering, but if I want to draw a mathematically correct line, I need to code my own versions of the Bresenham line, circle and curve algorithms. Thank you for simplifying my life.
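
          For reference, the 1965 integer algorithm being discussed is short. This is a minimal textbook-style sketch, not the exact pixelization the X11 core protocol mandates:

```python
def bresenham(x0: int, y0: int, x1: int, y1: int) -> list[tuple[int, int]]:
    """Integer Bresenham line: each step moves at most one pixel on each
    axis, using only additions and comparisons (no floating point)."""
    points = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy            # doubled-error form of the accumulator
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 > -dy:         # error says: step along x
            err -= dy
            x0 += sx
        if e2 < dx:          # error says: step along y
            err += dx
            y0 += sy
    return points

print(bresenham(0, 0, 3, 1))  # → [(0, 0), (1, 0), (2, 1), (3, 1)]
```

          Both endpoints are always included, and a horizontal line of length n yields exactly n+1 pixels.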




          • #25
            Originally posted by dpeterc View Post
            FYI, X11 was released in October 1987, gained popularity in the nineties, so it is hardly a 70s design. Unix is 70s design. Now replace that, just because it is old design.
            We did replace Unix. It is called Linux.



            • #26
              Originally posted by jrch2k8 View Post
              Small correction, Wayland core was always functional like Vulkan, the issue you refer is not actually Wayland missed design or anything of the like but desktops environments wanting to make it behave like X11 so they can support both easier to avoid backlashes after the fact.

              See it this way, Instead of going the Apple way and focus and redesign the desktops to use Wayland, everyone went the Linux way and suddenly decided Wayland had to support all of X11(as possible) as well because you know, we have to support everything forever.
              It all comes down to application support. Apple provided XQuartz when they went Metal; Linux DEs implemented Xwayland when they went Wayland. I don't see the difference.
              DEs that don't support Wayland will slowly die, because even X11 updates are now confined to Xwayland and Xorg is essentially done.

              Originally posted by jrch2k8 View Post
              Same thing happens with compositors, instead of going full Modern C++/ Wayland / Vulkan and exploit thousand of new ways of handling desktop rendering, low latency, efficient hardware usage, etc. they are focused on OpenGL 3 still because some moron with a 20 year old GPU will cry murder in a forum because he is too cool to use a Centos derivative and too cheap to get even an used decent GPU. Hence we now have to deal with a Frankenstein that kinda works like on X11 but has bugs and kinda runs on Wayland but is not that much better than X11 but with different bugs because the 20+ years codebase they are using is full of hacks and half broken and was designed before threads were a thing but Hey, Linux support everything!!! so be happy
              You are describing an interesting future direction. However, Vulkan was not an option when people started to implement Wayland compositors.
              As of a year ago, it is possible in theory to implement a Wayland-on-Vulkan DE, but there is not even a proof of concept out there. Also, I don't know how many Vulkan drivers implement the necessary Vulkan extensions. So it is essentially a research project.

              Re: Linux supporting everything, I don't actually understand the point. Yes, there is Xwayland, but it does not impact Wayland. Any app developer can switch to Wayland as their apps get ready. So if an app only runs in Xwayland, that is more on the app developers, not the compositor.
              That being said, not having Xwayland would have killed Wayland altogether: no DE could make it the default because the apps did not support it, and no apps would support it because it had no adoption. Apple (and MS) always provide compatibility layers when they make big switches, e.g. XQuartz, Rosetta, etc.



              • #27
                Originally posted by dpeterc View Post
                So I am supposed to use a fancy multiplatform Qt tooklit, so software will easily port to Windows, Mac, Linux with X11 backend rendering or Wayland rendering, but if I want to draw a mathematically correct line, I need to code my own version of Bresenham line, circle and curve algorithm. Thank you for simplifying my life.
                Bad news
                https://forum.qt.io/topic/101825/lin...aw-correctly/2
                The lines here are mathematically correct.

                The problem is they are not drawn with the algorithm you want. The effect you see with those lines is what happens when you use the naive algorithm, the digital differential analyzer (DDA), or Gupta-Sproull's algorithm, as all of these are in fact directionally drawn.

                In a lot of cases you want Xiaolin Wu's line algorithm, because it normally looks better than Bresenham's line algorithm. Yes, Xiaolin Wu's is also directionally drawn, but it does not show the defect as much and generally looks better due to how it works.

                It has been an ongoing annoyance for me that toolkits give you a line-draw command with no way to set the drawing algorithm. There are five common ways to draw a line on the screen, all giving slightly different results, and all technically mathematically correct lines. It is the same way you have three different common averages in maths.

                Sorry, but just because a toolkit is not using the line algorithm you expect does not mean what it is doing is mathematically incorrect.

                There is a fun issue here: offloading line drawing to the GPU is part of the problem. Remember when GPUs did not have any integer processing? It turns out Bresenham's algorithm is not designed to work in floating point. Also horrible: Xiaolin Wu's is not commonly used because it was created in 1991, and when a lot of toolkits started it was not yet 20 years old, so they avoided it in case of patents.

                There has been a need for a new Bresenham-style set of formulas that are properly floating-point compatible.

                Another fact: there is a difference between a line drawing algorithm like Bresenham's, which is designed to remain balanced around the centre point, and one designed so that extending the line does not move previously placed pixels. Starting at one end and stepping to the other is the second type, and it leaves a single pixel by itself at the end of the line; that is the effect you want if you need to extend a line without moving points.

                So the kind of line you are attempting to draw determines which line-drawing algorithm you want.
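
                The direction dependence can be demonstrated even with classic integer Bresenham: when the ideal line passes exactly through a half-step, the tie breaks in the direction of traversal, so the two endpoint orders disagree about the middle pixel. A minimal sketch (textbook algorithm, not any particular toolkit's implementation):

```python
def bresenham(x0, y0, x1, y1):
    """Classic integer Bresenham line, doubled-error form."""
    points = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy
    return points

# The line (0,0)-(2,1) passes exactly through y=0.5 at x=1: a tie.
forward = set(bresenham(0, 0, 2, 1))
reverse = set(bresenham(2, 1, 0, 0))
print(sorted(forward))        # → [(0, 0), (1, 0), (2, 1)]
print(sorted(reverse))        # → [(0, 0), (1, 1), (2, 1)]
print(forward == reverse)     # → False: the tie breaks toward each start point
```

                This is exactly why Line(x1,y1,x2,y2) and Line(x2,y2,x1,y1) can legitimately differ by a pixel without either result being "mathematically incorrect".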



                • #28
                  Originally posted by dpeterc View Post
                  You should not remove 80's style Bresenham style of paint primitives, they are part of core protocol, they are not extensions. So again, no toolkit baked in, it is just base protocol. And why would you want to remove them? They are small and efficiently coded. They come from the Donald Knuth era, when programmers actually could do math and develop a sane algorithm.
                  And that is exactly the point. Quoting your comment: "they are part of core protocol". THAT is the problem: there are a lot of elements of the core protocol that aren't used anymore and could have been discarded, except for the fact that they are in the core protocol. The fact that you happen to still use them today doesn't make them more necessary or valuable.

                  And no, I didn't "google that quickly". I wrote my own X11 window manager some time ago and have worked with bare XCB several times. And I reaffirm that a lot of the core protocol in X11 is completely obsolete, and that Wayland makes a lot of sense.

                  And, BTW: first you state that you can manually check whether you can send a picture through shared memory and manually decide which path to use, and then you complain that removing Bresenham lines makes your life more complex... when you have tons of libraries that draw Bresenham lines, and choosing between shared memory and the socket should be transparent to the programmer!

                  Bravo!
                  Last edited by rastersoft; 21 June 2021, 03:46 AM.



                  • #29
                    Originally posted by mppix View Post
                    We did replace Unix. It is called Linux.
                    For most practical purposes, Linux is a re-implementation of Unix: it uses the same function calls with the same programming interface.
                    Application software can be "ported" with minimal changes, if any. And that is a good thing.
                    So it is hard to call Linux a replacement for Unix from a design point of view.



                    • #30
                      Originally posted by dpeterc View Post

                      For most practical purposes, Linux is a re-implementation of Unix, it uses the same function calls with the same programming interface.
                      Application software can be "ported" with minimum changes, if any. And that is a good thing.
                      So it is hard to call Linux a replacement of Unix from the design point of view.
                      That's true and false at the same time: Linux has a lot of API calls unavailable in other UNIXes (like the BSD family), which means that software written to take advantage of those calls isn't portable to other UNIXes. In fact, that is one of the criticisms of systemd (although the systemd people are tired of noting that systemd's interface is DBus, but that's another story).

