X.Org Could Use More Help Improving & Addressing Its Security

  • #41
    Originally posted by dec05eba View Post
    X11 has always supported HDR. Colors are not just up to 10 bits, they go up to 16 bits per component. It works on both AMD and NVIDIA.
    There is a way to get mixed-DPI scaling working and looking nice with it, while on Wayland there is currently no compositor that properly handles scaling of X11 clients. X11 windows under XWayland are scaled as a texture, which makes them blurry. There are forks of XWayland and wlroots that fix this, but who knows if those changes will ever be accepted into wlroots and XWayland.
    There is work going on regarding HDR and VR that does not break the system. This simply was not the highest priority until now.

    Originally posted by dec05eba View Post
    As birdie mentioned, Wayland people need to address these issues: https://gitlab.freedesktop.org/wayla...d/-/issues/233
    And even if they do, Wayland ends up being as "unsafe" as X11, because you can then keylog and do similar things. You could implement a system with global hotkeys and the like without risking security, but Wayland is badly designed from the start, so I doubt that will happen. X servers could easily be modified to do that without affecting any programs.
    Did you see the response to that post? What a polite way to call the OP an "ignorant idiot".

    BTW, global hotkeys work great on Wayland. If your DE does not support them, you might want to file a bug report.
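
    For what it's worth, the mechanism for this nowadays is the desktop portal's org.freedesktop.portal.GlobalShortcuts interface, where the compositor owns the actual key grab and merely notifies the app. A rough sketch of the first step over GDBus (token names made up; error handling and the portal's Response-signal handshake omitted):

    #include <gio/gio.h>

    int main(void)
    {
        /* The desktop portal lives on the session bus. */
        GDBusConnection *bus = g_bus_get_sync(G_BUS_TYPE_SESSION, NULL, NULL);

        /* Ask the portal for a global-shortcuts session; the compositor,
         * not the client, ends up owning the key grab. */
        GVariant *reply = g_dbus_connection_call_sync(
            bus,
            "org.freedesktop.portal.Desktop",
            "/org/freedesktop/portal/desktop",
            "org.freedesktop.portal.GlobalShortcuts",
            "CreateSession",
            g_variant_new_parsed("({'handle_token': <'req1'>,"
                                 " 'session_handle_token': <'sess1'>},)"),
            NULL, G_DBUS_CALL_FLAGS_NONE, -1, NULL, NULL);

        /* Next (elided): wait for the request's Response signal, then call
         * BindShortcuts with the session handle and (id, description) pairs;
         * the compositor later emits Activated/Deactivated signals. */
        if (reply)
            g_variant_unref(reply);
        g_object_unref(bus);
        return 0;
    }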

    • #42
      Originally posted by dec05eba View Post

      X11 has always supported HDR. Colors are not just up to 10 bits, they go up to 16 bits per component. It works on both AMD and NVIDIA.
      There is a way to get mixed-DPI scaling working and looking nice with it, while on Wayland there is currently no compositor that properly handles scaling of X11 clients.


      Yeah... you have no clue. Thanks for proving that.

      • #43
        Originally posted by mppix View Post
        Most do actually. However, you don't seem to be able to differentiate between an issue with Wayland and an issue with KDE.
        Actually, I know the difference, and the point is that Wayland's design created this distinction where it didn't exist before. Wayland's design forces each DE to re-implement the same functionality itself, and that is a problem Wayland introduced.

        Which is fine, but then don't go claiming that Wayland is fine to use when, evidently, because of the way it is designed it's only fine on a specific DE that happens to have implemented it (and don't be surprised that Wayland is not the default for many Linux distros for this same reason).

        In the end, sensible distros will only make Wayland the default if it actually works for the desktop environments they support; not sure why this is hard for people to comprehend.

        • #44
          Even though I'm pretty much on board with Wayland, there is _one_ thing that concerns me when it comes to security: the protocol may be more secure, but having many different implementations of the compositor is maybe not a great idea. I know this is partly mitigated because there tend to be "clusters" of compositors sharing the same base library, but nonetheless there are fewer eyes per compositor than X.Org may have had in its golden days. I mean, many things aren't inherently insecure on paper, yet security bugs happen at the implementation level.

          • #45
            Security issues aside, I don't think Wayland is that well designed for mixed monitors. Subpixel antialiasing and font hinting do not survive a naive upscale/downscale of a rasterised framebuffer, yet Wayland's design assumes one buffer per surface and has the compositor do the upscaling/downscaling, after the vector information has been lost.

            A perfect design would have let a client surface (optionally) output multiple buffers, one per monitor profile, accommodating each monitor's unique framerate, DPI, colour gamut, dynamic range, etc. 3D-intensive applications or video players could output one layer for the 3D/video part and add an additional UI/subtitle layer on top, rendered at a different DPI for each monitor.
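
            For contrast, this is roughly what the current single-buffer model looks like from the client side, in a minimal sketch that assumes a wl_surface and a wl_buffer rendered at one integer scale already exist (newer protocols like fractional scaling refine the numbers but still keep one buffer per surface):

            #include <stdint.h>
            #include <wayland-client.h>

            static void commit_at_scale(struct wl_surface *surface,
                                        struct wl_buffer *buffer, int32_t scale)
            {
                /* The client rasterises ONE buffer at a single scale... */
                wl_surface_attach(surface, buffer, 0, 0);
                wl_surface_set_buffer_scale(surface, scale);
                /* ...and on any output whose scale differs, the compositor
                 * rescales the already-rasterised pixels, at which point the
                 * subpixel AA and hinting information is gone. */
                wl_surface_damage_buffer(surface, 0, 0, INT32_MAX, INT32_MAX);
                wl_surface_commit(surface);
            }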

            • #46
              Originally posted by sinepgib View Post
              Even though I'm pretty much on board with Wayland, there is _one_ thing that concerns me when it comes to security: the protocol may be more secure, but having many different implementations of the compositor is maybe not a great idea. I know this is partly mitigated because there tend to be "clusters" of compositors sharing the same base library, but nonetheless there are fewer eyes per compositor than X.Org may have had in its golden days. I mean, many things aren't inherently insecure on paper, yet security bugs happen at the implementation level.
              IOW, we need https://gitlab.freedesktop.org/wayla...d/-/issues/233

              • #47
                Originally posted by billyswong View Post
                Security issues aside, I don't think Wayland is that well designed for mixed monitors. Subpixel antialiasing and font hinting do not survive a naive upscale/downscale of a rasterised framebuffer, yet Wayland's design assumes one buffer per surface and has the compositor do the upscaling/downscaling, after the vector information has been lost.

                A perfect design would have let a client surface (optionally) output multiple buffers, one per monitor profile, accommodating each monitor's unique framerate, DPI, colour gamut, dynamic range, etc. 3D-intensive applications or video players could output one layer for the 3D/video part and add an additional UI/subtitle layer on top, rendered at a different DPI for each monitor.
                A million times this.

                • #48
                  Originally posted by mdedetrich View Post

                  Actually, I know the difference, and the point is that Wayland's design created this distinction where it didn't exist before. Wayland's design forces each DE to re-implement the same functionality itself, and that is a problem Wayland introduced.

                  Which is fine, but then don't go claiming that Wayland is fine to use when, evidently, because of the way it is designed it's only fine on a specific DE that happens to have implemented it (and don't be surprised that Wayland is not the default for many Linux distros for this same reason).

                  In the end, sensible distros will only make Wayland the default if it actually works for the desktop environments they support; not sure why this is hard for people to comprehend.
                  Wayland was seemingly designed by GNOME developers for GNOME. I know that's not true, but Weston totally, irrevocably sucks as a reference compositor. It's simply unusable for any serious work beyond running a web browser, and even then only browsing, without trying to upload or download files.

                  • #49
                    No, X.Org doesn't need any help: according to most members here it's dead and deprecated, and its security issues can't be fixed.

                    • #50
                      Originally posted by billyswong View Post
                      A perfect design would have let a client surface (optionally) output multiple buffers, one per monitor profile, accommodating each monitor's unique framerate, DPI, colour gamut, dynamic range, etc. 3D-intensive applications or video players could output one layer for the 3D/video part and add an additional UI/subtitle layer on top, rendered at a different DPI for each monitor.
                      I wouldn't call that "perfect". Not saying it's bad either, but like most design choices it looks like a compromise. In this case, memory use could increase terribly, and so could GPU/CPU use if everything is rendered once per monitor. Of course, that's assuming clients make proper use of the multiple buffering. And if they don't, we're back to the far-from-perfect current state.
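
                      To put a rough number on the memory side (hypothetical figures, assuming a full re-render per monitor profile):

                      #include <stdio.h>

                      int main(void)
                      {
                          /* One 4K ARGB8888 buffer: 3840 x 2160 x 4 bytes. */
                          const double one = 3840.0 * 2160.0 * 4.0 / (1024 * 1024);
                          printf("one shared buffer:      %.1f MiB\n", one);     /* ~31.6 */
                          printf("three monitor profiles: %.1f MiB\n", 3 * one); /* ~94.9 */
                          return 0;
                      }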

                      Originally posted by birdie View Post
                      IOW, we need https://gitlab.freedesktop.org/wayla...d/-/issues/233
                      I guess so? It's not entirely clear, as what it proposes seems to be:
                      1) Protocol standardization, which would be good (though not necessarily inside the Wayland protocol itself);
                      2) A reference implementation for this.

                      That doesn't change the fact that the recommended approach is using multiple implementations; it just makes them more compatible with one another.

                      Originally posted by birdie View Post
                      It's simply unusable for any serious work
                      Well, it's meant to be a reference for how a compositor should be implemented, not an end product. It should implement all the features of the core protocol (does it?), but not much more. As long as it can handle windows as expected, it's fulfilling its purpose.
