KDE Plasma 5.26 To Allow Crisper XWayland Apps With New Scaling Option


  • #31
    Originally posted by oiaohm View Post
    The core Wayland protocol was only designed to support integer scales because they don't generate blur, but they also don't make windows match in size between monitors. At least integer scaling keeps application windows readable when there is a huge DPI difference between monitors.

    Doing 200% then downscaling as needed is something a Wayland compositor could choose to do as well. This is not a protocol limitation but a limitation of what the compositor has implemented.
    Obviously these can't both be true at the same time.

    This is really simple: Wayland has acknowledged the need for scaling (otherwise it wouldn't support scaling at all, leaving it completely in the hands of compositors); it just doesn't do the hard work.

    Comment


    • #32
      Originally posted by oiaohm View Post

      ...
      I'm with you in wanting a proper solution. I just don't like how Wayland was forced down our throats before it managed even basic stuff like multi-DPI, which we used happily a decade ago.

      Comment


      • #33
        Originally posted by bug77 View Post
        Obviously these can't both be true at the same time.

        This is really simple: Wayland has acknowledged the need for scaling (otherwise it wouldn't support scaling at all, leaving it completely in the hands of compositors); it just doesn't do the hard work.
        Both statements can be true at the same time. macOS also uses integer scaling steps, with the compositor down-scaling the result. What is in the original Wayland protocol matches up with what macOS does.



        There is a difference here: the early Wayland protocol presumed that an incorrect on-screen size for some windows of an application would be acceptable. If you stick to integer scaling only, you avoid the blur and artefact problems. As soon as you do any form of fractional scaling you will have either blur or different artefact problems.

        It helps to understand what the problem is. The early Wayland protocol also supported asking an application to provide buffers at different resolutions for the same window, i.e. telling the application which screens it is on. Toolkit developers complained about this early on, which led to the built-in integer scaling.
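A minimal sketch of the integer-step-plus-compositor-downscale approach described above (illustrative numbers and function names, not any compositor's actual code): the client renders at the next integer scale, and the compositor resamples by a non-integer ratio to hit the fractional target, which is exactly where the blur comes from.

```python
import math

def integer_step_downscale(logical_w: int, logical_h: int, target_scale: float):
    """Render at the next integer scale, then let the compositor
    downscale to the fractional target (macOS / early-Wayland style)."""
    buffer_scale = math.ceil(target_scale)        # e.g. a 1.5x monitor -> render at 2x
    buffer_size = (logical_w * buffer_scale, logical_h * buffer_scale)
    resample_ratio = target_scale / buffer_scale  # non-integer -> resampling blur
    on_screen = (round(logical_w * target_scale), round(logical_h * target_scale))
    return buffer_size, resample_ratio, on_screen

# A 640x480 logical window on a 1.5x monitor:
buf, ratio, out = integer_step_downscale(640, 480, 1.5)
print(buf, ratio, out)  # (1280, 960) 0.75 (960, 720)
```

With an integer scale the resample ratio is exactly 1.0 and no resampling happens, which is why integer-only scaling avoids the blur at the cost of wrong physical sizes across mismatched monitors.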

        Originally posted by Ladis View Post
        I'm with you in wanting a proper solution. I just don't like how Wayland was forced down our throats before it managed even basic stuff like multi-DPI, which we used happily a decade ago.
        Part of the issue here is being aware that what was used a decade ago had a set of major problems. Yes, hacks. The problem is that as you attempt a proper solution, a lot of devils in the details come out.

        The reality is that it depends on the application whether you want it fractional scaled (for applications where blur and artefacts are not a problem), integer scaled (for applications where blur and artefacts are a problem, you don't care if the fonts are a little jagged, and you don't care if the scale to real-world size is wrong), or application-rendered to match the DPI of the screens.

        All three options have downsides; the first two are simple. The third, the application rendering windows to match the DPI of the screen or screens, can mean a lot of extra processing for the application and possibly horrible math errors that also lead to artefacts.

        The more I look at the multi-DPI problem, the more I keep coming around to the view that it may not have a single correct global answer.

        It's really simple to say it worked a decade ago and skip over all the cases where it really did not work. Yes, I see Wayland having integer scaling as part of the solution. You need the option to do all three.

        1) You will always have applications that will not fractional scale well.
        2) You will always have applications where integer scaling, with its intentionally incorrect DPI match (giving up on stuff matching real-world cm on screen), is not going to work.
        3) You are always going to have cases where applications rendering themselves screw it up. Yes, there are applications under X11 that, if you ask for either high or low DPI, overlap their menus, toolbars, or other key interface parts, making the application useless. This is like the "do not theme my app" problem: you will break it.

        This problem is really close to a game of rock-paper-scissors. Each solution causes a problem while fixing other problems. No solution fixes all the problems of scaling application output across different screens.

        I see the current change as a step in the right direction; do I class it as a complete step, no. For X11 applications it is still missing things: locking to integer scaling, a perfect DPI match to get decent output, and per-application settings.

        The multi-DPI problem is a real problem. Worst part, it is a true headache from hell. Windows does not get it right, macOS does not get it right, and Wayland compositors are not getting it right yet. There is a lot of foundation work that needs to be done to fix it completely. And per-application settings bring in the problem of tracking which application owns which process, something standard POSIX does not help you with.

        Yes, I can see needing to add custom metadata to cgroups to say that any process running in this cgroup needs to be rendered to screen using a given form of scaling.
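The cgroup idea is speculative; no such metadata exists today. A minimal sketch of what a compositor-side lookup might look like, with an entirely invented policy table keyed by a client's cgroup v2 path:

```python
from pathlib import Path

# Invented policy table: cgroup path -> how the compositor should scale
# any window belonging to a process in that cgroup.
SCALING_POLICY = {
    "/user.slice/app-legacytool.scope": "integer",  # avoid blur, accept wrong size
    "/user.slice/app-dtp.scope": "native",          # app renders at the screen DPI
}

def parse_cgroup(line: str) -> str:
    """Extract the path from a cgroup v2 /proc/<pid>/cgroup line,
    e.g. '0::/user.slice/app-legacytool.scope'."""
    return line.split("::", 1)[1]

def scaling_mode(pid: int, default: str = "fractional") -> str:
    """Look up the scaling policy for a client PID; fall back to a default."""
    try:
        line = Path(f"/proc/{pid}/cgroup").read_text().strip()
        return SCALING_POLICY.get(parse_cgroup(line), default)
    except (OSError, IndexError):
        return default
```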

        Comment


        • #34
          Originally posted by oiaohm View Post

          Its really simple to say it worked a decade ago and skip over all the cases where it really did not work.
          Fortunately the Linux ecosystem is based on open source, so unlike Windows and macOS, we could actually fix the apps we used. The base set of pre-installed apps was patched by the distribution.

          Yes, I know Wayland uses the logic from macOS, but unlike macOS it has to run on slow iGPUs in the x86 ecosystem.

          Comment


          • #35
            Originally posted by Ladis View Post
            Fortunately the Linux ecosystem is based on open source, so unlike Windows and macOS, we could actually fix the apps we used. The base set of pre-installed apps was patched by the distribution.

            Yes, I know Wayland uses the logic from macOS, but unlike macOS it has to run on slow iGPUs in the x86 ecosystem.
            Yes, we might have a lot of software as open source, but we really don't have the resources to keep recoding it all the time either.

            1) You will always have applications that will not fractional scale well.
            2) You will always have applications where integer scaling, with its intentionally incorrect DPI match (giving up on stuff matching real-world cm on screen), is not going to work.
            3) You are always going to have cases where applications rendering themselves screw it up. Yes, there are applications under X11 that, if you ask for either high or low DPI, overlap their menus, toolbars, or other key interface parts, making the application useless. This is like the "do not theme my app" problem: you will break it.
            This is from my prior post. When recoding applications, someone will always make a typo somewhere. We may not notice it until someone uses a monitor combination with a particular DPI set. So we do need compositor scaling.

            Yes, there is running on slow iGPUs. That might mean you want simpler-to-process scaling, like integer scaling instead of fractional scaling, for performance reasons, even if it means giving up 1 cm on screen being 1 cm on paper on particular monitors.

            Also, distributions custom-patching applications has a long history of producing distribution-specific bugs. We really do want as clean a solution as possible. The reality, once you get into the problem space, is that the solution is not clean. This also explains why macOS and Windows still have problems with their solutions.

            Yes, this really does come down to attempting a one-size-fits-all solution, and at some point you have to admit that in this particular use case that is not going to work.

            Comment


            • #36
              Originally posted by oiaohm View Post
              Both statements can be true at the same time. macOS also uses integer scaling steps, with the compositor down-scaling the result. What is in the original Wayland protocol matches up with what macOS does.

              There is a difference here: the early Wayland protocol presumed that an incorrect on-screen size for some windows of an application would be acceptable. If you stick to integer scaling only, you avoid the blur and artefact problems. As soon as you do any form of fractional scaling you will have either blur or different artefact problems.

              It helps to understand what the problem is. The early Wayland protocol also supported asking an application to provide buffers at different resolutions for the same window, i.e. telling the application which screens it is on. Toolkit developers complained about this early on, which led to the built-in integer scaling.
              That... makes no sense at all.
              Wtf — why would anyone think scaling in more than one place is a good idea? And the argument in favor of that is that macOS does it too? :facepalm:

              Comment


              • #37
                Originally posted by bug77 View Post
                That... makes no sense at all.
                Wtf — why would anyone think scaling in more than one place is a good idea? And the argument in favor of that is that macOS does it too? :facepalm:
                It's the history of this development. At the time Wayland was adding scaling support to the protocol, the OSs producing the best results were iOS and macOS.

                There are a lot of historic facepalms in the X11 protocol as well.

                bug77, what makes sense now is not exactly what made sense roughly 15 years ago. Roughly 15 years ago, down-scaling was not thought to introduce major artefacts; about 4 years after that it became clear that this was not exactly true.

                There is a key change: https://freetype.org/patents.html It was 2010 when the patents on bytecode hinting in fonts expired. This made fonts no longer scale the way they used to.

                FreeType version 2.4 is basically the point of "bugger, we have blurry downscaling with Wayland". Before that, people were complaining about fonts under Linux not being rounded enough and being too jagged. One issue hiding another issue, basically.

                The multi-DPI problem is a history of different stuff-ups, with many cases where believed solutions turned out to be the wrong ones.

                Comment


                • #38
                  Originally posted by oiaohm View Post

                  It's the history of this development. At the time Wayland was adding scaling support to the protocol, the OSs producing the best results were iOS and macOS.

                  There are a lot of historic facepalms in the X11 protocol as well.

                  bug77, what makes sense now is not exactly what made sense roughly 15 years ago. Roughly 15 years ago, down-scaling was not thought to introduce major artefacts; about 4 years after that it became clear that this was not exactly true.

                  There is a key change: https://freetype.org/patents.html It was 2010 when the patents on bytecode hinting in fonts expired. This made fonts no longer scale the way they used to.

                  FreeType version 2.4 is basically the point of "bugger, we have blurry downscaling with Wayland". Before that, people were complaining about fonts under Linux not being rounded enough and being too jagged. One issue hiding another issue, basically.

                  The multi-DPI problem is a history of different stuff-ups, with many cases where believed solutions turned out to be the wrong ones.
                  Again throwing random facts at me, trying to distract from the matter at hand: it's an important feature and Wayland only offers it in a very rudimentary and inefficient form.
                  The solution is also very simple: scaling should be done in one place and that place is the one that can make use of subpixels to provide proper anti-aliasing. Everything else is just noise.

                  Comment


                  • #39
                    Originally posted by bug77 View Post
                    The solution is also very simple: scaling should be done in one place and that place is the one that can make use of subpixels to provide proper anti-aliasing. Everything else is just noise.
                    This is you not understanding the problem space. Gamescope from Valve does not do the scaling in the place you just defined. The only place with full data to allocate subpixels 100 percent correctly all the time is the application. Of course, a legacy application may not support the current screen's DPI requirements.

                    Remember, all applications written for modern X11 have client-side font rendering.

                    Anti-aliasing is a curse to scaling. DLSS and FSR are used for games; these are guesses that are not 100 percent correct even with application involvement. The reality is that the application itself may not be using subpixels correctly.

                    bug77, the worst case is a question of how you want the subpixels/output to be wrong, rather than how you want the output to be right, because they will not be right.

                    Yes, bug77, here you are arguing that scaling must be done in the one place where it could be done perfectly. That argues for application-side scaling only, ignoring that applications can do this scaling wrong or have some other issue. For example, having a game output at a lower resolution and letting gamescope scale it can result in lower CPU and GPU usage, making the game playable at the cost of output quality.

                    This is absolutely not a one-answer-solves-all problem. Remember, the higher the monitor's DPI, the higher the processing cost to render applications perfectly.
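The rendering-cost point can be made with simple arithmetic (example resolutions, nothing from the thread): rendering below native resolution and letting a compositor like gamescope upscale cuts the number of shaded pixels dramatically.

```python
def shaded_pixels(width: int, height: int) -> int:
    """Pixels the game's renderer actually has to shade."""
    return width * height

native = shaded_pixels(3840, 2160)    # render straight at 4K
reduced = shaded_pixels(2560, 1440)   # render at 1440p, compositor upscales

print(f"{reduced / native:.0%} of the native shading work")  # 44%
```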

                    This is the big problem: everyone attempts to give this scaling problem a single answer. I don't believe there is one. Users need to be able to trade quality for performance. Users also need to be able to work around applications with defective built-in scaling, because defective output from an application's own renderer could make the application unusable.

                    "Application not usable via its own internal renderer" means things like: too high a CPU or GPU load resulting in bad/unusable performance; a simple math error that at a particular scale does something stupid, like making the program's controls inaccessible; or not supporting scale X at all because the application will not render at that scale.

                    bug77, like it or not, doing scaling only in the one place it can be done perfectly simply does not fly, because that one place will not always do it perfectly. We need imperfect non-application scaling to be not too bad when it is used, and user controllable.

                    Comment


                    • #40
                      Originally posted by oiaohm View Post

                      Yes, we might have a lot of software as open source, but we really don't have the resources to keep recoding it all the time either.

                      >> 1) You will always have applications that will not fractional scale well.
                      >> 2) You will always have applications where integer scaling, with its intentionally incorrect DPI match (giving up on stuff matching real-world cm on screen), is not going to work.
                      >> 3) You are always going to have cases where applications rendering themselves screw it up. Yes, there are applications under X11 that, if you ask for either high or low DPI, overlap their menus, toolbars, or other key interface parts, making the application useless. This is like the "do not theme my app" problem: you will break it.

                      This is from my prior post. When recoding applications, someone will always make a typo somewhere. We may not notice it until someone uses a monitor combination with a particular DPI set. So we do need compositor scaling.

                      Yes, there is running on slow iGPUs. That might mean you want simpler-to-process scaling, like integer scaling instead of fractional scaling, for performance reasons, even if it means giving up 1 cm on screen being 1 cm on paper on particular monitors.

                      Also, distributions custom-patching applications has a long history of producing distribution-specific bugs. We really do want as clean a solution as possible. The reality, once you get into the problem space, is that the solution is not clean. This also explains why macOS and Windows still have problems with their solutions.

                      Yes, this really does come down to attempting a one-size-fits-all solution, and at some point you have to admit that in this particular use case that is not going to work.
                      While I agree there is no one-size-fits-all algorithm that can be dumped onto every application to make them all work, I think we should clarify which kinds of applications are the well-behaved "standard", and which other kinds are legacy that require "crutches".

                      The "1 cm on screen being 1 cm on paper" illusion should have been long gone by 2022, as this is impossible even in utopia. There are smartphone screens where people expect text to be displayed smaller than on a computer screen in order to fit more. There have been projectors for decades whose connected computers don't know how big the projection is.

                      People who want "1 cm on screen being 1 cm on paper" will buy monitors or set up their projectors accordingly. It should be a well-known fact that DPI scaling, when applied to all UI elements, doesn't need arbitrary DPI value support. Or at least a well-known fact now that we have left the Win XP days and every commercial OS offers coarse stepped scaling, though not as coarse as 1x-to-2x jumps. Users are uninterested in entering custom numbers as long as they are given 0.25x-grained, or at best 0.1x-grained, choices. Fractional scaling is for users to read and use computers comfortably no matter what hardware they are thrown at, not for artistic purposes. Artists will buy monitors that fulfil screen=paper at 1x UI scale, or at 2x/3x UI scale for hiDPI models. Of course they don't need fractional scaling.

                      (This is probably the priority mismatch between the Wayland team and the commoners. Those denying the demand or urgency of native fractional scaling keep repeating their concern that fractional scaling introduces unfixable glitches.)

                      With such knowledge in mind, we can set up several "popular" fractional scales that applications are recommended to test against, and provide crutches for UI toolkits or frameworks that fail to perform a specific fractional scaling natively, on a case-by-case basis. Since we don't need true arbitrary fractional scaling, we don't need to store any coordinates or dimensions in floating point; we can instead use a fixed-point binary fractional format. (Common floating-point formats can't store a perfect 1/3 anyway.) Such a scheme should not trigger any worries about performance, as what Wayland compositors are doing right now is a bigger performance hit: compositor-based rescaling for more programs than necessary.

                      For that fixed-point format, the choice of minimal unit is open. 1/4 logical pixel allows minimal support of Windows-grade fractional scaling, which is also the "popular" scale list that applications shall test against. 1/16 logical pixel is >1500dpi, already beyond the resolution of common commercial printing. Even to play absolutely safe, I don't think there is any need for forward compatibility in a fixed-point format finer than 1/256 px, which means >24000dpi, when drawing tablets so far advertise some 5000lpi at their most professional / exaggerated.
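The proposed fixed-point scheme can be sketched in a few lines (illustrative code, not any real protocol's API; for comparison, Wayland's existing wl_fixed_t wire type is a 24.8 format, i.e. the same 1/256-px unit):

```python
UNIT = 256  # 1 logical px = 256 units, a 24.8-style fixed-point format

def to_fixed(logical_px: float) -> int:
    """Quantize a logical-pixel value to 1/256-px units."""
    return round(logical_px * UNIT)

def scale_fixed(value: int, num: int, den: int) -> int:
    """Apply a rational scale num/den in pure integer math, round-to-nearest."""
    return (value * num + den // 2) // den

# A 100-logical-px width at the Windows-style 1.25x (5/4) step is exact:
assert scale_fixed(to_fixed(100), 5, 4) == to_fixed(125)

# 1/3 is not exactly representable in binary fixed point either, but the
# quantization error is bounded below half a unit (under ~0.002 px):
assert abs(to_fixed(1 / 3) / UNIT - 1 / 3) < 1 / (2 * UNIT)
```

All the "popular" quarter steps (1.25x, 1.5x, 1.75x, ...) land exactly on 1/256-px units, which is the comment's point about not needing floating point at all.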

                      Wayland was supposed to be a clean break from X, and such a fixed-point protocol could have been done. UI toolkits that find such support too hard can be offered crutches (e.g. compositor-based upscaling/downscaling, snapping mouse coordinates to a grid) as secondary choices. Their complaint shouldn't have been taken as an excuse to support only integer scaling in the initial protocol. Applications or UI libraries may support the scale in use directly, or provide higher/lower-scaled output to the compositor for the compositor to re-scale. Or, if users find an application's native scaling or its choice of higher/lower scale buggy or less performant than desired, they may request that the compositor lie to the application or lock out its poorly supported native scaling.

                      My new Android 12 phone provides separate sliders for display size and font size. There are only 4 steps for font size and 5 steps for display size. Obviously the difference between the 1st and 2nd steps is not as big as 1x vs 2x. (I looked it up and found Android uses density qualifiers of 0.75x, 1x, 1.5x, 2x, 3x, 4x for bitmap resources, with 1x = "160dpi".) That is probably 4x5=20 combinations for applications to auto-test. But Android app developers still managed it.
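Android's documented conversion between density-independent pixels and physical pixels is px = dp × (dpi / 160), with the density buckets the comment lists (the 48dp example below is just an illustration):

```python
# Standard Android density qualifiers (scale factor relative to mdpi = 160dpi).
BUCKETS = {"ldpi": 0.75, "mdpi": 1.0, "hdpi": 1.5,
           "xhdpi": 2.0, "xxhdpi": 3.0, "xxxhdpi": 4.0}

def dp_to_px(dp: float, density: float) -> int:
    """Convert density-independent pixels to physical pixels: px = dp * (dpi/160)."""
    return round(dp * density)

# A 48dp touch target across the buckets:
sizes = {name: dp_to_px(48, d) for name, d in BUCKETS.items()}
print(sizes)  # ldpi: 36, mdpi: 48, hdpi: 72, xhdpi: 96, xxhdpi: 144, xxxhdpi: 192
```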

                      Offtopic: My old phone (5+ years old; it stopped seeing the SIM card yesterday 🙁) reports a clean 2x96 = "192dpi" for webpages, but my new phone reports "300dpi" (Firefox reports 303) for webpages. The number feels un-optimised for the web, as it is no longer a multiple of 96. But no change of display size and font size gives me an "optimal" 288 = 3x96 or 320 = 2x160. Even if I switch back to the default display size and font size, the Chrome browser reports "252dpi" (Firefox reports 250), which has nothing to do with the screen's physical 457dpi reported by a third-party phone spec database, nor with any clean multiple of the Android standard dpi or the webpage standard dpi 😕️

                      Comment
