KDE Plasma 5.26 To Allow Crisper XWayland Apps With New Scaling Option

  • #41
    One approach to doing different XWayland scaling per screen: either use different DISPLAY values (DISPLAY=:0.0, DISPLAY=:0.1), or run entirely separate Xwayland instances per screen. The downside is that you couldn't move such a window from one monitor to another.
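    A minimal sketch of the environment-variable approach (Python used for illustration; the display names and the existence of a second screen or Xwayland instance are assumptions):

```python
import os
import subprocess

def build_env(display):
    """Return a copy of the environment pointing at the given X display."""
    env = dict(os.environ)
    env["DISPLAY"] = display
    return env

def launch_on_display(cmd, display):
    """Launch an X11/XWayland client against a specific display/screen.

    With one Xwayland instance per output, the display name selects the
    instance; with a single multi-screen instance, ":0.0" / ":0.1"
    select the screen within it.
    """
    return subprocess.Popen(cmd, env=build_env(display))

# Example (needs a running X server / Xwayland on that display):
# launch_on_display(["xterm"], ":0.1")
```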



    • #42
      Originally posted by billyswong View Post
      While I agree there is no one-size-fits-all algorithm that can be applied to every application to make them all work, I think we should clarify which kinds of applications are the well-behaved "standard", and which other kinds are legacy that require "crutches".
      The problem here is that it is not just legacy applications that require crutches. New applications can have coding errors that make them misbehave at a particular DPI, so we are dealing with both legacy and buggy current-day applications. Some of those bugs come from rounding errors in the maths, which is one reason fractional scaling is disliked: it increases the risk of such errors.
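      The rounding-error risk is easy to demonstrate with a toy calculation (illustrative only, not any toolkit's actual code):

```python
def scale_round(px, factor):
    """Scale a pixel length and round half-up, as a toolkit might."""
    return int(px * factor + 0.5)

# Two adjacent 7px-wide widgets at a 150% fractional scale:
left = scale_round(7, 1.5)    # 10.5 rounds up to 11
right = scale_round(7, 1.5)   # 11 as well
both = scale_round(14, 1.5)   # but their 14px container scales to 21

# Rounding each widget separately no longer matches the rounded container:
# 11 + 11 = 22 != 21, a one-pixel seam or overlap the app must hide.
```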

      Originally posted by billyswong View Post
      My new Android 12 phone provides me separate sliders for display size and font size. There are only 4 steps for font size and 5 steps for display size available. Obviously the difference between the 1st and 2nd steps is not as big as 1x vs 2x. (I looked it up and found Android uses density qualifiers of 0.75x, 1x, 1.5x, 2x, 3x, 4x for bitmap resources, with 1x = "160dpi".) There are probably 4x5 = 20 combinations for applications to auto-test. But Android app developers still managed it.
      This comparison runs into hell.
      https://developer.android.com/traini...creendensities
      That page lists the DPI values. The 0.75x in Android 12 is 120dpi, your 1x is 160dpi, and your 4x is 640dpi. Note there is a seventh value you did not mention: tvdpi at ~213dpi, one single value for every TV in existence. Anyone else see the horrible problem here? The monitor sitting in front of me is a 1080p monitor at 96dpi, and I can still buy a new monitor like it, yet Android has no mode that corresponds to 96dpi. Android DPI and desktop DPI do not exactly align when you render, either; I will get to that after the next bit.
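      For reference, the density buckets from the linked Android documentation, and the scale factor each implies, can be tabulated; note that no bucket lands on a 96dpi desktop monitor:

```python
# Android density buckets, per the Android developer documentation.
ANDROID_DENSITIES = {
    "ldpi": 120,     # 0.75x
    "mdpi": 160,     # 1x baseline
    "tvdpi": 213,    # ~1.33x, the single TV bucket
    "hdpi": 240,     # 1.5x
    "xhdpi": 320,    # 2x
    "xxhdpi": 480,   # 3x
    "xxxhdpi": 640,  # 4x
}

# Scale factor relative to the mdpi baseline:
scale_factors = {name: dpi / 160 for name, dpi in ANDROID_DENSITIES.items()}

# The desktop value the post complains about is simply absent:
desktop_dpi = 96
assert desktop_dpi not in ANDROID_DENSITIES.values()
```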

      https://web.dev/high-dpi/ When you go to the web standards you find that the majority of old monitors were either 72 or 96dpi, and 96dpi monitors are still in production.

      Wayland started with 96dpi on desktop monitors, just as HTML does. There is an important reason why, which I will be covering.

      Originally posted by billyswong View Post
      Offtopic: My old phone (5+ years old, stopped seeing the SIM card yesterday 🙁) reports a clean 2x96 "192dpi" for webpages, but my new phone reports "300dpi" (Firefox reports 303) for webpages. I feel the number is un-optimised for the web as it is no longer a multiple of 96, but no change of display size and font size gives me an "optimal" 288 = 3x96 or 320 = 2x160. Even if I switch back to the default display size and font size, the Chrome browser reports "252dpi" (Firefox reports 250), which matches neither the screen's physical 457dpi reported by a 3rd-party phone spec database nor any clean multiple of the Android standard dpi values or the webpage standard 96dpi 😕️
      What is going on here is worse than you think. Remember that HTML's 96dpi assumes you are sitting an arm's length from a monitor, which is not how you hold a mobile phone. The 120dpi/160dpi of Android turns out to be a fairly close match to 75/96dpi on a standard computer monitor once you account for the eye-distance factor that DPI alone does not include. The arm's-length distance for computer monitors used to matter for radiation exposure from CRT monitors (not all electrons stopped at the front of the screen), and of course you do not hold your mobile phone a full arm's length away. Note that the technically correct distance was the arm's length of a 6-foot-tall person with a closed fist, roughly 74-75cm; if you were shorter than 6 feet and applied the correct OHS desktop CRT viewing distance to an Android phone, you would need a selfie stick to use it.

      The reason HTML and Wayland use 96dpi on a computer monitor as a base is that it is fully defined once you dig deep enough. The correct viewing distance to a computer monitor was defined in Occupational Health and Safety standards and still is. Modern versions of OHS standards support distances down to 40cm, but that is for hi-DPI monitors.

      Having both a distance and a DPI is also why projectors work. Most projectors are very low DPI at the wall, but viewed from the equivalent of 74-75cm they work out to about 96dpi.

      At this point, once you know this stuff, you realise we have a missing-number problem. It explains why Firefox and Chrome report different DPI values on mobile phones: they are attempting to fudge a conversion from 96dpi on a computer monitor with a 74-75cm eye distance to a mobile device with a shorter, undefined eye distance and a higher DPI. The shorter the eye distance, the higher the DPI has to be to get the same result, so 250dpi at 74cm is close to 457dpi at 40cm. And since you are performing a stack of maths to convert between 74cm and 40cm eye distances, there are going to be rounding errors.
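      The 457dpi-at-40cm versus ~250dpi-at-74cm relationship above is just a same-visual-angle (similar triangles) conversion; a minimal sketch:

```python
def equivalent_dpi(dpi, distance_cm, reference_cm=74.0):
    """DPI at one viewing distance expressed as the DPI that subtends the
    same visual angle at a reference distance (small-angle approximation)."""
    return dpi * distance_cm / reference_cm

# A 457dpi phone panel held at 40cm, expressed at the 74cm desktop distance:
phone_as_desktop = equivalent_dpi(457, 40)  # ~247dpi, near the ~250 browsers report
```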

      Notice that nowhere in Android is there a place to store eye distance. Wayland has the same problem.

      Dots per inch is a 2D number for what is in fact a 3D problem, and that 3D nature is what makes Android DPI and desktop DPI fail to line up: the eye distance is different. Consider a standard case: a person with a 4K monitor on a desk at 74-75cm can read text on that screen that a person 1.5 metres away cannot, and a person 40cm from that screen can read things that the person at 74-75cm cannot. Eye distance is a very significant factor.

      What you wrote might have seemed off topic, but it really does show why this problem is so complex and why people want fractional scaling. Distance to the screen affects what level of blur you can see. If you sit a decent distance back, scaling up and adding a bit of blur may not matter, because you cannot resolve enough detail to see the blur anyway. As you get closer, the level of detail you can see goes up, and your tolerance for blur drops at the same rate. Higher-DPI screens make the blur smaller and less visible to the human user.

      The 74-75cm eye distance behind HTML's 96dpi explains why it has to be scaled up on a phone that you hold at a 40cm or shorter eye distance.

      There is no defined eye distance for handheld devices, and there is nowhere to store a per-screen eye-distance value in Wayland, X11, Android, Windows or Mac OS. This missing information is what forces Firefox, Chrome and others that need to scale to guess, and at times they use different eye-distance values. Most people are not aware that HTML's 96dpi has an eye distance built in; it is an overlooked factor.

      The years of 96 or 75dpi monitors really were something of a golden age.



      • #43
        oiaohm

        When I wrote "legacy" and "crutches", I was treating the whole GTK ecosystem as legacy there. They and the Wayland team treat the unevenness and border displacement caused by rounding as a disgusting trait of Windows, and prefer a blurry shrink from an integer scale as the more "correct" solution. On the other hand, there are a bunch of people who disagree. They want their applications, perhaps using other toolkits, to embrace the unevenness and displacement by performing nearest-neighbour scaling at the UI borders. They also couldn't care less about gaps between different applications not aligning perfectly, as long as everything looks sharp. GTK can keep its stubborn treatment of the UI as one rigid picture without any systematic displacement, but there is no ground for Wayland to make everyone wait for GTK.

        In the comment that you didn't quote, I wrote how the concept of a "logical pixel" could be kept for seamless drag or mouse movement between screen spaces with different scaling factors, without banning all forms of native fractional scaling. Only if the Wayland team (and perhaps also the GTK team) recognize that there are people out there who prefer uneven but sharp over even but blurry, and stop insisting that evenness is the absolute criterion of UI scaling, may we have a chance of escaping the muddy puddle of fractional scaling and getting over it.

        ----

        It turns out 96dpi was NOT meant to match physical reality in the beginning. Microsoft confessed that the setting was deliberately applied to 72dpi monitors so that documents displayed at "100%" would physically be 133%, or 4/3x. In the linked article you can see they justified it by noting that monitors on a desk are generally placed farther away than books and documents on the desk; in your words, for "Occupational Health". Microsoft dominated personal computing in the 1990s, and thus the web adopted the "96dpi" number. The justification Microsoft gave does make sense, but most people bought into it only because 72dpi is too low to display documents with small text clearly. (They admit in the article that this was one of the motivations for the hack too.) Therefore when monitors actually reached 96dpi physically, most people didn't tune up to "120dpi" but stayed at "96dpi" instead.

        Speaking of the origin of "96dpi", I start to remember that Windows has provided a setting to switch to "120dpi" since long ago. But most of us didn't realize why it was provided when almost nobody owned a 120dpi screen. A hack was forgotten to be a hack and everyone applied it literally. Reading on a desktop monitor also overwhelmed reading on paper. Then came the smartphone. Smartphone reading distance and screen size constraints pushed the need for a new hack. Just as Microsoft lied in the past so that document sizing defined for paper printing was adopted creatively for desktop monitors, browsers on smartphones need to lie again.

        In this light the dpi numbers my phone gives out make more sense now. The phone by default tries to squeeze roughly 1.8 times the physical content density of a desktop display into the phone screen, while my preferred settings squeeze only roughly 1.5 times the density.

        ----

        I think it takes around 300dpi for "even but blurry" and "uneven but sharp" to converge and look similar to human beings in general. Below that, giving the choice back to users is the better strategy.



        • #44
          Originally posted by billyswong View Post
          In the comment that you didn't quote, I wrote how the concept of a "logical pixel" could be kept for seamless drag or mouse movement between screen spaces with different scaling factors, without banning all forms of native fractional scaling. Only if the Wayland team (and perhaps also the GTK team) recognize that there are people out there who prefer uneven but sharp over even but blurry, and stop insisting that evenness is the absolute criterion of UI scaling, may we have a chance of escaping the muddy puddle of fractional scaling and getting over it.
          The Wayland development team presumed that uneven but sharp would be preferred over even but blurry or artefact-ridden; that is why the protocol has integer-only scaling. The reality is that you cannot do fractional scaling without causing problems.

          Originally posted by billyswong View Post
          It turns out 96dpi were NOT meant to be a match of physical reality in the beginning.
          No, past here you disappear down a Microsoft-written rabbit hole. Early Apple's 72dpi comes directly from the Xerox Star (1981), which was exactly 72dpi; the Xerox Alto (1973) was close to 72dpi. The 74-75cm range from CRT monitor to the user's eyes was forced by Occupational Health law around the world in the time of the Xerox Alto. That distance is a legally enforced standard, not a by-chance thing, and this is 1972, well before Microsoft. By the time Microsoft appears, the Occupational Health laws and standards are well and truly set in stone. The 72dpi in the HTML standards is the Xerox Star's pixels-to-human-eyes figure. In 1981, when the first version of MS-DOS appeared, the 74-75cm distance to CRT monitors was already law in many countries; we are talking workplace health and safety/Occupational Health laws that came into force between 1974 and 1978. These laws did not change again until LCD, and then once more with LED backlighting.

          96dpi first appears not with Microsoft but with X11 workstations. The DEC workstations, for example, had true 96dpi monitors in 1984. Of course this led to problems displaying old X11 applications on VGA because of the wrong DPI; in the 1990s you needed an SVGA monitor with Linux to reach 96dpi so applications looked right.

          The X11 workstations and Apple both did DPI basically the same way. Microsoft went off and made its own creative mess, and now Android has made its own creative mess as well.

          The Windows solution was somewhat controversial. The decision was made to report the resolution of displays on Windows as about 1/3 greater than actual resolution.
          This is in the write-up because Microsoft's solution matched nobody else's, and there was an existing definition Microsoft chose to ignore; of course they are very careful not to talk about the pre-existing X11/Xerox standard for screen DPI. The HTML standard, sorry to say, uses the X11 workstation and Apple DPI, based on the older Xerox work: 72/96dpi is 1x in CSS, at 74-75cm to the user's eyes.
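          The quoted "about 1/3 greater" trick is easy to put numbers to (a toy calculation, not Microsoft's actual code): a point is 1/72 inch, so reporting a 72dpi panel as 96dpi inflates every font by 4/3:

```python
def pt_to_px(points, dpi):
    """Pixel height of a font at a given DPI: one point is 1/72 inch."""
    return points * dpi / 72

def px_to_pt(pixels, dpi):
    """Physical point size of a pixel height on a panel of the given real DPI."""
    return pixels * 72 / dpi

# Windows reports a physically 72dpi monitor as 96dpi, so a 12pt font becomes:
rendered = pt_to_px(12, 96)        # 16 pixels
# On the real 72dpi glass those 16 pixels physically measure:
physical = px_to_pt(rendered, 72)  # 16pt, i.e. 4/3 of the requested 12pt
```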


          Originally posted by billyswong View Post
          Speaking of the origin of "96dpi", I start to remember Windows has provided a setting to switch to "120dpi" since long time ago. But most of us didn't realize why it was provided when nearly nobody were owning a 120dpi screen. A hack was forgotten to be a hack and everyone applied it literally.
          No. Those of us on Linux/Unix in that time frame were very aware that we needed 120dpi on Windows and 96dpi on Linux/BSD/Unix on the same monitor to get basically the same text height in applications.

          Originally posted by billyswong View Post
          Reading on desktop monitor screen also overwhelmed reading on paper.
          Regarding the 1/3 scale-up: you will notice that a lot of word-processing programs attempt to get back to a true 100% size, matching on screen exactly the size that will come out of the printer (you can check how close with a sheet of paper). Then you notice something interesting: look at a 12-point font in a word processor scaled to how it will print. Whether on Windows, Mac OS or an X11 desktop, that 12-point font is basically the size of all the menus at default settings. So Microsoft hacked the DPI numbers, and then application developers hacked it back the other way. The standard recommended size for printed document text (other than books) is 12 point. And here is the wacky part of why it all lines up: if you put a sheet of paper on a desk and read it, most of the time it will turn out to be 74-75cm from your eyes. Think of copy stands: they hold the page next to the monitor at the same distance as the monitor.

          The reality is that reading on a desktop screen and reading on paper are not as different as one would think. The smallest font you will generally find to read in a book is 10 point.

          Originally posted by billyswong View Post
          Now comes the smartphone. Smartphone reading distance and screen size constraint pushed a need of new hack. Just like how Microsoft lied in the past such that document sizing defined for paper printing were adopted creatively for desktop monitors, browsers in smartphone need to lie again.
          Not exactly right, because you missed that Microsoft's mucking around with DPI really did not change anything: application developers just altered things so they would still be putting close to a 12-point font on the screen from the user's point of view.

          Originally posted by billyswong View Post
          In this aspect the dpi number my phone give out make more sense now. The phone by default tries to squeeze roughly 1.8 times the physical content density of desktop display into phone screen. Meanwhile my preference ends up squeezing only roughly 1.5 times the density.
          Yes, they have worked out this conversion factor without looking at books. Going by books you would want roughly 1.2 times the density of a computer monitor, because paperbacks and the like use a 10-point font. 1.8 times basically takes you to a 6-7 point font, which is fine print, hard to read: food-ingredient-list territory. Your 1.5 is a more reasonable 8-point font.

          When you ignore DPI and look at what real-world font size you are getting on the output, things get horrible really quickly.

          Originally posted by billyswong View Post
          I think it takes around 300dpi for "even but blurry" and "uneven but sharp" to converge and look similar to human beings in general. Before that, giving the choices back to users is a better strategy.
          Horrible: it is higher. 600dpi is where diminishing returns start for items you can hold as close as a phone, as printers have shown throughout history making packaging and so on. At 300dpi, normal human vision can still pick up blur and uneven sharpness quite well. Depending on the upscale/downscale artefact, you may need to go all the way to 1200dpi to hide it; this too is documented by people doing printed packaging and the like. There are decades of data here. Those making mobile phones have done a really good job of ignoring all of it; so did Microsoft. Wayland developers were not printing people, so they have also disregarded some of this data.

          It is common to think that since we are no longer reading on paper, the lessons from paper don't apply. In reality we humans have not changed that much, so what we learnt from paper still applies. The work is in figuring out which paper use case best matches your problem space so you can raid it for data.

          billyswong, there are a lot of write-ups about DPI and the like that, when you look closely, are kind of garbage, like the Microsoft one. Microsoft hid the DPI, and applications still put 12-point fonts on the physical screen like on every other OS. So there is really nothing we can learn from the way Microsoft did DPI, other than how to make applications do more guessing maths.



          • #45
            https://gitlab.freedesktop.org/wayla...e_requests/143

            Here someone mentioned a suggestion of 120 or 360 as the denominator. While the proposal there is of course very different from what I wrote here, 120 has the nice feature of being able to handle DEs that want to let people scale in exact 10% steps as well as those who want "133%" (1 1/3) or "167%" (1 2/3).
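            Why 120 works so well as a denominator is quick to verify with exact fractions (the list of scale steps here is illustrative):

```python
from fractions import Fraction

# Scale factors a desktop environment might plausibly offer, as exact fractions:
candidate_scales = [
    Fraction(11, 10),  # 110%, a 10% step
    Fraction(5, 4),    # 125%
    Fraction(4, 3),    # "133%"
    Fraction(3, 2),    # 150%
    Fraction(5, 3),    # "167%"
    Fraction(9, 5),    # 180%
]

# A common denominator of 120 represents every one of them exactly...
assert all(120 % s.denominator == 0 for s in candidate_scales)

# ...whereas a denominator of 100 cannot represent the thirds:
assert any(100 % s.denominator != 0 for s in candidate_scales)
```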

            The "160dpi" definition of Android can be regarded as a recommendation for phone makers to squeeze roughly 160/96 = 1 2/3 times the desktop PC content density into the phone screen. The system was designed in an era very different from now, when smartphone screens were in general much smaller and screen space much more precious. The first Android phone, the HTC Dream, had a 3.2" screen (about 1x Android pixel density). The classic Android phone whose design many followed for a long time, the Samsung Galaxy S, had a 4.0" screen (about 1.5x Android pixel density). Now almost every new phone screen is 6-inches-plus. Text and images can afford sizes closer to those printed in books. The 1 2/3 times shrinkage recommendation is also only a recommendation that no phone nor user is bound to follow.

            If my proposal were expanded to handle Android devices, we would probably take "96dpi" on PC as 1x and "160dpi" on Android as also 1x, and treat them as equivalent. There is evidence for that too, as both settings recommend the same 48x48 icon size. The "tvdpi" in this regard is just "167%" or 1 2/3x, which is an okay choice for "smart" TVs even if some of those TVs are 1080p 24", as we don't sit as close to TVs as to computers. It is the scaling factor that truly defines things, never the dpi numbers; otherwise my phone browsers outputting any dpi number other than "~457dpi" would be utter nonsense. Right now they are doing 3 1/8x in Chrome and 3 5/32x in Firefox. It turns out infobyip.com computes my phone screen as 1080 wide in Firefox (which is correct) and 1081.25 wide in Chrome (which is silly). Firefox is the more honest browser here~

            p.s. Early Windows GUIs used a 32x32 icon size. We can regard those days as the "72dpi" or 0.75x era even though it was never declared so in the system settings. There could have been an option to auto-scale legacy Windows applications from those days, but programmers at that time chose to apply new internal font sizes and icon sizes manually. I wonder what would happen if that practice were followed in the hiDPI era.
            Last edited by billyswong; 23 June 2022, 04:27 AM.



            • #46
              Originally posted by billyswong View Post
              https://gitlab.freedesktop.org/wayla...e_requests/143

              Here someone mentioned a suggestion of 120 or 360 as the denominator. While the proposal there is of course very different from what I wrote here, 120 has the nice feature of being able to handle DEs that want to let people scale in exact 10% steps as well as those who want "133%" (1 1/3) or "167%" (1 2/3).

              The "160dpi" definition of Android can be regarded as a recommendation for phone makers to squeeze roughly 160/96 = 1 2/3 times the desktop PC content density into the phone screen. The system was designed in an era very different from now, when smartphone screens were in general much smaller and screen space much more precious. The first Android phone, the HTC Dream, had a 3.2" screen (about 1x Android pixel density). The classic Android phone whose design many followed for a long time, the Samsung Galaxy S, had a 4.0" screen (about 1.5x Android pixel density). Now almost every new phone screen is 6-inches-plus. Text and images can afford sizes closer to those printed in books. The 1 2/3 times shrinkage recommendation is also only a recommendation that no phone nor user is bound to follow.

              If my proposal were expanded to handle Android devices, we would probably take "96dpi" on PC as 1x and "160dpi" on Android as also 1x, and treat them as equivalent. There is evidence for that too, as both settings recommend the same 48x48 icon size. The "tvdpi" in this regard is just "167%" or 1 2/3x, which is an okay choice for "smart" TVs even if some of those TVs are 1080p 24", as we don't sit as close to TVs as to computers. It is the scaling factor that truly defines things, never the dpi numbers; otherwise my phone browsers outputting any dpi number other than "~457dpi" would be utter nonsense. Right now they are doing 3 1/8x in Chrome and 3 5/32x in Firefox. It turns out infobyip.com computes my phone screen as 1080 wide in Firefox (which is correct) and 1081.25 wide in Chrome (which is silly). Firefox is the more honest browser here~

              p.s. Early Windows GUIs used a 32x32 icon size. We can regard those days as the "72dpi" or 0.75x era even though it was never declared so in the system settings. There could have been an option to auto-scale legacy Windows applications from those days, but programmers at that time chose to apply new internal font sizes and icon sizes manually. I wonder what would happen if that practice were followed in the hiDPI era.
              https://blog.icons8.com/articles/cho...mat-for-icons/
              Icons are not the best measure because there are designer choices involved. At 96dpi most X11 stuff goes with 64x64 icons. Mac OS goes with 32x32 at 72dpi and 64x64 for most things at 96dpi, but there is a stack of other sizes required depending on design choices.

              It is far better to measure the text height on the screen. On a normal PC monitor the general text size will be 12 point, i.e. 4.2333mm, or close to it (close, because not all monitors have perfect DPI values). This turns out to be true on Windows, Mac OS and X11 desktops today, even with their different icon sizes. Even on the old 72dpi Apple screens, put a ruler to the glass and you measure 12 point again.
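              The millimetre figures above follow directly from the definition of the point (1pt = 1/72 inch):

```python
POINTS_PER_INCH = 72
MM_PER_INCH = 25.4

def pt_to_mm(points):
    """Physical height of a point size: 1pt = 1/72 inch = 25.4/72 mm."""
    return points * MM_PER_INCH / POINTS_PER_INCH

desktop_text = pt_to_mm(12)  # ~4.2333 mm, the desktop figure quoted above
book_minimum = pt_to_mm(10)  # ~3.5278 mm, the smallest common book font
```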

              It would be interesting to measure the text height in, say, the Android settings screen and see how it compares to 12 point. 12 point for general text on a PC monitor was also decided at Xerox PARC. A lot of things were decided at Xerox PARC before Microsoft was even an idea; icon sizes at particular DPIs were not among them.

              Interestingly enough, the feature phone I use has a 9-point font as its standard reading font, about 3.175mm in height; that is about a 1 1/3 scale from your standard desktop PC monitor font heights.

              96/160 gives you 0.6. 9/12 = 0.75, which is my feature phone. 8/12 gives you 0.6666. If the 167% figure is true, I would expect the default text to be 8 point (2.8222mm) in height, or close to it.

              Measuring real-world font height is really about working out what the human is actually seeing. It is useful that standard computer monitor text is 12 point and a standard printed page is 12 point. 96dpi HTML on a PC is meant to display close to 12 point on screen; on a 72dpi screen it can come out oversize.

              Your guess of "160/96 = 1 2/3 times of desktop PC content" could be correct; HTML did have some design consideration for being printed. And we do have a font-height value to work with that is basically a historic constant on paper and on desktop computer monitors.

              Originally posted by billyswong View Post
              It turns out infobyip.com computes my phone screen as 1080 wide in Firefox (which is correct) and 1081.25 wide in Chrome (which is silly). Firefox is the more honest browser here~
              This is just the problem of math rounding errors when you do floating-point maths. Firefox and Chrome are different codebases; you asked both to perform a floating-point operation (the DPI scale calculation for CSS) and one of them gets it slightly wrong. How big an artefact that generates is an unknown. Think about the problem if a website were delivered to you at exactly 1081 pixels wide because that is what Chrome reported: now it does not fit the screen correctly.

              Math errors are why, if you can avoid applications doing fractional scaling, you want to. No matter the toolkit, once you go floating-point/fractional scaling you are going to have rounding errors.
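              One plausible way Chrome's silly 1081.25 could arise (an illustration of this failure mode, not Chrome's actual code) is rounding the CSS width to an integer and then multiplying back by the device pixel ratio:

```python
physical_width = 1080
dpr = 3.125  # the 3 1/8 devicePixelRatio mentioned above

css_width = physical_width / dpr  # 345.6 CSS pixels
reported_css = round(css_width)   # a browser reporting integer CSS px gives 346

# A site multiplying back by dpr reconstructs a non-integer physical width:
reconstructed = reported_css * dpr  # 1081.25, not the true 1080
```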

              With fractional scaling we will have artefacts and we will have blurring. In LibreOffice, MS Office, basically all word processors, take the output scale up and down on the same monitor and watch closely as the fonts are rendered: there are placement artefacts. At an integer scale point there are none.

              I do believe we need fractional scaling. But we also need people to accept that fractional scaling will not be perfect, because it cannot be, no matter how much we wish otherwise. Fractional scaling has never been done anywhere in computing history without artefacts or blurring being a problem, just as integer scaling has an equally long history of making items blocky, pixelated or jagged.

              With scaling we basically have to pick our poison. I would prefer to be able to choose between integer and fractional scaling. Integer fixes the placement artefacts, so when placement is critical, inside the parts of a window, I want integer. When placement is not critical but relative scale to other windows on screen is important, that is fractional scaling. When attempting to fit as much as possible on screen, that is fractional again, at the price of a bit of blur.

              With scaling there is no painless option.



              • #47
                Originally posted by oiaohm View Post
                The Wayland development team presumed that uneven but sharp would be preferred over even but blurry or artefact-ridden; that is why the protocol has integer-only scaling. The reality is that you cannot do fractional scaling without causing problems.
                What they ended up doing, by refusing to deal with a real world full of monitors needing fractional scaling, was effectively choosing the path of even but blurry. Or at least that's what the early Wayland compositors did.

                Originally posted by oiaohm View Post
                No, past here you disappear down a Microsoft-written rabbit hole...
                It doesn't matter that 96dpi monitors existed before Windows. I was discussing where the practice of deliberately lying about monitor dpi began. And you agree with me here.

                Originally posted by oiaohm View Post
                No. Those of us on Linux/Unix in that time frame were very aware that we needed 120dpi on Windows and 96dpi on Linux/BSD/Unix on the same monitor to get basically the same text height in applications.
                This is slowly changing over time, as one can see the new non-Win32 Windows UI usually has a bigger font now. Even the legacy Windows UI quietly switched its font size from "8pt" to "9pt". The Windows UI has become a Frankenstein.

                Originally posted by oiaohm View Post
                Regarding the 1/3 scale-up: you will notice that a lot of word-processing programs attempt to get back to a true 100% size, matching on screen exactly the size that will come out of the printer (you can check how close with a sheet of paper). Then you notice something interesting: look at a 12-point font in a word processor scaled to how it will print. Whether on Windows, Mac OS or an X11 desktop, that 12-point font is basically the size of all the menus at default settings. So Microsoft hacked the DPI numbers, and then application developers hacked it back the other way. The standard recommended size for printed document text (other than books) is 12 point. And here is the wacky part of why it all lines up: if you put a sheet of paper on a desk and read it, most of the time it will turn out to be 74-75cm from your eyes. Think of copy stands: they hold the page next to the monitor at the same distance as the monitor.

                The reality is that reading on a desktop screen and reading on paper are not as different as one would think. The smallest font you will generally find to read in a book is 10 point.
                I have only a very faint memory of the "application developers hacked it back the other way" part now that you mention it. What I remember more strongly is that in the CRT era, people usually zoomed the UI not via font settings or dpi settings but by selecting the screen resolution. One logical inch in the buffer usually didn't equal one physical inch on screen when an application said it was displaying documents at "100%". It was more practical to do some tests on that particular application and figure out the true percentage required than to buy monitors and fix their resolutions the "correct" way. It was the introduction of LCD monitors that forced consumers to take font and dpi settings seriously.

                Originally posted by oiaohm View Post
                Yes, they have worked out this conversion factor without looking at books. Going by books, you want roughly 1.2 times the density of a computer monitor, because paperback books and the like use a 10-point font. 1.8 times basically takes you to a 6-7 point font, which is fine print and hard to read; this is food-ingredient-list territory. Your 1.5 is a more reasonable 8-point font.

                Yes, when you ignore DPI and look at what real-world font size you are actually getting on the output, things get horrible very quickly.
                Speaking of 10pt fonts, this forum sets most of its text at 10.5pt. A 10.5pt font displayed on a mobile phone is of course smaller than 10.5pt, effectively 7pt on my phone when only the 1.5x shrinkage is applied. Under the "default" setting the text would have been below 6pt, unsuitable for any long-form reading. Fortunately the text-size setting also takes effect, further enlarging text independently of the layout scaling.
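
The arithmetic behind that 10.5pt-becomes-7pt claim is simple enough to sketch (the helper name is illustrative, not any real API; it just assumes the phone shrinks the layout by a uniform density factor):

```python
# Illustrative helper (not a real API): the physical size a nominal font
# ends up at when the layout is shrunk by a uniform density factor.
def effective_pt(nominal_pt: float, density_factor: float) -> float:
    return nominal_pt / density_factor

# The forum's 10.5pt body text with the 1.5x shrinkage applied:
print(effective_pt(10.5, 1.5))  # -> 7.0
```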

                Originally posted by oiaohm View Post
                Horrible: it's higher. 600dpi is the start of diminishing returns for items held as close as a phone, and this has held throughout history. Printers making packaging and so on have shown it. At 300dpi it is still quite possible for normal human vision to pick up blur and uneven sharpness. Depending on the upscale/downscale artefact in question, you may need to go all the way to 1200dpi to hide it; this too is documented by people doing printed packaging and the like. There are decades of data here. Yes, those making mobile phones have done a really good job of ignoring all that data. Microsoft also ignored it. Wayland developers were not printing people, so they too have disregarded some of it.
                600dpi is too much of a burden on power usage. Because we can pinch-zoom phone screens, "food ingredient list" text is not a case phone screens need to serve often. If 600dpi were truly the point of diminishing returns, the market would have been flooded with 1440p phones as mid-range and 2160p phones as high end. But so far 1440p is the high end and 1080p is the mid-range. I would rather say the turning point is somewhere between 300dpi and 600dpi, possibly closer to 300dpi. Phone screens at 1080p can claim full capability for playing HDTV, so there is an incentive beyond text clarity to go past 720p.

                The 600dpi rule may hold for printing, though, as printers don't print exact colours and require dithering.

                Comment


                • #48
                  Originally posted by billyswong View Post
                  This is slowly changing over time, as one can see the new non-Win32 Windows UI usually has a bigger font now. Even the legacy Windows UI quietly switched its font size from "8pt" to "9pt". The Windows UI has become a Frankenstein.
                  The Microsoft 1/3 upscaling means that for a long time, what was labelled 8pt measured as 12pt on screen; they mention that in the post. The move to 9pt was then the upscaling slowly being dropped, heading in the direction of closer to real.

                  Originally posted by billyswong View Post
                  I have only a very faint memory of the "application developers hacked it back the other way" part now that you mention it. What I remember more strongly is that in the CRT era, people usually zoomed the UI not via font settings or DPI settings, but by selecting a screen resolution.
                  Except that before LCDs arrived in a big way, CRTs got https://en.wikipedia.org/wiki/Display_Data_Channel. From 1994 new CRTs started reporting their physical size. That was the beginning of software being able to find out the physical screen dimensions, seeing past the select-a-resolution hacks.

                  Originally posted by billyswong View Post
                  Speaking of 10pt fonts, this forum sets most of its text at 10.5pt. A 10.5pt font displayed on a mobile phone is of course smaller than 10.5pt, effectively 7pt on my phone when only the 1.5x shrinkage is applied. Under the "default" setting the text would have been below 6pt, unsuitable for any long-form reading. Fortunately the text-size setting also takes effect, further enlarging text independently of the layout scaling.
                  This is a setting I would love in a mobile phone browser: a minimum on-screen font size. If a font on a website is going to be under it, straight up ask to scale up.

                  Originally posted by billyswong View Post
                  600dpi is too much of a burden on power usage. Because we can pinch-zoom phone screens, "food ingredient list" text is not a case phone screens need to serve often. If 600dpi were truly the point of diminishing returns, the market would have been flooded with 1440p phones as mid-range and 2160p phones as high end. But so far 1440p is the high end and 1080p is the mid-range. I would rather say the turning point is somewhere between 300dpi and 600dpi, possibly closer to 300dpi. Phone screens at 1080p can claim full capability for playing HDTV, so there is an incentive beyond text clarity to go past 720p.
                  There is a difference between what a user can see when asked to look and what they will tolerate. Remember that for years we had printers that only went up to 300dpi. A lot of the shopping receipts you get are printed at 300dpi.

                  Originally posted by billyswong View Post
                  The 600dpi rule may hold for printing, though, as printers don't print exact colours and require dithering.
                  Those who did the work on printers were also doing things like film and OHP projection: yes, printed film and printed OHP transparencies. Printing covers a very broad field. Remember, some of your OLED screens in fact come off printers.

                  You are right that, on one hand, users stop caring about the difference at 300dpi. But if you seriously set a person up with a side-by-side comparison, it is 600dpi; and the printing people tested back-lit items as well as paper-backed ones. The detection threshold falls somewhere between 550dpi and 600dpi when you ask the person to look closely for a difference without any aids. When normally using a phone you are not looking that closely. 300-400dpi is the point where people who have not been told to look closely stop noticing the difference unless the artefact is particularly bad.

                  At 600dpi you get away with all forms of scaling artefacts almost 100 percent of the time on close-up items. At 300-400dpi you get away with artefacts about 70% of the time. There is a huge jump from 150 to 300: at 150dpi on a close-up item, users fail to notice only about 10-15% of scaling artefacts.

                  The number is higher than where you were guessing, but at 300dpi you were not completely out of the ball park. 300dpi is the point on the diminishing-returns curve where you are over the turning point and having to put in ever more resources for each improvement. 600dpi is when you basically hit rock bottom for an item in hand. Printing at 700+dpi on package labelling, unless you have insanely small print that you expect a person to read with a magnifying glass, is a waste of printing time and ink.

                  600dpi is the negative-returns point for hand-held items. 250dpi is your 50% point, and there is a huge change between 250 and 300dpi: about 20 percentage points of detection over a 50dpi change. A further 30-point detection change is spread over the 300dpi from 300dpi to 600dpi, with more of it closer to 300dpi. Basically, this is a curve.

                  I should have been more exact about this: 600dpi is, for an item in hand, basically the absolute diminishing-returns point. That is where you are not gaining anything at all and are just wasting money.

                  300dpi is the start of where it can be good enough. So production cost, power usage and screen quality all come into play here. 600dpi being the true absolute end point for visual quality does not mean the market will be flooded with it.

                  The thing to remember is that most items in most markets are not the absolute best. Generally the market-dominant product is a good-enough one, somewhere between 60 and 80 percent of the best-quality version. So 300dpi, at about 70% for a hand-held item, lands exactly in the good-enough zone. Not the best quality, but good enough.
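
Taking the figures in this post at face value (they are the poster's own estimates, not published measurements), the tolerance curve being described can be sketched as a piecewise-linear interpolation:

```python
# Anchor points taken from the post above: (dpi, % of scaling artefacts
# that go unnoticed on a close-up, hand-held item). These are the poster's
# own estimates, not measured data.
CURVE = [(150, 12), (250, 50), (300, 70), (600, 100)]

def unnoticed_pct(dpi: float) -> float:
    """Piecewise-linear interpolation of the artefact-tolerance curve."""
    if dpi <= CURVE[0][0]:
        return float(CURVE[0][1])
    if dpi >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
        if x0 <= dpi <= x1:
            return y0 + (y1 - y0) * (dpi - x0) / (x1 - x0)

print(unnoticed_pct(300))  # -> 70.0
```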

                  Comment


                  • #49
                    Originally posted by oiaohm View Post
                    Except that before LCDs arrived in a big way, CRTs got https://en.wikipedia.org/wiki/Display_Data_Channel. From 1994 new CRTs started reporting their physical size. That was the beginning of software being able to find out the physical screen dimensions, seeing past the select-a-resolution hacks.
                    DDC is a good tool for the OS to detect and select a default resolution. But I don't think it was generally available to user applications in the Win9x era. I did a search, and the various Win32 APIs for detecting monitor information and capabilities started in Win2k and Vista. On Win9x, applications may have needed to talk to the graphics card directly for DDC.

                    I don't even know whether DDC is a good tool for the OS to learn the physical size of a monitor. I suspect monitors lie here too, or the mainstream 1080p 15" laptops nowadays wouldn't get a 125% "recommendation" in the Windows monitor DPI setting. They should have been 150%.
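
That 125%-vs-150% arithmetic can be checked directly. A minimal sketch, assuming Windows offers scale factors in 25% steps relative to a 96dpi baseline and rounds the raw ratio to the nearest step (the helper names are illustrative):

```python
import math

def dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density from resolution and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def windows_scale_step(d: float, base: float = 96.0, step: float = 0.25) -> float:
    # Assumption: 25% steps above a 96dpi baseline, rounded to the nearest.
    return round(d / base / step) * step

d = dpi(1920, 1080, 15.0)                # ~146.9 dpi for a 15" 1080p panel
print(round(d), windows_scale_step(d))   # -> 147 1.5 (i.e. 150%, not 125%)
```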

                    Originally posted by oiaohm View Post
                    This is a setting I would love in a mobile phone browser: a minimum on-screen font size. If a font on a website is going to be under it, straight up ask to scale up.
                    Because pinch-zoom on a phone is easy, only long paragraphs of text cause real inconvenience when too small: pinch-zooming them forces you to slide left and right constantly to read the paragraph. Short text doesn't have that concern.

                    Google Chrome does have a minimum font size setting on Android. Not all text on web pages is subject to it, and the magical detection may or may not match one's wishes. Firefox on Android, meanwhile, allows a font-size override independent of the OS font and DPI settings, but it enlarges every piece of text on screen.

                    Originally posted by oiaohm View Post
                    There is a difference between what a user can see when asked to look and what they will tolerate...
                    There's nothing wrong with your knowledge, but the "diminishing returns" I understand is different from how you use the term. To me, diminishing returns begin where the benefit starts plateauing, not where the benefit reaches the plateau.

                    Comment


                    • #50
                      Originally posted by billyswong View Post
                      DDC is a good tool for the OS to detect and select a default resolution. But I don't think it was generally available to user applications in the Win9x era. I did a search, and the various Win32 APIs for detecting monitor information and capabilities started in Win2k and Vista. On Win9x, applications may have needed to talk to the graphics card directly for DDC.
                      On Windows 95 the APIs were not standard Win32; Windows 9x and Windows NT at the time had totally different ways of getting the DDC information. Windows 95 was the first to ship a VESA driver with DDC support from Microsoft. Third-party drivers exposed the same driver-call interfaces on Windows 95 as the provided VESA driver, because that was how Microsoft's documentation told you to write graphics drivers for Windows 9x. Yes, horrible. The good part is that applications were expected to fall back to guessing monitor size from the resolution, and this too was in Microsoft's books on programming for Windows 95 onward. So yes, it was direct calls to the driver through particular interfaces on 95; the good part is that those interfaces were stable from 95 through ME, then superseded by a Win32 API so you no longer needed direct-to-driver calls.

                      Of course, not all applications in the Windows 95/98/ME time frame checked for the real screen size. This is why the standard "I will just change the screen resolution to change the scale" trick ran into problems: the applications of that era that did check would not scale that way. So the majority of applications scaled by the resolution guess in the 9x time frame, but there was a minority that scaled by the DDC EDID information.

                      The Windows 95 push for plug-and-play monitors meant the information would come over DDC, and if it did not, the OS fell back to guessing from the resolution setting.

                      Originally posted by billyswong View Post
                      I don't even know whether DDC is a good tool for the OS to learn the physical size of a monitor. I suspect monitors lie here too, or the mainstream 1080p 15" laptops nowadays wouldn't get a 125% "recommendation" in the Windows monitor DPI setting. They should have been 150%.
                      On directly connected displays you can skip having a proper DDC chip. A lot of those laptops that don't scale also don't have a scaler, so if you feed them a non-native resolution they don't work right either. That is why you find vendors shipping drivers that install an INF file carrying an EDID file, to give Windows the EDID information without using DDC to fetch it: https://docs.microsoft.com/en-us/win...-monitor-edids . Welcome to the winmodems of screen detection.

                      Yes, some vendors manage to ship laptops with the wrong model's monitor INF installed (so, wrong EDID information). Windows then has the wrong mm x mm for the screen, hence the wrong DPI, causing that problem as well. People go mucking around overriding DPI and so on when the simple fix is to install the correct INF so Windows has the correct EDID information and scales right, and then curse the vendor. People fail to check whether the OS is reporting the right x/y mm for their screen before messing with DPI settings; a lot of the laptops with the problem you wrote about have defective EDID information, and the panel does not in fact have a DDC chip.

                      Optional parts and cost cutting for you.
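
For the curious, the physical size the OS gets (or fails to get) lives in two bytes of the EDID block: per the EDID 1.3 layout, byte 21 holds the maximum horizontal image size and byte 22 the vertical size, both in centimetres, and a cost-cut panel simply leaves them zero. A minimal sketch of reading them (the helper `edid_size_mm` is illustrative, not any real OS API):

```python
# Hypothetical helper, not a real OS API: pull the physical panel size out
# of a raw EDID block. Bytes 21/22 store the image size in centimetres;
# zeros mean "size not reported", the cost-cutting case described above.
EDID_MAGIC = b"\x00\xff\xff\xff\xff\xff\xff\x00"

def edid_size_mm(edid: bytes):
    if len(edid) < 23 or edid[:8] != EDID_MAGIC:
        raise ValueError("not a valid EDID block")
    w_cm, h_cm = edid[21], edid[22]
    if w_cm == 0 or h_cm == 0:
        return None  # unknown: the OS must fall back to guessing by resolution
    return (w_cm * 10, h_cm * 10)

# A 15.6" panel reports roughly 34 cm x 19 cm (EDID only stores whole cm):
fake = bytearray(128)
fake[0:8] = EDID_MAGIC
fake[21], fake[22] = 34, 19
print(edid_size_mm(bytes(fake)))  # -> (340, 190)
```

On Linux the raw block is readable from `/sys/class/drm/*/edid`, which makes it easy to check whether a panel is reporting a sane size before blaming the DPI settings.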

                      Originally posted by billyswong View Post
                      Because pinch-zoom on a phone is easy, only long paragraphs of text cause real inconvenience when too small: pinch-zooming them forces you to slide left and right constantly to read the paragraph. Short text doesn't have that concern.
                      That is why I don't see pinch-zoom as a solution.

                      Originally posted by billyswong View Post
                      Google Chrome does have a minimum font size setting on Android. Not all text on web pages is subject to it, and the magical detection may or may not match one's wishes. Firefox on Android, meanwhile, allows a font-size override independent of the OS font and DPI settings, but it enlarges every piece of text on screen.
                      I did not know Chrome has a minimum. Being auto-magic, where you cannot say no, is a problem, because some sites have really small text you don't actually want to read. This is one of those things that needs to be user-controllable on a site-by-site basis. The first time a user visits a site with really small text, telling them that it might be unreadable, and where the adjustment is, would be useful.

                      Basically a zoom button: click/tap it and the browser looks at the site, finds the smallest text, and scales accordingly.

                      https://www.w3schools.com/css/css_font_size.asp

                      Font size in HTML as defined in CSS is a total curse. You can scale a font up and down with the screen width, so this is a problem needing some wacky workarounds, because what displays right on larger screens will not on smaller ones.
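
A quick illustration of that curse: the CSS `vw` unit resolves to a percentage of the viewport width, so the same declaration yields wildly different text sizes on different screens. A sketch of the resolution rule, assuming nothing beyond 1vw = 1% of viewport width:

```python
def vw_font_px(vw_value: float, viewport_width_px: int) -> float:
    """CSS 'font-size: Nvw' resolves to N% of the viewport width, in px."""
    return vw_value / 100 * viewport_width_px

# The same 2vw declaration gives readable text on a desktop window but
# unreadably small text on a narrow phone viewport:
print(vw_font_px(2, 1920))  # ~38.4 px
print(vw_font_px(2, 360))   # ~7.2 px
```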

                      Originally posted by billyswong View Post
                      There's nothing wrong with your knowledge, but the "diminishing returns" I understand is different from how you use the term. To me, diminishing returns begin where the benefit starts plateauing, not where the benefit reaches the plateau.
                      This is like the three averages. I should have stated that 600dpi is the end of returns. So for a hand-held device used directly by the human eye, up to 600dpi makes sense; past that it does not make sense at all.

                      At least from all those people working on printing for packaging and so on, we know where the end of returns is for different use cases and what the curve looks like for each. The horrible part is that this information is not making it into application design much.

                      Comment
