GTK 4.14 To Provide Crisper Font Rendering, Better Fractional Scaling


  • #41
    Originally posted by You- View Post
    As for "being slower", that is reaching 1300fps in the benchmarks instead of 2000fps. It should make no difference to applications. Where it does make a difference, the reason needs to be identified and fixed. Its how development happens.
    wrong, read the thread and check the links in the first post. Even basic window resizing is extremely laggy, even on 11th gen Intel iGPUs. This new renderer is broken, GTK devs live in a fantasy land where everything they do is perfect.



    • #42
      Originally posted by devling View Post

      I have to admit that macOS has extremely good font rendering, and has had for many years. I'm not sure exactly what they are doing, but it looks noticeably better than any Windows or Linux installation I've seen so far. So I'd say it is solved: do what Apple does and you're good. I'm no Apple fan by any means, but this one thing they got nailed down really well!
      Font rendering on macOS is pure shit. Trust me, I've had my dose. Fonts look great on their HiDPI screens, of course! But just try to connect a normal-DPI monitor as a secondary screen and you can clearly see just how bad their font rendering actually is; it's just not visible on HiDPI screens. My employer handed out HiDPI screens (not Apple's, because they really are not worth their price) to us without any questions asked; that's how grave it is.



      • #43
        Originally posted by devling View Post

        I have to admit that macOS has extremely good font rendering, and has had for many years. I'm not sure exactly what they are doing, but it looks noticeably better than any Windows or Linux installation I've seen so far. So I'd say it is solved: do what Apple does and you're good. I'm no Apple fan by any means, but this one thing they got nailed down really well!
        Not only does current-gen macOS lack subpixel rendering, they disable hinting too, and even before they dropped subpixel rendering they had already mostly disabled hinting. Here is a simple fact: macOS has ALWAYS had the absolute WORST on-screen text rendering. They have made a choice that text should look like print, at the cost of readability and sharpness on screen. This defect is mostly masked these days by high-resolution screens that hide how blurry their fonts are, but it is still technically terrible.



        • #44
          I prefer the way macOS, and to a certain extent GTK/GNOME, handles text rendering: retaining font shape over snapping to physical (sub)pixels. So it's a bit unfair to say "worst" when it's more of a subjective/philosophical choice.



          • #45
            Originally posted by access View Post
            I prefer the way macOS, and to a certain extent GTK/GNOME, handles text rendering: retaining font shape over snapping to physical (sub)pixels. So it's a bit unfair to say "worst" when it's more of a subjective/philosophical choice.
            If you prefer to look at blurry characters, where your eyes keep trying to bring them "into focus", although of course destined to fail because the characters themselves are blurry, and can do so without ruining your eyesight, more power to you.

            I prefer super-crisp, sharp characters and would even disable any antialiasing as long as the font is crystal clear.

            Windows is horrible, Mac even worse (if not HiDPI; although technically it is still horrible even then, you just don't notice it in normal text, but very much in structured text). Linux with Qt etc. and full hinting, however, is great.

            Only with hinting are the characters actually identical; without it they get smeared across the pixel raster: IIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIII
            On the Mac here I can see that all the "I"s have a different width, even on HiDPI.
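            For anyone who wants to see this directly, here is a minimal sketch using FreeType's C API: it renders the same glyph at four fractional pen positions with the hinter disabled and prints the resulting bitmap sizes, which typically differ from offset to offset, exactly the smearing described above. The font path is an assumption; point it at any TTF on your system.

            Code:
            /* subpixel_demo.c: render 'I' at fractional x offsets with the
             * hinter disabled; each offset typically yields a different bitmap.
             * Build: cc subpixel_demo.c $(pkg-config --cflags --libs freetype2) */
            #include <ft2build.h>
            #include FT_FREETYPE_H
            #include <stdio.h>

            int main(void)
            {
                FT_Library lib;
                FT_Face face;
                if (FT_Init_FreeType(&lib) ||
                    FT_New_Face(lib, "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf",
                                0, &face)) {
                    fprintf(stderr, "FreeType init or font load failed\n");
                    return 1;
                }
                FT_Set_Char_Size(face, 0, 12 * 64, 96, 96); /* 12 pt at 96 DPI */

                for (int i = 0; i < 4; i++) {
                    /* shift the outline by i/4 px; 26.6 fixed point, so 16 = 1/4 px */
                    FT_Vector delta = { i * 16, 0 };
                    FT_Set_Transform(face, NULL, &delta);
                    FT_Load_Char(face, 'I', FT_LOAD_RENDER | FT_LOAD_NO_HINTING);
                    printf("offset %d/4 px: bitmap %u x %u\n", i,
                           face->glyph->bitmap.width, face->glyph->bitmap.rows);
                }
                /* With hinting, pen positions get snapped to integers, so every
                 * instance of the glyph produces the identical bitmap. */
                FT_Done_Face(face);
                FT_Done_FreeType(lib);
                return 0;
            }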
            Last edited by reba; 08 March 2024, 07:24 AM.



            • #46
              Originally posted by jabl View Post

              Apple assumes everyone has a HiDPI screen and acts accordingly. As others mentioned, they have removed subpixel rendering from macOS. If you can't afford a HiDPI screen, you're not worthy of being an Apple customer. Go back to Windows, peon.

              Originally posted by reba View Post

              Font rendering on macOS is pure shit. Trust me, I've had my dose. Fonts look great on their HiDPI screens, of course! But just try to connect a normal-DPI monitor as a secondary screen and you can clearly see just how bad their font rendering actually is; it's just not visible on HiDPI screens. My employer handed out HiDPI screens (not Apple's, because they really are not worth their price) to us without any questions asked; that's how grave it is.
              Originally posted by carewolf View Post

              Not only does current-gen macOS lack subpixel rendering, they disable hinting too, and even before they dropped subpixel rendering they had already mostly disabled hinting. Here is a simple fact: macOS has ALWAYS had the absolute WORST on-screen text rendering. They have made a choice that text should look like print, at the cost of readability and sharpness on screen. This defect is mostly masked these days by high-resolution screens that hide how blurry their fonts are, but it is still technically terrible.
              Their font rendering isn't bad per se, as they don't really do anything Apple-specific with the fonts: they just take the glyphs and output them on the screen as-is, so the glyphs simply align themselves to the screen's pixels. That means that in order to keep their proper shape, they need a pixel grid dense enough that they will not get bent and distorted out of proportion.

              On a lower DPI screen (i.e. on most screens on the market since the dawn of time) this would look atrocious, and that's why antialiasing and hinting and subpixel rendering were created. But to be fair, Apple do have total control of the hardware that their software will run on, and they do provide you with the HiDPI screens you need to enjoy their font rendering style, and that is a conscious technological and marketing choice, not a "solution" for a "defect". And of course most Apple customers would never deign to connect a non-Apple, low-DPI external screen to their precious Mac, and even if they did, they would blame the non-Apple screen for being of low quality and destroying the beautiful fonts of their Mac. There's a reason Apple and their user base have always been perceived as a cult.

              Originally posted by access View Post
              I prefer the way macOS, and to a certain extent GTK/GNOME, handles text rendering: retaining font shape over snapping to physical (sub)pixels. So it's a bit unfair to say "worst" when it's more of a subjective/philosophical choice.
              This "philosophical choice" can only work if you have a HiDPI screen, otherwise fonts will look more than hideous. Apple can (and do) make the choice to disable subpixel rendering because they are a hardware vendor and thus they can (and do) provide you with the required HiDPI screens, that's what you pay them big $$$ for. Gnome and GTK, on the other hand, are just software vendors and thus have NO control over what kind of screens their users will be buying and using, which means that if they want to be taken seriously, they need to get off their high horse, understand that they are NOT and will NEVER be Apple, and provide their users with sane options for all common use cases, especially the most common case of them all: low DPI screens.

              Their stance is even more ridiculous when you realize that subpixel rendering is toggleable. Providing subpixel rendering doesn't mean that the OS (or the fonts themselves) can't disable it at higher DPIs, which means that there really is no valid reason for their refusal to provide it, other than stupidity/stubbornness, technical incapability, or probably a mix of both.
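              For what it's worth, that toggle already exists in the Linux stack: toolkits traditionally ask fontconfig for the effective antialias/hinting/rgba settings, and a desktop environment can answer differently per output. Here is a minimal C sketch of such a query; the per-DPI policy in the comment is my illustration, not anything GTK is confirmed to do:

              Code:
              /* fc_query.c: ask fontconfig for the effective AA/hinting/subpixel
               * settings, i.e. the knobs a DE can flip per display.
               * Build: cc fc_query.c $(pkg-config --cflags --libs fontconfig) */
              #include <fontconfig/fontconfig.h>
              #include <stdio.h>

              int main(void)
              {
                  FcInit();
                  FcPattern *pat = FcPatternCreate();
                  FcConfigSubstitute(NULL, pat, FcMatchPattern);
                  FcDefaultSubstitute(pat);

                  FcResult res;
                  FcPattern *match = FcFontMatch(NULL, pat, &res);

                  int rgba = FC_RGBA_UNKNOWN;
                  FcBool aa = FcTrue, hinting = FcTrue;
                  FcPatternGetInteger(match, FC_RGBA, 0, &rgba);
                  FcPatternGetBool(match, FC_ANTIALIAS, 0, &aa);
                  FcPatternGetBool(match, FC_HINTING, 0, &hinting);

                  /* FC_RGBA_NONE means "grayscale only"; FC_RGBA_RGB/BGR/VRGB/VBGR
                   * pick a subpixel layout. A DE could set NONE on its HiDPI
                   * outputs and RGB on a low-DPI external monitor. */
                  printf("antialias=%d hinting=%d rgba=%d\n", aa, hinting, rgba);

                  FcPatternDestroy(match);
                  FcPatternDestroy(pat);
                  FcFini();
                  return 0;
              }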



              • #47
                Originally posted by Nocifer View Post
                On a lower DPI screen (i.e. on most screens on the market since the dawn of time) this would look atrocious, and that's why antialiasing and hinting and subpixel rendering were created. But to be fair, Apple do have total control of the hardware that their software will run on, and they do provide you with the HiDPI screens you need to enjoy their font rendering style, and that is a conscious technological and marketing choice, not a "solution" for a "defect".
                Apple had this kind of rendering (though with subpixel rendering, but still no hinting) _long_ before they had HiDPI screens. And Macs were historically lower-res than PCs, defaulting to 72 DPI where the standard for PCs was 96 DPI (which also made it much easier for them to implement 2x HiDPI, only needing 2 × 72 = 144 DPI, where Windows and Linux PCs had to get much higher-res screens or use fractional scaling). So Apple have always had a policy of preferring blurry screen text that looked more like print over sharp text that was easy to read on the computer. And Apple did invent many of the font technologies involved; they actually shaped text rendering on many fronts. They just made choices that made their text look good if you squinted or needed it to look like it would later in print, but bad if you needed to use it all day on screen.
                Last edited by carewolf; 08 March 2024, 08:15 AM.



                • #48
                  Still sounds like preferences and philosophies. BTW I use Debian testing with GNOME 45, grayscale (non-subpixel) AA without hinting, all day long on a couple of 32" 4K screens running native res without problems, which works for my eyes and preferences. I find Windows ClearType font rendering way too jagged and fringed to be enjoyable.

                  ... also looks like OLED and other esoteric subpixel layouts are not a solved issue on Windows if this issue is to be believed:

                  [Linked GitHub issue preview: "NOTE: I do paid work with display manufacturers. Repost of incorrectly-closed github item that someone else posted: ClearType alters anti-aliasing assum..."]


                  which led me to this stuff:

                  [Linked GitHub repo: bp2008/BetterClearTypeTuner, "A better way to configure ClearType font smoothing on Windows 10."]


                  and apparently the main monitor dictates the ClearType algorithm (which would be quite jarring for my setup, where the laptop display is OLED and the other two are TFT; again, personal circumstances).

                  So it looks like a technical and philosophical quagmire whichever way you look at it, and I understand why GTK might want to do as Apple does and just go with the good-enough (for various levels of "good enough") solution of grayscale-only AA.



                  • #49
                    Originally posted by access View Post
                    Subpixel AA brings a shitload of complexity though...
                    Damn right it's complex. But consider all of the vast complexity involved in io_uring, or eBPF, or JavaScript engines, or a 3D game engine that uses Vulkan.

                    Hour for hour, eyeball for eyeball, 2D text rendering is one of the most important things a client machine does. It's worthwhile to spend as much complexity as necessary to provide the maximum amount of legibility and beauty for the largest fraction of the display market and install base (as it actually is, not as you wish it were).

                    and not always with better results, especially with non-traditional subpixel layouts. How well does e.g. ClearType work with BGR or WRGB OLED?
                    You're mostly thinking of old-school subpixel AA, from before it was understood that you don't actually gain resolution along the subpixel axis: you still have to obey the Nyquist sampling criterion, and what subpixel AA really does is correct for the predictable convergence error in the output hardware. See the FreeType documentation about the Harmony renderer.

                    To be fair, WRGB is still a problem, because the subpixel offsets probably vary with saturation and luminance.

                    IIRC it brings a performance penalty too as it trashes the glyph caches.
                    AIUI, GTK has already chosen to 4x the glyph cache footprint for sub-pixel horizontal positioning. Instead of doing that, why not rasterize at physical subpixel offsets and swizzle the color channels based on character position? That way they'd only be 3x-ing the glyph cache.
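                    To make that concrete, here is a rough sketch of the per-channel ("Harmony"-style) idea in C with FreeType: gray-render the same glyph once per channel with the outline shifted to where that channel's subpixel physically sits, then use each pass as that channel's coverage. The font path, the 'e' glyph, and the offset signs are assumptions for illustration; this is the concept, not FreeType's or GTK's actual implementation.

                    Code:
                    /* harmony_sketch.c: one gray rendering per channel at that
                     * channel's physical x offset (assumes a horizontal RGB stripe;
                     * BGR or other layouts would only change the offset table).
                     * Build: cc harmony_sketch.c $(pkg-config --cflags --libs freetype2) */
                    #include <ft2build.h>
                    #include FT_FREETYPE_H
                    #include <stdio.h>

                    #define W 64
                    #define H 64

                    int main(void)
                    {
                        FT_Library lib;
                        FT_Face face;
                        if (FT_Init_FreeType(&lib) ||
                            FT_New_Face(lib, "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf",
                                        0, &face))
                            return 1;
                        FT_Set_Char_Size(face, 0, 12 * 64, 96, 96);

                        /* R sits ~1/3 px left of pixel center, B ~1/3 px right, so
                         * shift the outline the opposite way per pass (sign
                         * convention assumed). 26.6 fixed point: 64/3 ~= 21. */
                        static const FT_Pos shift[3] = { 21, 0, -21 };
                        static unsigned char rgb[H][W][3];

                        for (int c = 0; c < 3; c++) {       /* 0 = R, 1 = G, 2 = B */
                            FT_Vector delta = { shift[c], 0 };
                            FT_Set_Transform(face, NULL, &delta);
                            if (FT_Load_Char(face, 'e', FT_LOAD_RENDER | FT_LOAD_NO_HINTING))
                                return 1;
                            FT_Bitmap *bm = &face->glyph->bitmap;
                            /* copy this pass's coverage into its own channel
                             * (alignment via bitmap_left/top elided for brevity) */
                            for (unsigned y = 0; y < bm->rows && y < H; y++)
                                for (unsigned x = 0; x < bm->width && x < W; x++)
                                    rgb[y][x][c] = bm->buffer[y * bm->pitch + x];
                        }
                        /* rgb[][][] now holds per-channel-positioned coverage: the
                         * same nominal resolution as gray AA, but each channel is
                         * sampled where its subpixel actually is. Rotating the
                         * shift table per character position is the "swizzle"
                         * idea above: 3x the cache instead of 4x. */
                        printf("rendered 'e' with per-channel offsets\n");
                        FT_Done_Face(face);
                        FT_Done_FreeType(lib);
                        return 0;
                    }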



                    • #50
                      Originally posted by yump View Post
                      You're mostly thinking of old-school subpixel AA, from before it was understood that you don't actually gain resolution along the subpixel axis: you still have to obey the Nyquist sampling criterion, and what subpixel AA really does is correct for the predictable convergence error in the output hardware. See the FreeType documentation about the Harmony renderer.

                      To be fair, WRGB is still a problem, because the subpixel offsets probably vary with saturation and luminance.
                      Thanks for the link, interesting read. 👍

