Fedora Enables ClearType Subpixel Font Rendering Thanks To Microsoft

  • #11
    Originally posted by GrayShade View Post
    Isn't software moving away from sub-pixel to grayscale antialiasing?
    The math for that is pretty simple; the only catch is that "moving away" isn't a 1-or-0 type of answer.

    If you can discern separate pixels on screen, you more or less need subpixel rendering. As soon as your DPI is high enough that you can't discern them, subpixel rendering will only produce worse results, wasting a lot of work blurring what would be clear anyway.

    Subpixel rendering will stay the default pretty much as long as the majority of users' monitors don't have a high enough DPI.
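
    To put numbers on "high enough": a quick sketch (my own arithmetic, not from the thread) using the common ~1 arcminute figure for 20/20 visual acuity; the default viewing distance is an assumption.

    Code:
    #!/bin/sh
    # Estimate the DPI above which a 20/20 eye can no longer resolve
    # individual pixels at a given viewing distance (in inches).
    distance_in=${1:-24}   # assumed typical desktop viewing distance
    awk -v d="$distance_in" 'BEGIN {
        pi = 3.14159265358979
        # pixel pitch that subtends 1 arcminute at distance d
        # (small-angle approximation: tan(x) ~= x)
        pitch = d * (pi / (180 * 60))
        printf "At %s in, pixels blend together above ~%.0f DPI\n", d, 1 / pitch
    }'

    At a 24-inch viewing distance this comes out to roughly 143 DPI.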

    Comment


    • #12
      Originally posted by kokoko3k View Post
      So you're calling everyone who prefers the new interpreter an insane idiot... fine.
      The new antialiasing method looks great on high-DPI screens (where you can actually skip subpixel antialiasing entirely) but looks completely awful on ordinary monitors. This sentiment is shared across many communities and forums. Perhaps "insane idiots" wasn't the best moniker; "the filthy rich who can afford high-DPI monitors/laptops" would have fit better.

      Comment


      • #13
        Originally posted by birdie View Post
        This is not enough, as you still need sane defaults, and thanks to the idiocy of new FreeType releases you also have to have this:

        Code:
        # /etc/profile.d/freetype.sh
        export FREETYPE_PROPERTIES=truetype:interpreter-version=35
        Otherwise you'll have something a tad too blurry, dirty and bold by default.
        I just tested both, and there's no difference (both produce the same pixels). This is on a FreeType 2.9.1 build with FT_CONFIG_OPTION_SUBPIXEL_RENDERING and TT_CONFIG_OPTION_SUBPIXEL_HINTING 2.

        The test is to start kwrite (the KDE text editor) twice:

        Code:
        $ kwrite
        $ env FREETYPE_PROPERTIES="truetype:interpreter-version=35" kwrite

        Both have exactly the same font rendering.
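
        For anyone who wants to reproduce this without eyeballing two windows, here is a rough sketch using ImageMagick (an assumption on my part: it renders text through FreeType, and the font name and sample text are arbitrary):

        Code:
        #!/bin/sh
        # Render the same label with the default interpreter (v40 since
        # FreeType 2.7) and with v35, then diff the two images pixel by pixel.
        convert -font DejaVu-Sans -pointsize 12 label:'game' v40.png
        env FREETYPE_PROPERTIES="truetype:interpreter-version=35" \
            convert -font DejaVu-Sans -pointsize 12 label:'game' v35.png
        # "compare -metric AE" prints the number of differing pixels;
        # 0 means both interpreter versions produced identical output.
        compare -metric AE v40.png v35.png diff.png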
        Last edited by RealNC; 31 October 2018, 10:41 AM.

        Comment


        • #14
          I went to great lengths in the past to get fonts to render acceptably on Linux. Recent versions of Freetype haven't required any tweaking beyond making sure subpixel is enabled in whatever tool the desktop provides. I remain unconvinced about grayscale.
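
          For reference, on the GNOME that Fedora 29 ships (3.30) the same switch can be flipped without a GUI tool; a sketch assuming the pre-3.38 gsettings schema, which is what Tweaks writes to:

          Code:
          #!/bin/sh
          # Enable subpixel (RGB) antialiasing and slight hinting.
          gsettings set org.gnome.settings-daemon.plugins.xsettings antialiasing 'rgba'
          gsettings set org.gnome.settings-daemon.plugins.xsettings hinting 'slight'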

          Font rendering in up-to-date distributions is good here on a 27-inch 2560-pixel display. I have a 4096-pixel display on a shelf because I've been less than happy with integer-only HiDPI scaling. I may drag it down and see what fractional scaling on Fedora 29 Wayland looks like. But that's not a FreeType thing.

          Comment


          • #15
            I looked at the ClearType patents listed on the page and I'm pretty confident that over half of them have already expired. You can verify this with online patent term calculators.
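
            The arithmetic is simple enough to do locally, too. A back-of-the-envelope sketch with GNU date; the filing date below is a placeholder, not one of the actual ClearType patents, and term adjustments are ignored (US utility patents filed after June 1995 generally expire 20 years from the earliest filing date):

            Code:
            #!/bin/sh
            # Placeholder filing date; substitute the real dates from the page.
            filing_date="1998-10-07"
            date -d "$filing_date + 20 years" +%Y-%m-%d   # nominal expiry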
            Last edited by caligula; 31 October 2018, 10:58 AM.

            Comment


            • #16
              Originally posted by Zyklon View Post
              Does this make a difference on UHD screens?
              Depends on how big your text is and where you're sitting. If you're scaling to 2x (so the angular area of a pixel is quartered), it's generally not necessary, but it doesn't hurt much, aside from *some* performance loss, which in my opinion is negligible, though people differ on that.
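
              To attach numbers to the 2x case (the panel size is my own assumption, not from the thread):

              Code:
              #!/bin/sh
              # Physical DPI of a hypothetical 27" 3840x2160 panel,
              # and the logical DPI once everything is scaled 2x.
              awk 'BEGIN {
                  w = 3840; h = 2160; diag_in = 27
                  dpi = sqrt(w * w + h * h) / diag_in
                  printf "physical: ~%.0f DPI, logical at 2x: ~%.0f DPI\n", dpi, dpi / 2
              }'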

              Comment


              • #17
                Wasn't this already included in Ubuntu?

                Comment


                • #18
                  Originally posted by RealNC View Post
                  I just tested both, and there's no difference (both produce the same pixels.)
                  OK, it seems it doesn't work when using "slight" hinting in fontconfig. You need to switch to "full" hinting, and then truetype:interpreter-version=35 takes effect. However, this produces horrible results. The word "game", for example, looks normal by default, but when switching to v35 it is rendered as "ga me" (there's too much space between the "a" and the "m").

                  So to me this looks like it makes things worse, not better...
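
                  A plausible explanation for the "slight" case (my reading, not confirmed in the thread): with slight hinting the glyphs are not processed by the TrueType bytecode interpreter at all, so interpreter-version has nothing to act on. For anyone wanting to reproduce the "full" hinting result, a sketch of the per-user fontconfig override; back up any existing fonts.conf first:

                  Code:
                  #!/bin/sh
                  # Write a per-user fontconfig file forcing full hinting.
                  mkdir -p ~/.config/fontconfig
                  cat > ~/.config/fontconfig/fonts.conf <<'EOF'
                  <?xml version="1.0"?>
                  <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
                  <fontconfig>
                    <match target="font">
                      <!-- "hintfull" lets interpreter-version take effect -->
                      <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
                    </match>
                  </fontconfig>
                  EOF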

                  Comment


                  • #19
                    Well, the trap is set up...

                    Comment


                    • #20
                      Subpixel font rendering is actually a very problematic story. The issue is that there is no single style of font rendering that is universally preferred. Windows users detest Mac rendering and vice versa; it's a matter of what you're used to. There is no real truth here. More on this topic:

                      [Linked article excerpt:] I've finally determined What's Wrong With Apple's Font Rendering. As it turns out, there actually wasn't anything wrong with Apple's font rendering, per se. Apple simply chose a different font rendering philosophy; as Joel Spolsky explains, Apple generally believes the goal of the algorithm should be to preserve the design of the typeface as much as possible, even at the cost of a little bit of blurriness.

                      The topic is becoming moot as we approach higher-density screens. The only thing that is certain is that no one can claim the best rendering algorithm on low-PPI screens (let's say under 140 PPI). It's in the eye of the beholder.

                      Comment
