Why oh why is the TrueType bytecode interpreter turned off?

Posted in macOS, edited January 2014
I'm not a fan of anti-aliasing. It tends to make everything blurry as heck. So I have used that defaults write line to stop Safari from anti-aliasing text 24 points and under. It still does it sometimes for certain 'system' fonts like Lucida Grande.



I spent a while trying some settings out on the MacBook: use the 'strong' font smoothing setting, not medium. It does look slightly better.
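
For anyone who wants to try the same two tweaks, here is a minimal sketch of them as shell-outs to the defaults tool. The preference key names (an AppleAntiAliasingThreshold key for Safari, and the global AppleFontSmoothing level) and the value meanings are my recollection of the tips that circulated for these OS X releases, so treat them as assumptions to verify rather than documented settings.

```python
import subprocess

# Minimal sketch of the tweaks described above (macOS only). The preference
# key names and value meanings below are recollections/assumptions, not
# documented API.

def write_default(domain: str, key: str, value: int) -> None:
    """Run: defaults write <domain> <key> -int <value>"""
    subprocess.run(["defaults", "write", domain, key, "-int", str(value)],
                   check=True)

# Assumed key: stop Safari anti-aliasing text at 24 points and under.
write_default("com.apple.Safari", "AppleAntiAliasingThreshold", 24)

# Assumed values: 1 = light, 2 = medium, 3 = strong smoothing, system-wide.
write_default("-g", "AppleFontSmoothing", 3)
```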



And for web browsing with AA turned off, I recommend Verdana 22 point, and for the monospace, Bitstream Vera Sans (free) also at 22 point.



Anyway, there are a number of fonts out there that would look better with AA turned off, if only they had access to a TrueType bytecode interpreter. But Apple isn't using that technology anymore; Windows and Linux both still use it.



I decided to try some old Windows Type 1 PostScript fonts I had lying around, since they came with bitmaps, to see if that would look better on screen, except they don't seem to work on OS X. Who knew Type 1 files came in a separate Mac format?



Seriously. Why do people think blurry fonts are good for the eyes?? At large font sizes it's a good idea, but not at smaller sizes.

Comments

  • Reply 1 of 15
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by Technarch


    Seriously. Why do people think blurry fonts are good for the eyes?



    Because it makes it possible to display higher detail than the resolution actually supports, especially with subpixel rendering.
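
    A toy sketch of that idea, since it fits in a few lines: if you treat each LCD pixel as three addressable stripes, you can sample a glyph at three times the horizontal resolution and fold the samples into the colour channels. This only illustrates the principle, not how Quartz or ClearType actually do it; real renderers also filter across channels to hide colour fringes.

    ```python
    # Toy illustration of subpixel rendering (not Quartz/ClearType): sample
    # the glyph at 3x horizontal resolution, then let each sample darken the
    # R, G or B stripe of the pixel it falls in. The colour-fringe filtering
    # that real implementations apply is omitted here.

    def subpixel_row(coverage):
        """coverage: glyph opacity at 3x horizontal resolution, values 0.0-1.0.
        Returns one (R, G, B) tuple per whole pixel, black text on white."""
        assert len(coverage) % 3 == 0
        pixels = []
        for i in range(0, len(coverage), 3):
            triple = coverage[i:i + 3]
            pixels.append(tuple(round(255 * (1.0 - c)) for c in triple))
        return pixels

    # A stem only one-third of a pixel wide still shows up -- in the green
    # stripe of the middle pixel -- instead of vanishing or fattening out.
    print(subpixel_row([0, 0, 0,  0, 1, 0,  0, 0, 0]))
    # [(255, 255, 255), (255, 0, 255), (255, 255, 255)]
    ```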
  • Reply 2 of 15
    placebo Posts: 5,767 member
    Text in Windows looks a ton more crisp.
  • Reply 3 of 15
    I'm not entirely convinced... people rave about ClearType on XP, but I always turned it off and used nothing at all.



    The other thing is, when you have a working bytecode interpreter, letters are spaced properly. When it's turned off, you get letters touching a /, sitting too close to each other, and so on, as well as some fonts that have no hinting at all and end up with these horrible black regions without the interpreter to 'shape' them, even when no AA is performed. (A rough sketch of what that pixel-grid 'shaping' does is at the end of this post.)



    The thing is, though: in System Preferences there is an option to turn off AA below a certain size, and the maximum supported size is 12 point. So I feel that if that is being supported, there should also be a feature to enable bytecode interpretation (a technology Apple used for quite some time in System 7 through 9), which would give better-shaped letters at 12 point and under.



    I realize no one here can do anything about it. I'm just saying, is all.
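
    Here is the sketch promised above. It is not the TrueType bytecode interpreter itself (real hinting runs instruction programs embedded in the font); it only shows why snapping stems to the pixel grid keeps un-anti-aliased letters evenly weighted. All numbers are made up for illustration.

    ```python
    # Toy grid-fitting demo: three stems that are identical in the outline
    # come out 2, 1 and 2 pixels wide when rasterized naively with AA off,
    # but all come out the same width once their edges are snapped to the
    # pixel grid. An illustration of the idea, not the TrueType instructions.

    def rasterize(stems, width=12):
        """stems: (left_edge, stem_width) in fractional pixels.
        A pixel is 'on' if its centre falls inside a stem."""
        row = ["." for _ in range(width)]
        for left, w in stems:
            for x in range(width):
                if left <= x + 0.5 < left + w:
                    row[x] = "#"
        return "".join(row)

    def grid_fit(stems):
        """Snap stem edges to whole pixels and keep at least one pixel of
        width, roughly what hinting arranges for at small sizes."""
        return [(round(left), max(1, round(w))) for left, w in stems]

    design = [(1.4, 1.2), (4.7, 1.2), (8.4, 1.2)]   # three identical stems
    print(rasterize(design))            # .##..#..##..  -> uneven weights
    print(rasterize(grid_fit(design)))  # .#...#..#...  -> consistent stems
    ```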
  • Reply 4 of 15
    alm Posts: 111 member
    I use anti-aliasing whenever it's available. ClearType and subpixel rendering are among the few things Microsoft did really well. I personally just can't stand pixelated fonts, but I know some people who like them or just can't tell the difference. When I switched to the Mac (after 15 years on PCs) I was disappointed by how Mac OS renders fonts; they looked weird. But after a year of living on the Mac they all look fine. I still see a difference between Windows fonts and OS X fonts, but it doesn't bother me anymore. So I guess it's just a matter of personal preference.



    And anyway, I think Windows uses better subpixel font-rendering technology than OS X, especially for small fonts.



    But again, it's all in the mind.
  • Reply 5 of 15
    It's all so subjective that it's not worth discussing...



    But here are some of the variables involved:



    Quality of the LCD or CRT display

    Visual acuity of the user

    Tolerance level of the user

    "Font smoothing style" level (in OS X)
  • Reply 6 of 15
    mr. me Posts: 3,221 member
    Quote:
    Originally Posted by ALM


    ...



    And anyway, I think Windows uses better subpixel font-rendering technology than OS X, especially for small fonts.



    But again, it's all in the mind.



    FWIW, subpixel rendering was invented and patented by Apple Computer and implemented first on the Apple ][. Microsoft released the technology under the name ClearType after Apple's patent expired.



    As for the superiority of Windows text rendering, I have yet to see it. Most monitors connected to Windows computers that I have seen are too blurry to render fine details. The documents that I receive from my Windows-using colleagues reinforce my observations. Typically, spaces must be added between a large percentage of words and extraneous spaces must be deleted before documents can go forward.
  • Reply 7 of 15
    Quote:
    Originally Posted by kim kap sol


    It's all so subjective that it's not worth discussing...



    But here are some of the variables involved:



    Quality of the LCD or CRT display

    Visual acuity of the user

    Tolerance level of the user

    "Font smoothing style" level (in OS X)



    And personal preference.



    I can't stand working on XP mainly because none of the fonts are antialiased.
  • Reply 8 of 15
    Quote:
    Originally Posted by Technarch


    Seriously. Why do people think blurry fonts are good for the eyes?? At large font sizes it's a good idea, but not at smaller sizes.



    Perceived legibility and true legibility are different things. At about 12 pt and higher for sans-serif fonts, and a little higher, around 14 pt, for more complex serif fonts, anti-aliasing increases legibility. (This is for hinted fonts. For unhinted fonts, AA almost always increases legibility at any size.)



    Since most people read sans-serif fonts at 12 pt, it's a win. You can turn off anti-aliasing for fonts up to size 12, meaning that in any case where the anti-aliasing slows you down, you can disable it.



    For fonts from about 12-14 points and up, anti-aliasing always increases legibility; aliasing errors in non-anti-aliased text become distractions.







    Quote:
    Originally Posted by Technarch


    And for web browsing with AA turned off, I recommend Verdana 22 point, and for the monospace, Bitstream Vera Sans (free) also at 22 point.



    I hope you're talking about one of those super high resolution Windows screens.



    You know that reading jumbo-humungo-sized type all the time greatly reduces legibility, right?



    And plus, at 22 points, not having AA is just stupid.
  • Reply 9 of 15
    placebo Posts: 5,767 member
    Well, I just turned on Windows ClearType, and I'm starting to prefer it to un-anti-aliased XP. Looks great.
  • Reply 10 of 15
    alm Posts: 111 member
    Quote:
    Originally Posted by Mr. Me


    FWIW, subpixel rendering was invented and patented by Apple Computer and implemented first on the Apple ][. Microsoft released the technology under the name ClearType after Apple's patent expired.



    As for the superiority of Windows text rendering, I have yet to see it. Most monitors connected to Windows computers that I have seen are too blurry to render fine details. The documents that I receive from my Windows-using colleagues reinforce my observations.





    I don't know much about patents, but aren't they granted for like 50 years or so?





    As for blurry monitors: I compared them on the same Apple Cinema 23" monitor. But I'll say it again: the whole thing is very subjective.







    Quote:
    Originally Posted by Mr. Me


    Typically, spaces must be added between a large percentage of words and extraneous spaces must be deleted before documents can go forward.



    The font engine has very little to do with it. Windows GDI is pretty old-fashioned; it's all based on integers, so there's no fine detail, and the rounding errors are very noticeable at small font sizes. The application must take care of kerning and all the other font features. Adobe, for example, uses its own font engine (which, by the way, does its own subpixel rendering too). If you use Notepad or Write you'll get bad kerning (if any at all); Word is supposed to do this job better. I don't know how the OS X graphics engine is built, so I can't tell. As for extra spaces between words: that's the user's fault, not the application's, and definitely not the OS's.
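
    To make the integer-metrics point concrete, here is a toy example (my own, not GDI internals): forcing every glyph advance to a whole number of pixels lets the error pile up across a line, while keeping fractional positions and rounding only at draw time keeps the line close to its ideal length.

    ```python
    # Toy illustration of integer vs fractional text metrics -- not actual
    # GDI code. One glyph's ideal advance is 6.4 px; lay out ten of them.

    advance = 6.4
    n = 10

    ideal = advance * n                   # 64.0 px, the "true" line length
    integer_metrics = round(advance) * n  # 6 px per glyph -> 60 px, 4 px short
    fractional = round(advance * n)       # accumulate first, round once -> 64 px

    print(ideal, integer_metrics, fractional)
    ```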
  • Reply 11 of 15
    almalm Posts: 111member
    I did a little research. Apple never patented subpixel rendering, but they did use it in the Apple ][ "to enhance screen resolution". I don't know exactly what that means, but the word "font" wasn't used. This technique was also used by IBM, Xerox, and many others, but Microsoft seems to have been the first to patent it.
  • Reply 12 of 15
    Quote:
    Originally Posted by ALM


    I don't know much about patents, but aren't they granted for like 50 years or so?





    As for blurry monitors: I compared them on the same Apple Cinema 23" monitor. But I'll say it again: the whole thing is very subjective.











    The font engine has very little to do with it. Windows GDI is pretty old-fashioned; it's all based on integers, so there's no fine detail, and the rounding errors are very noticeable at small font sizes. The application must take care of kerning and all the other font features. Adobe, for example, uses its own font engine (which, by the way, does its own subpixel rendering too). If you use Notepad or Write you'll get bad kerning (if any at all); Word is supposed to do this job better. I don't know how the OS X graphics engine is built, so I can't tell. As for extra spaces between words: that's the user's fault, not the application's, and definitely not the OS's.



    Prior to about 5 years ago, patents ran for 17 years from the date granted.



    Now they run for 20 years from the date they were applied for.
  • Reply 13 of 15
    alm Posts: 111 member
    OK. First of all, Apple didn't use it for fonts. And they didn't actually exploit the fact that a monitor tube uses three different monochromatic dots to render a complete colored pixel: the Apple II's horizontal resolution was 280, and even those subpixels were much bigger than the CRT dots; a single subpixel covered many monochromatic dots on the screen. What they did was let the user address colored subpixels inside a normal white pixel. It's like addressing bits inside a byte. A similar thing was used in the Sinclair ZX computers: they had 8x8 "full-colored" blocks for which you could set any two colours and then draw inside the block using only those two colors. The idea is to save memory while allowing higher resolution (some rough numbers are sketched at the end of this post). The subpixels on the Apple II were huge, and you could clearly see that they were actually colored.



    Microsoft, on the other hand, actually used the fact that LCD pixels are made of three dots and that the program "knows" where each software pixel will land on the actual screen. They deliberately used those monochrome dots to increase resolution, and they relied on the fact that the human eye won't notice a single colored dot. ClearType is designed for LCDs; it might also work well on a Trinitron CRT, because that has a similar pattern of RGB stripes. I've never seen ClearType on a traditional CRT, but I guess it would indeed look blurry there, because the RGB triads are arranged differently and there's no way to predict where a computer pixel will land on the screen.



    The techniques sound the same, but they're not. They share the word "subpixel", but the approaches and goals are totally different.
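
    As promised above, a quick back-of-the-envelope check of the memory-saving argument, using the ZX Spectrum-style layout described earlier (one bit per pixel plus one attribute byte per 8x8 block) against a naive one-byte-per-pixel screen. The figures are just illustrative arithmetic.

    ```python
    # Memory cost of a ZX-style attribute scheme vs per-pixel colour, for a
    # 256x192 screen. Purely illustrative arithmetic for the point above.

    w, h = 256, 192
    bitmap_bytes = w * h // 8         # 1 bit per pixel             -> 6144 bytes
    attr_bytes = (w // 8) * (h // 8)  # 1 colour byte per 8x8 block ->  768 bytes
    per_pixel_bytes = w * h           # 1 byte per pixel            -> 49152 bytes

    print(bitmap_bytes + attr_bytes, "vs", per_pixel_bytes)   # 6912 vs 49152
    ```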
  • Reply 14 of 15
    mr. me Posts: 3,221 member
    Quote:
    Originally Posted by ALM


    ....



    The font engine has very little to do with it. ...



    Who said anything about font engines? My previous post was in the context of monitor resolution, not font engines. The poor quality of their monitors leads them to believe that existing spaces do not exist, or that there are spaces where there are in fact none.
  • Reply 15 of 15
    alm Posts: 111 member
    Quote:
    Originally Posted by Mr. Me


    As for the superiority of Windows text rendering, I have yet to see it. Most monitors connected to Windows computers that I have seen are too blurry to render fine details.



    Ah, sorry. You said you haven't seen Windows on good monitors. OK, I have, and I say Windows fonts look better, especially at small sizes.