Why oh why is the TrueType bytecode interpreter turned off?
I'm not a fan of anti-aliasing; it tends to make everything blurry as heck. So I used that defaults write line to stop Safari from anti-aliasing text 24 points and under. It does it anyway sometimes for certain 'system' fonts like Lucida Grande.
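For the curious, the line I mean is something like this. I'm writing the key name from memory, so treat it as an approximation and double-check it before relying on it:

```shell
# Tell Safari not to anti-alias text at 24 pt and below.
# (Key name from memory -- AppleAntiAliasingThreshold was the commonly
# cited default; verify it on your own system first.)
defaults write com.apple.Safari AppleAntiAliasingThreshold 24

# Or system-wide, for apps that honour the global domain:
# defaults write -g AppleAntiAliasingThreshold 24
```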
I spent a while trying some settings out on the MacBook: use the 'strong' smoothing setting, not the medium one. It does look slightly better.
And for web browsing with AA turned off, I recommend Verdana at 22 point, and for the monospace font, Bitstream Vera Sans Mono (free), also at 22 point.
Anyways, there are a number of fonts out there that would look better with AA turned off, if only they had access to a TrueType bytecode interpreter. But Apple isn't using that technology anymore, while Windows and Linux both still use it.
I decided to try some old Windows Type 1 PostScript fonts I had lying around (they came with bitmaps) to see if they would look better on screen, except they don't seem to work on OS X. Who knew Type 1 files came in a separate Mac format?
Seriously. Why do people think blurry fonts are good for the eyes? At large font sizes it's a good idea, but not at smaller sizes.
Comments
Seriously. Why do people think blurry fonts are good for the eyes?
Because it makes it possible to display more detail than the resolution actually supports, especially with subpixel rendering.
The other thing is, when you have a working bytecode interpreter, letters are spaced properly apart. But when it's turned off, you get letters touching a '/', letters sitting too close to each other, and so on. And some fonts have no hinting at all, and get horrible black regions without the interpreter to 'shape' them, even when no AA is performed.
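A toy sketch of the shaping problem described above, using a made-up "glyph" of two thin vertical stems. This is only an illustration of grid-fitting; a real bytecode interpreter executes hint programs embedded in the font, not a heuristic like this:

```python
# Toy 1-D rasterizer: why unhinted outlines look bad with AA turned off.
# A "glyph" here is just two vertical stems, each given as (left_edge,
# width) in fractional pixel units. Entirely made-up data.

def rasterize(stems, n_pixels):
    """Turn on any pixel whose center is covered by a stem (no AA)."""
    row = []
    for px in range(n_pixels):
        center = px + 0.5
        row.append(any(left <= center < left + w for left, w in stems))
    return "".join("#" if on else "." for on in row)

def grid_fit(stems):
    """Crude stand-in for hinting: snap each stem edge to a whole pixel
    and keep every stem at least one pixel wide so it cannot vanish."""
    return [(round(left), max(1, round(w))) for left, w in stems]

stems = [(1.4, 0.8), (3.6, 0.8)]        # two thin stems, off the grid
print(rasterize(stems, 6))               # unhinted: ".#...." one stem lost
print(rasterize(grid_fit(stems), 6))     # grid-fitted: ".#..#." both even
```

Without grid-fitting, the second stem misses every pixel center and disappears; with it, both stems come out as clean one-pixel lines.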
The thing is, though: in System Preferences there is an option to 'turn off AA below a certain size', and the maximum supported size is 12 point. So I feel that if that option is supported, there should also be a feature to enable bytecode interpretation (a technology Apple used for quite some time, from System 7 through 9), which would give better-shaped letters at 12 point and under.
I realize no one here can do anything about it. I'm just saying, is all.
And anyways, I think Windows uses better subpixel font rendering technology than OS X, especially for small fonts.
But again - it's all in the mind.
FWIW, subpixel rendering was invented and patented by Apple Computer and implemented first on the Apple ][. Microsoft released the technology under the name ClearType after Apple's patent expired.
As for the superiority of Windows text rendering, I have yet to see it. Most monitors connected to Windows computers that I have seen are too blurry to render fine details. The documents that I receive from my Windows-using colleagues reinforce my observations. Typically, spaces must be added between a large percentage of words, and extraneous spaces must be deleted, before documents can go forward.
It's all so subjective that it's not worth discussing...
But here are some of the variables involved:
Quality of the LCD or CRT display
Visual acuity of the user
Tolerance level of the user
"Font smoothing style" level (in OS X)
And personal preference.
I can't stand working on XP, mainly because none of the fonts are antialiased.
Seriously. Why do people think blurry fonts are good for the eyes? At large font sizes it's a good idea, but not at smaller sizes.
Perceived legibility and true legibility are different things. At about 12 pts and higher for sans-serif fonts, and a little higher, around 14 pts, for more complex serif fonts, anti-aliasing increases legibility. (This is for hinted fonts. For non-hinted fonts, AA almost always increases legibility at any size.)
Since most people read sans-serif fonts at 12 pts, it's a win. You can turn off anti-aliasing for fonts up to size 12, meaning that in any possible case where the anti-aliasing slows you down, you can disable it.
For fonts around 12-14 pts and up, anti-aliasing always increases legibility: aliasing errors in non-anti-aliased text become distractions.
And for web browsing with AA turned off, I recommend Verdana at 22 point, and for the monospace font, Bitstream Vera Sans Mono (free), also at 22 point.
I hope you're talking about one of those super high resolution Windows screens.
You know that reading jumbo-humungo-sized type all the time greatly reduces legibility, right?
And plus, at 22 points, not having AA is just stupid.
FWIW, subpixel rendering was invented and patented by Apple Computer and implemented first on the Apple ][. Microsoft released the technology under the name ClearType after Apple's patent expired.
As for the superiority of Windows text rendering, I have yet to see it. Most monitors connected to Windows computers that I have seen are too blurry to render fine details. The documents that I receive from my Windows-using colleagues reinforce my observations.
I don't know much about patents, but aren't they granted for like 50 years or so?
As for blurry monitors - I compared them on the same Apple Cinema 23" monitor. But I'll say it again - the whole thing is very subjective.
Typically, spaces must be added between a large percentage of words, and extraneous spaces must be deleted, before documents can go forward.
The font engine has very little to do with it. Windows GDI is pretty old-fashioned; it's all based on integers, so there's no fine detailing, and all the rounding errors are very noticeable at small font sizes. The application must take care of kerning and all the other font features. Adobe, for example, uses its own font engine (which, btw, uses its own subpixel rendering too). If you use Notepad or Write you'll get bad kerning (if any at all); Word is supposed to do this job better. I don't know how the OS X graphics engine is built, so I can't tell. As for extra spaces between words - that's the user's fault, not the application's, and definitely not the OS's.
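To make the "application must take care of kerning" point concrete, here is a toy sketch: the engine hands back plain advance widths, and the app applies pair-kerning adjustments itself. All widths and kern values below are made up for illustration, not taken from any real font:

```python
# Hypothetical per-glyph advance widths (pixels) and a kern-pair table.
ADVANCES = {"A": 7, "V": 7, "T": 7, "o": 5}
KERN_PAIRS = {("A", "V"): -2, ("T", "o"): -1}

def pen_positions(text, kern=True):
    """Return the x position of each glyph, with or without kerning."""
    x, positions = 0, []
    for i, ch in enumerate(text):
        positions.append(x)
        x += ADVANCES[ch]                          # engine-supplied advance
        if kern and i + 1 < len(text):
            x += KERN_PAIRS.get((ch, text[i + 1]), 0)  # app-applied kerning
    return positions

print(pen_positions("AV", kern=False))   # [0, 7]  Notepad-style: gappy pair
print(pen_positions("AV", kern=True))    # [0, 5]  Word-style: pair pulled in
```

An app that skips the second step (as Notepad does in this picture) gets the loose "AV" on the left; one that applies the table gets the tightened pair.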
Prior to about 5 years ago, patents ran for 17 years from the date granted.
Now they run for 20 years from the date they were applied for.
Microsoft, on the other hand, actually used the fact that LCD pixels are made of three dots, so the program "knows" where each software pixel will be located on the actual screen. They specifically used those monochrome dots to increase resolution, and they used the fact that the human eye won't notice a single coloured dot. ClearType was designed for LCDs, and it might work well on a Trinitron CRT because it has a similar pattern of RGB dots. I never saw ClearType on traditional CRTs, but I guess it would look blurry indeed, because those RGB triads are arranged differently and there's no way to predict where a computer pixel will land on the screen.
Those techniques look the same, but they're not. The common word is "subpixel", but the approaches and goals are totally different.
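The mechanism described above can be sketched in a few lines: sample the glyph at 3x horizontal resolution, then fold each group of three samples into the R, G, B stripes of one LCD pixel. This is a toy model only; real ClearType also filters across neighbouring stripes to limit colour fringing:

```python
def subpixel_render(stems, n_pixels):
    """Render one scanline of a 'glyph' (list of (left, width) stems in
    pixel units) at 3x horizontal resolution, then map each run of three
    samples onto the (R, G, B) stripes of one pixel. Black ink on white."""
    samples = []
    for s in range(n_pixels * 3):
        center = (s + 0.5) / 3.0                 # sample center, pixel units
        ink = any(l <= center < l + w for l, w in stems)
        samples.append(0 if ink else 255)
    # one LCD pixel per consecutive triple of samples
    return [tuple(samples[i:i + 3]) for i in range(0, len(samples), 3)]

# A stem only one third of a pixel wide still registers: it darkens just
# the red stripe of pixel 1. That is the extra positional precision.
print(subpixel_render([(1.0, 1 / 3)], 3))
```

With plain whole-pixel rendering that thin stem would either vanish or bloat to a full pixel; here it lands on exactly one of the three stripes.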
The font engine has very little to do with it. ...
Who said anything about font engines? My previous post was in the context of monitor resolution, not font engines. The poor quality of their monitors leads them to believe that existing spaces do not exist, or that there are spaces where there are in fact none.
As for the superiority of Windows text rendering, I have yet to see it. Most monitors connected to Windows computers that I have seen are too blurry to render fine details.
Ah, sorry. You said you hadn't seen Windows on good monitors. OK. I have, and I say Windows fonts look better, especially at small sizes.