anti-aliasing


Comments

  • Reply 21 of 52
    dfiler Posts: 3,420 member
    Quote:

    Originally posted by ichroma ...please enlighten me.



    Did you read my post or were you too busy posting three consecutive messages to this thread?



    Perception of color, and thus of anti-aliasing, differs greatly between people. There are physical differences in the human eye which mean that no single anti-aliasing scheme will be suitable for all users. (Along with other reasons.)
  • Reply 22 of 52
    new Posts: 3,244 member
    Quote:

    Originally posted by ichroma

    the hardware is good. i cannot understand how so many good folks can look at, say, some black text on a white ground - which in os9 appears crisp/well defined/high contrast - and then look at the same text - which in osx appears grey/ill-defined/blurry - and report that the latter is better at displaying fonts! please enlighten me.



    Well, speaking as EDIT: an uptight, nerdy designer, the fact that fonts actually look the way they were designed (and meant) to look is very important. Very few fonts were designed to be bitmap fonts. Most were originally (and still are) designed for paper printing, and their bitmap versions came with DTP (desktop publishing) and the Adobe PostScript system.



    Helvetica, for instance, is quite a good font in print, but its bitmap version does not do it justice. It came as a standard font with the Mac and quickly became one of the most popular fonts in early DTP.

    Windows copied this with the Arial font. That font isn't as good as Helvetica in print, but it is actually far better for bitmap screen display. That, combined with its presence on most Windows PCs, has made it a very popular font in web design, since text on the web has up to now been mostly bitmap.



    In the early days of web design, to have anti-aliased text displayed properly, for instance in a logo, you had to use graphics, like a GIF file. This changed when Flash was introduced, and designers could suddenly display anti-aliased fonts the way they were supposed to look. OSX takes this a step further by introducing anti-aliasing throughout the OS, including text on the web, making it hard to distinguish between normal text and designed graphics.



    I can see how this might take some time to get used to. But I don't agree with you that anti-aliased fonts in OSX are less readable - at least not in every case. You can't honestly mean that italic letters, for instance, are not better when anti-aliased. I have absolutely no problem reading stuff in OSX (and is it just me, or is anti-aliasing even better in Panther?).



    For pure reading, or code, your point is still valid: on screen, some things are easier on the eye than others. Still, research has shown that people in the old days read gothic letters (that is, blackletter or fraktur) just as fast as we read modern letters today, so I wouldn't be too worried.



    With the way screen technology is going, you will eventually not have much choice anyway.
  • Reply 23 of 52
    dfiler Posts: 3,420 member
    Quote:

    Originally posted by New

    Well, speaking as a designer by education, the fact that fonts actually look the way they were designed (and meant) to look is very important. Most were originally (and still are) designed for paper printing, and their bitmap versions came with DTP (desktop publishing) and the Adobe PostScript system.



    Take note, many people may discount the rest of your post after reading an opening line like this. Knee-jerk reactions are common when designers proclaim the 'right' look as equivalent to a designer's personal preference.



    Personally, when dealing with the web, I could give a rat's ass what the designer intended. I base my preferences on what works and looks best on my system, not what happens to please some other random individual.



    Also, bitmap fonts were designed to be displayed as bitmaps. They are in no way secondary or subservient to vector-based equivalents destined for print. Computer-based fontography started with bitmaps, and it took a few decades for PostScript/printable versions to emerge.
  • Reply 24 of 52
    new Posts: 3,244 member
    This does not have anything to do with my personal preferences.



    I'm not in any way dismissing bitmap fonts. I mostly work with the web myself.

    That being said, the art of designing letters is ancient, and printing is several hundred years old. Bitmap fonts, like other technical fonts, are products of technical needs (and limitations). Aside from looking cool and retro and all, they should be seen in this context.



    Anyway, computer fontography probably started with punched holes. I've changed the opening line on your recommendation.



    EDIT: You are quite right that bitmap fonts, as a concept, pre-date digital vectorized fonts. But I don't think we have many of those original ones left. There are some new bitmap fonts, however, like MINI 9, which are very nice.

    But they are very limited in use.
  • Reply 25 of 52
    dfiler Posts: 3,420 member
    Quote:

    Originally posted by New

    The art of designing letters is ancient, and printing is several hundred years old. Bitmap fonts, like other technical fonts, are products of technical needs (and limitations). Aside from looking cool and retro and all, they should be seen in this context.



    Interesting note along these lines: serifs were also a product of technical needs. They arose to facilitate the chiseling of type into stone, wood, or brick. The significance of this is that serif fonts could also be considered retro. Just as serifs improve legibility by better differentiating letterforms, jagged bitmapped fonts can have the same effect when compared to their anti-aliased counterparts.



    Without vector-based monitors in common use, computer-based fonts will always be bitmapped in their final representation to the viewer. The conundrum is that some people prefer the crisper, jagged version while others prefer the smoothed, and perhaps blurred, version. Neither has yet been empirically shown to be inherently more legible.
  • Reply 26 of 52
    kickaha Posts: 8,760 member
    I have to admit I prefer the anti-aliased... heck, go into System Preferences -> General and bump the 'turn off text smoothing' setting up to 12pt. Then open some 12pt text and play with the toggle.



    For almost any font out there, to my eye, the anti-aliasing looks better. There are a few (crafted for bitmaps) that look better without it, but almost any font looks better to me with it on.



    If you *really* want to see the difference, set the threshold to 8 or 9pt, then keep reducing the font size in your web browser until it reaches that threshold. Now play with the toggle. At small sizes, the unsmoothed version is simply unreadable - there aren't enough pixels to distinguish the shapes easily. With anti-aliasing? No problem reading it.
  • Reply 27 of 52
    new Posts: 3,244 member
    Quote:

    Originally posted by dfiler

    Interesting note along these lines: Serifs were also a product of technical needs. They arose to facilitate the chiseling of type into stone, wood, or brick.



    True. All fonts are to a certain degree.
  • Reply 28 of 52
    Quote:

    Originally posted by ichroma

    anyone know how to disable anti-aliasing in OSX?



    Enable the root account on OSX



    Open Terminal



    type "su" enter the password



    copy Charcoal from MacOS 9 to /System/Library/Fonts



    type "defaults write.GlobalPreferences AppleAntiAliasingThreshold 72"



    type "exit"



    type "defaults write.GlobalPreferences AppleAntiAliasingThreshold 72"



    close Terminal



    reboot



    Use TinkerTool to set Charcoal as the system font



    Use SuperCal to calibrate your display



    A lot of the anti-aliasing is gone.



    I use a high-quality NEC LCD with MacOS 9 and it is very nice.

    But it isn't half as nice when using OSX.



    I decided to switch back to MacOS 9 and not to buy a Macintosh again until there is an option to disable anti-aliasing in OSX.



    I have now bought a Windows machine and I am starting to like it more and more for its crisp and clear display. Windows XP also has an option to turn on anti-aliasing, but it is disabled by default.



    I have never seen a windows user who enabled it.



    Even the latest Red Hat ships with anti-aliasing, but at least that can be switched off.



    I don't understand why people like anti-aliasing; it makes a crisp and clear LCD look like a fuzzy CRT.



    I really hate anti-aliasing and wish it was never invented.
  • Reply 29 of 52
    dfiler Posts: 3,420 member
    Quote:

    Originally posted by dubbel69

    I don't understand why people like anti-aliasing; it makes a crisp and clear LCD look like a fuzzy CRT.



    While your preference is reasonable, it seems you're ignoring much of what has been discussed here. For many people, anti-aliased text is perceived as being more crisp. While ingrained or inexplicable preferences may play a role, these alone can't account for such disparate opinions. Measurable differences in the human eye certainly play a significant role.



    Go and run some comparisons past a red-green color-blind friend and you'll quickly discover how significant color perception is with regard to anti-aliasing.



    Sorry... I guess I'm tired of this ridiculous argument about whether anti-aliasing is 'better'. It's as bad as a bunch of little kids arguing about how everyone should concur with their choice of favorite color.
  • Reply 30 of 52
    What the heck! The first thing I did in XP was to turn on AA. It looks soooo much better. I would imagine that Windows will stop giving you the option to turn off AA in the future.
  • Reply 31 of 52
    What exactly does anti-aliasing mean in this context? Technically, I mean. I know it smooths text and lines, so that words and lines are less jaggy, but how does it relate to the usual aliasing effect?



    The aliasing that I'm familiar with is the effect you get when your input contains signals higher than half your sampling frequency, and your output gets garbled.



    That is, if your input signal has a frequency of 11Hz, say, and you are sampling at 20Hz, you will perceive that signal as no different from a 9Hz signal. What's this got to do with text? Please explain.
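
    (As a quick numeric sketch of that example - nothing OS-specific, just NumPy - the samples of an 11Hz cosine taken at 20Hz are literally identical to those of a 9Hz cosine:)

        import numpy as np

        fs = 20.0                    # sampling rate in Hz
        t = np.arange(40) / fs       # 40 samples = 2 seconds of sample times

        # 11Hz is above the Nyquist frequency of fs/2 = 10Hz, so it folds down
        # to 20 - 11 = 9Hz; the two tones are indistinguishable once sampled.
        x11 = np.cos(2 * np.pi * 11 * t)
        x9 = np.cos(2 * np.pi * 9 * t)

        print(np.allclose(x11, x9))  # True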
  • Reply 32 of 52
    It absolutely blows my mind that anyone would want anti-aliasing turned off! The fact that Windows has anti-aliasing turned off as the default setting drives me nuts! Anti-aliasing makes type easier to read since it makes it more like fonts on the printed page. Those of you who don't like it have probably become accustomed to the overly sharp, poorly rendered type we had to deal with in the GUI dark ages. Looking at a web page in Internet Explorer on Windows and then in Safari is like a breath of fresh air. Aliased fonts in IE are the biggest reason type-savvy web designers prefer Flash to HTML, despite its numerous drawbacks.
  • Reply 33 of 52
    Quote:

    Originally posted by drumsticks

    What exactly does anti-aliasing mean in this context? Technically, I mean. I know it smooths text and lines, so that words and lines are less jaggy, but how does it relate to the usual aliasing effect?



    The aliasing that I'm familiar with is the effect you get when your input contains signals higher than half your sampling frequency, and your output gets garbled.



    That is, if your input signal has a frequency of 11Hz, say, and you are sampling at 20Hz, you will perceive that signal as no different from a 9Hz signal. What's this got to do with text? Please explain.




    It's the same phenomenon, except it's in 2-d. You're trying to sample a higher frequency signal, the character shape, at the lower frequency of the screen's pixels. The jaggies are a manifestation of the aliased signal.



    The standard way of dealing with the problem is supersampling. Sample the characters at a higher resolution, then apply a low-pass filter to the result and downsample to the screen resolution.
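
    (A minimal sketch of that pipeline, using a synthetic disc in place of a real glyph and a plain box average as the low-pass/downsample stage - real renderers use font outlines and better filters, of course:)

        import numpy as np

        def render_disc(size):
            """Point-sample a filled disc on a size x size grid (1 = ink, 0 = paper)."""
            y, x = np.mgrid[0:size, 0:size]
            c = (size - 1) / 2.0
            return ((x - c) ** 2 + (y - c) ** 2 <= (0.4 * size) ** 2).astype(float)

        factor = 4                              # supersampling factor
        hi = render_disc(32 * factor)           # "character" sampled at 4x resolution

        # Average each 4x4 block: a crude low-pass filter plus downsampling to 32x32.
        lo = hi.reshape(32, factor, 32, factor).mean(axis=(1, 3))

        # Edge pixels now hold fractional grey coverage instead of hard 0/1 jaggies.
        print(sorted(set(np.round(lo[16], 3))))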
  • Reply 34 of 52
    So the samples are the individual pixels (spatial sampling as opposed to "conventional" temporal sampling - I'm aware of this). Let's consider a grayscale display and only one horizontal line across the screen, so that we have the one-dimensional case.



    Say we type a series of ones, as in 1111111. This means that, in that one line, we have alternating black and white pixels (font dependent, of course, but let's assume so). Consider the white pixel the peak of a sine wave and the black pixel the trough; then we have two samples per period, which is exactly the Nyquist frequency. Any other grayscale image would have larger periods and hence lower frequency content.



    So the highest frequency is still being sampled at a sufficient rate. Where is the aliasing? I'm not arguing about the term here, just trying to see where it originates...
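
    (One thing worth checking numerically - a toy sketch, treating that alternating row as a continuous cosine - is that two samples per period is the borderline case: whether the pattern survives depends entirely on where the pixel centres happen to fall on the wave.)

        import numpy as np

        n = np.arange(8)                          # eight pixel centres
        on_peaks = np.cos(np.pi * n)              # samples land on peaks/troughs
        on_zeros = np.cos(np.pi * n + np.pi / 2)  # same wave, shifted half a pixel

        print(on_peaks.round(3))                  # [ 1. -1.  1. -1. ...] full-contrast stripes
        print(np.abs(on_zeros).max() < 1e-9)      # True: the stripes vanish into flat grey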
  • Reply 36 of 52
    smircle Posts: 1,035 member
    Quote:

    Originally posted by ichroma

    thanks everybody. i've taken on board all of your suggestions and had a play with all the settings that apple makes available to users, none of which makes an iota of difference.



    Don't forget to log out and back in after you have changed the antialiasing method in the System Preferences panel. For me, setting it to light makes the fonts a lot less mushy than the "best" *cough* method.
  • Reply 37 of 52
    Yeah, those links do a better job of explaining the problem. Basically, the boundary of a character or polygon never coincides with the pixel boundaries, so trying to reconstruct the polygon from point samples is going to result in artifacts.



    It gets really bad when you animate the polygons as in 3-d games. You'd see boundaries crawl and pixels pop on and off.



    Anti-aliasing takes the hard, square edge of the boundary and smoothes it to a gentler function by eliminating the higher spatial frequencies.
  • Reply 38 of 52
    Thanks dfiler for the links. However, I still say that the term "anti-aliasing" is not quite right...



    We start off with [in theory] an infinitely high-resolution image or font, then we sample it at 72dpi squares (the pixels). Taking only the center value of each square and painting the rest of the square with that same center color, we get the jaggy edges - aka the aliasing effects.



    The so-called "anti-aliasing" algorithm really just takes the [possibly weighted] average value over the entire square and paints the entire square with that average color. The final sampling is still at 72dpi.



    It's just a different method of getting the final pixel values. The term "anti-aliasing", IMO, is misleading and the cause of my confusion.
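
    (To put numbers on the two methods described above - a toy 1-D sketch, not Quartz's actual rasterizer - here is a hard edge whose boundary falls partway through a pixel:)

        import numpy as np

        # "Infinitely" fine 1-D image: ink covers the leftmost 3.4 pixel widths of 8.
        fine = np.zeros(8 * 100)          # 100 sub-samples per pixel square
        fine[:340] = 1.0
        per_pixel = fine.reshape(8, 100)

        point_sampled = per_pixel[:, 50]          # centre value of each pixel square
        area_averaged = per_pixel.mean(axis=1)    # [weighted] average over the square

        print(point_sampled)  # [1. 1. 1. 0. 0. 0. 0. 0.]   hard jump: the jaggy edge
        print(area_averaged)  # [1. 1. 1. 0.4 0. 0. 0. 0.]  grey pixel where the edge falls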
  • Reply 39 of 52
    Quote:

    Originally posted by dfiler

    Did you read my post or were you too busy posting three consecutive messages to this thread?



    Perception of color, and thus of anti-aliasing, differs greatly between people. There are physical differences in the human eye which mean that no single anti-aliasing scheme will be suitable for all users. (Along with other reasons.)




    of course i did you the courtesy of reading your post - as i have those of all respondents. re: my consecutive posts, i stand admonished. it seems these days that one can't help offending someone's gods.



    thanks to all the posts, i have amassed a great deal of technical information on anti-aliasing and why - in certain instances - it is thought desirable, essential even.



    however, i remain unpersuaded that text which displays in osx as grey, fuzzy and slightly out of focus actually constitutes an enhanced user experience.



    my original and subsequent posts sought to address the questions raised from a non-technical perspective and were deliberately framed in non-technical language. i realise this makes my contribution easily dismissible as some "purely subjective" issue. i am convinced that this is not the case, but lack the resources to prove it.
  • Reply 40 of 52
    amorph Posts: 7,112 member
    Quote:

    Originally posted by ichroma



    however, i remain unpersuaded that text which displays in osx as grey, fuzzy and slightly out of focus actually constitutes an enhanced user experience.





    At this stage of display technology, it's not an unambiguous improvement - that's why there's a lot of controversy swirling around it. There are tradeoffs to doing things this way, and one of them is that, in fact, anti-aliased fonts can look "out of focus." That's why I said "I got used to it": at this stage of the game, it's not better, it's just different. One of the adjustments I made was to bump up font sizes to avoid the "grey smear" of small anti-aliased fonts in OS X.



    So, given that there are tradeoffs, here are the advantages:
    • Fonts are much closer to true WYSIWYG, because the actual instructions in the font are followed;

    • Vector fonts look and act consistently across all sizes, families, and foundries (and even types). In OS 9, on screen, any font without a bitmap looked like hell and lagged; any font with a bitmap looked like hell if set to a size it didn't have a bitmap for;

    • Display resolution is going up. Bitmap fonts will shrink with pixel sizes, but vector fonts can remain the same size, preserving WYSIWYG at 100 and 150 and 300 ppi. Furthermore, the vector fonts will get better and better looking over time.

    So it's a forward-looking technology, like much of the technology in OS X. That in itself is a good enough reason to go with it. Bitmaps, although they look sharp on screen, are a compromise that will get less and less attractive over time until they finally become an impediment.



    (Of course, they can always be rendered to a logical pixel; the W3C already defines pixels as logical in order to divorce web resolution from any particular screen resolution. In that case, the bitmap would be scaled as if the screen resolution were 96dpi - in the case of the Web - and possibly anti-aliased as well!)