16-bit display and Mac OS X

I am rereading John Siracusa's article on memory usage in OS X at http://arstechnica.com/reviews/01q4/macosx-10.1/macosx-10.1-6.html. Great stuff as usual.

For anybody who's wondering why OS X takes up so much memory, this is THE required reading.



Back on topic: one of the major components of memory usage in OS X is the Window Server, which has to buffer *each* open window at all times, unlike Windows or Mac OS Classic, where applications draw directly to the screen (since no system-wide compositing is performed in those systems). One can watch this memory usage live with the Quartz Debug utility installed with the OS X Dev CD.
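
For a sense of scale, here's a rough back-of-the-envelope sketch in C of what a single window's backing store could cost, assuming a plain uncompressed width x height x bytes-per-pixel buffer (an assumption on my part; the Window Server may pad or compress its buffers):

    #include <stdio.h>

    /* Hypothetical estimate of one window's backing store,
     * assuming an uncompressed width x height x bytes-per-pixel buffer. */
    static unsigned long buffer_kb(unsigned long w, unsigned long h,
                                   unsigned long bits_per_pixel)
    {
        return w * h * (bits_per_pixel / 8) / 1024;
    }

    int main(void)
    {
        /* One 1024x768 window: */
        printf("32-bit: %lu KB\n", buffer_kb(1024, 768, 32)); /* 3072 KB */
        printf("16-bit: %lu KB\n", buffer_kb(1024, 768, 16)); /* 1536 KB */
        return 0;
    }

By that model, halving the depth should roughly halve the buffering cost, which is what made the experiment below interesting.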



After noticing the Encoding column in Quartz Debug, which shows the display depth of each window, I decided to switch down to a 16-bit depth to see whether it would reduce window-buffering memory usage, as one would logically expect. Then an interesting thing happened: while some windows now showed 16-bit in the Encoding column, most actually remained at 32-bit! Unsurprisingly, overall memory usage didn't decrease dramatically.
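
For anyone who'd rather script the depth switch than use the Displays preference pane, here's a minimal C sketch using the Quartz Display Services calls of that era (CGDisplayBestModeForParameters and CGDisplaySwitchToMode, both long since deprecated):

    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>

    /* Build with: gcc depth16.c -framework ApplicationServices
     * Switches the main display to 16 bits per pixel at its
     * current resolution, using the old (now deprecated) API. */
    int main(void)
    {
        CGDirectDisplayID display = kCGDirectMainDisplay;
        boolean_t exact = 0;

        /* Ask for the closest mode at the current resolution
         * but with 16 bits per pixel. */
        CFDictionaryRef mode = CGDisplayBestModeForParameters(
            display, 16,
            CGDisplayPixelsWide(display),
            CGDisplayPixelsHigh(display),
            &exact);

        if (mode == NULL ||
            CGDisplaySwitchToMode(display, mode) != kCGErrorSuccess) {
            fprintf(stderr, "could not switch to 16-bit\n");
            return 1;
        }
        printf("switched to 16-bit (%s match)\n", exact ? "exact" : "closest");
        return 0;
    }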

So it looks like going from a 32-bit to a 16-bit display depth doesn't do much for OS X memory usage. Speed didn't seem much improved either.



On a side note, IE's antialiasing looks much better at 32-bit depth.