Originally Posted by Chucker
You do realize that the OS X in the Apple TV has much, much less to do than the Mac OS X in your Mac mini? No. That would be too easy to consider. I'm sure you receive e-mails on your Apple TV all the time. Or compile ports in DarwinPorts. Yep. Gotta keep the machine busy. While watching 720p video.
The H.264 component on the Apple TV is indeed different. But your conclusions are still quite a stretch.
The AppleTV is running OS X build 8N5107, so it actually bears a striking resemblance to the OS on my MacBook (10.4.8 Intel was 8L2127, 10.4.9 Intel is 8P2137). Sure, there are lots of extra bits and pieces installed in my OS, but extra frameworks and the like take up no CPU resources just by being there; you have to actually be using them. I should have said in my post that QuickTime and Finder were the only applications running when I did my test. There was nothing sucking up CPU cycles in the manner that you suggest.
Quit all apps apart from Activity Monitor, QuickTime and Finder.
On my MacBook, Activity Monitor and pmTool (a helper process that Activity Monitor uses) together take up <3% CPU, and various other components such as the WindowServer and the kernel take up a total of <1.5% CPU between them. That means that with Activity Monitor and pmTool not running, at least 98.5% of the CPU cycles are available.
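To spell the arithmetic out, here is a tiny sketch of the headroom calculation; the percentages are just the rough Activity Monitor figures quoted above, not precise measurements:

# Rough idle-headroom arithmetic, using the approximate figures quoted above.
monitor_and_pmtool = 3.0   # % CPU: Activity Monitor + pmTool
background = 1.5           # % CPU: WindowServer, kernel and other idle background load

headroom_with_monitor = 100.0 - background - monitor_and_pmtool   # while watching Activity Monitor
headroom_without = 100.0 - background                             # during the actual playback test
print("With Activity Monitor open: >= %.1f%% free" % headroom_with_monitor)   # 95.5%
print("With Activity Monitor quit: >= %.1f%% free" % headroom_without)        # 98.5%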
Now, open up a 1280 x 720/24p 5 Mbit/s H.264 file in QuickTime and play it back. On my MacBook (with one core disabled), the WindowServer, kernel and QuickTime CPU usages all increase. The usage varies quite a bit; the peak seems to be around 4% for the WindowServer and 3% for the kernel. So, without Activity Monitor running, there would still be at least 93% of the processor available to QuickTime. It is without Activity Monitor running that I performed the test that demonstrated a heavy rate of dropped frames. And that is on a Core Duo core with better SSE1/SSE2 than the Pentium M in the AppleTV, plus SSE3 (which the Pentium M doesn't have at all), running at almost twice the clock speed.
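For a rough sense of why that rules out a pure-software decode on the AppleTV, here is a back-of-envelope scaling sketch. The clock speeds and the SIMD penalty factor are my own illustrative assumptions, not measured values:

# Back-of-envelope: scale the MacBook result down to the AppleTV's CPU.
# All inputs are assumptions for illustration, not measurements.
macbook_clock_ghz = 2.0     # single Core Duo core used in the test (assumed)
appletv_clock_ghz = 1.0     # AppleTV's Pentium M-class CPU (assumed)
simd_penalty = 1.2          # guess: weaker SSE2 and no SSE3 on the AppleTV CPU

# The MacBook core was already dropping frames heavily, so a software decode
# needs more than one full core's worth of that CPU.
macbook_cores_needed = 1.0

appletv_cpu_needed = macbook_cores_needed * (macbook_clock_ghz / appletv_clock_ghz) * simd_penalty
print("AppleTV CPU needed for pure-software decode: > %.0f%%" % (appletv_cpu_needed * 100))
# -> well over 200% of a CPU that only has 100% to give

Whatever exact numbers you plug in, the shortfall comes out as a multiple of the CPU's total capacity, not a few percent, which is the point of the comparison.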
How exactly do you propose that the OS on the AppleTV is structured? Even if you drove the WindowServer and kernel CPU usage down to zero, there still wouldn't be anywhere near enough CPU power for QuickTime to decode a 1280 x 720/24p 5 Mbit/s H.264 stream in real time. The AppleTV must be using the GeForce's hardware-decode features.
If you think the AppleTV is doing it all on the CPU, why do you think Apple put a GeForce in there? Wouldn't it have been much cheaper to use a GMA 950? And if Apple, in the process of developing the AppleTV, had worked out how to reduce the CPU usage of their QuickTime H.264 decoder by a factor of something like 4, why didn't they deliver that improved codec with the release of QuickTime 7.1.5?