Originally Posted by Wolfman
AppleTV provides nominal support for 1080i. Since the output resolution is offered, it is supported and the statement is correct.
AppleTV doesn't have a display where you could apply the phrase "native resolution", any content that doesn't match the selected output resolution will be scaled.
There are three issues here:
1) the maximum quality video that the AppleTV can play (720p25)
2a) the maximum output resolution sent to the TV (1080i60)
2b) the input the TVs can accept (usually up to 1080i60 on plasma)
3) the native resolution of the TV screen (usually 480p, 720p, or 1080p)
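To put rough numbers on those resolutions, here's a quick back-of-the-envelope sketch in Python (just illustrative arithmetic; I'm assuming the NTSC-style 720x480 raster for "480p"):

```python
# Frame sizes (width x height) for the common formats mentioned above.
formats = {
    "480p": (720, 480),          # assuming NTSC-style 480p
    "720p": (1280, 720),
    "1080i/1080p": (1920, 1080),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels per full frame")
```

A full 1080 frame has more than twice the pixels of a 720 frame, which is why point 1 (what the AppleTV can decode) and point 2a (what it can output) are such different questions.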
On the AppleTV, people confuse 1 & 2a. The AppleTV cannot play a 1080 movie; it simply isn't capable of it.
Likewise, with TVs people confuse 2b & 3. Plasmas were advertised as 1080i even when they were only 480p panels. Fortunately TV makers have since been forced to make the true resolution clear. The giveaway phrase is "1080i" on a plasma: plasmas can NOT be natively "interlaced", so if it says 1080i, that's just the input it accepts.
Originally Posted by zygoat
Um, this is exactly backwards. 1080i59.94 provides "smoother image quality in fast-moving sports programming", while 720p29.97 (etc.), being half the field rate (or less) but around twice the vertical resolution per frame, is better suited to provide a "sharper picture in low-motion still shots".
No, he was right. 1080i60 replaces half the lines every 1/60th of a second, which works great on a regular old cathode ray tube. But a plasma screen can't light up just half its lines, so when half the lines are replaced it has to keep showing the other half, which come from a picture taken 1/60th of a second earlier. If the picture hasn't moved much, those older lines fit almost perfectly. But in fast-motion scenes they don't, and you get a weird series of horizontal tears where the picture is moving (called the 'combing' effect).
So 720p30 refreshes less often than 1080i60, but each picture is very clear (clearer than the higher-resolution 1080 with its combing effects). Of course, when it's broadcast at 720p60 the frame rate is matched.
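The combing effect is easy to simulate: naively "weave" two fields captured 1/60th of a second apart while something is moving, and alternating lines disagree. Here's a toy Python sketch with tiny made-up 8-pixel lines (not real video, just the idea):

```python
def weave(even_field, odd_field):
    """Naive deinterlace: interleave the two fields into one frame.
    even_field supplies lines 0, 2, ...; odd_field supplies lines 1, 3, ..."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

def bar(x, width=8):
    """A 2-pixel-wide white bar starting at column x."""
    return [1 if x <= i < x + 2 else 0 for i in range(width)]

# Two fields captured 1/60 s apart; between them the bar moved 3 pixels.
even = [bar(1) for _ in range(2)]   # lines 0, 2: bar at x=1
odd  = [bar(4) for _ in range(2)]   # lines 1, 3: bar at x=4 (it moved)

for line in weave(even, odd):
    print(line)
# Alternating lines show the bar in two different positions: the "comb".
```

On a still scene the two fields would agree and the woven frame would look perfect; it's only motion between fields that produces the comb teeth.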
Oh, two other sources of confusion:
1) When 1080i refreshes HALF the lines 60 times a second, that's usually referred to as 1080i60. But every now and then I see an article that calls it 1080i30, counting whole frames rather than fields (the alternating half-frames).
2) When it comes to movies, a 24-frame film is converted into 1080i60, but it's done in a way that lets TVs easily convert it back to 1080p24. So for movies the interlaced-vs-progressive differences disappear, as do the 60 frames a second. It's 720p24 vs 1080p24, and it's easy to say which is better.
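That 24-frame-to-60-field conversion is the classic 2:3 ("telecine") cadence, and undoing it is how a TV recovers 24p. A simplified Python sketch, with frames as letters and ignoring the even/odd field split (real pulldown interleaves half-frames, so treat this as the cadence only):

```python
def pulldown_2_3(film_frames):
    """Spread 24 fps film across 60 fields/s using the 2:3 cadence:
    frames alternately contribute 2 fields, then 3 (one field repeated).
    Note 60 fields/s is the same signal sometimes labelled '30 frames/s'."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeats)
    return fields

# 4 film frames -> 10 fields, so 24 frames -> 60 fields per second.
fields = pulldown_2_3(["A", "B", "C", "D"])
print(fields)       # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields))  # 10

# A TV doing "inverse telecine" just drops the repeats to get 24p back:
recovered = []
for f in fields:
    if not recovered or recovered[-1] != f:
        recovered.append(f)
print(recovered)    # ['A', 'B', 'C', 'D']
```

Because every field came from a whole film frame, nothing is lost in the round trip, which is why interlacing stops mattering for film content.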
Originally Posted by pmjoe
Are you claiming that a 1280x720 picture has a higher vertical resolution than 1920x1080?
A lot of TVs simply halve the resolution for fast-moving scenes. Some estimate which sections of the picture are moving fast and halve the resolution only for those parts, while for the still sections they weave in the previous field. There are several ways of converting interlaced video for plasma/LCD screens, and the latest technology for estimating every second line is really impressive.
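That motion-adaptive approach can be sketched roughly like this. This is a toy per-pixel version with made-up brightness values and an arbitrary threshold; real deinterlacers are far more sophisticated:

```python
def motion_adaptive(prev_field_line, cur_above, cur_below, threshold=0.25):
    """Fill in one missing line of the current field.
    If the stored line from the previous field still matches its current
    neighbours (little motion), weave it in at full resolution; otherwise
    interpolate from the lines above and below, halving vertical
    resolution for just that spot."""
    out = []
    for old, above, below in zip(prev_field_line, cur_above, cur_below):
        interpolated = (above + below) / 2
        if abs(old - interpolated) <= threshold:  # still area: keep detail
            out.append(old)
        else:                                     # moving area: interpolate
            out.append(interpolated)
    return out

# Static left half; a bright object just moved away on the right half.
prev_line = [0.5, 0.5, 0.9, 0.9]
above     = [0.5, 0.5, 0.1, 0.1]
below     = [0.5, 0.5, 0.1, 0.1]
print(motion_adaptive(prev_line, above, below))
# -> [0.5, 0.5, 0.1, 0.1]  (weaves the static pixels, interpolates the rest)
```

The still parts of the picture keep full 1080-line detail while the moving parts quietly drop to half vertical resolution, which is exactly the trade described above.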
Originally Posted by pmjoe
The reason why some claim that 720p is better than 1080i for sports is because there are sometimes increased compression artifacts in 1080i due to the higher bandwidth required ... and/or  the better frame rate with 720p60.
No... mainly it's the combing effects. Remember also that 1080i60 broadcasts roughly the same number of pixels per second as 720p60 (though progressive compresses a little better). 720p30 has half as many pixels per second, of course, since it has half the frames; but since interframe (MPEG-style) compression encodes what has changed since the last frame, each frame in 720p30 has changed more than when there are twice as many frames.
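The pixel-rate claim checks out as a rough approximation; here's the arithmetic:

```python
# Raw pixels transmitted per second for each format (before compression).
rates = {
    "1080i60": 1920 * 540 * 60,  # 60 half-height (540-line) fields per second
    "720p60":  1280 * 720 * 60,
    "720p30":  1280 * 720 * 30,
}
for name, rate in rates.items():
    print(f"{name}: {rate / 1e6:.1f} M pixels/s")
```

1080i60 carries about 62 M pixels/s versus about 55 M for 720p60, so "roughly the same" is fair, and 720p30 is half of 720p60, as stated.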
It's not at all simple this stuff.
Oh, another factor to make things more confusing:
3) the latest compression schemes are starting to put LESS effort into the fast moving scenes and MORE into the still sections, which goes against what's been done for years. The theory is that you can't tell how good the quality is in fast moving sections anyway, so make the still sections look STUNNING and people will rate the whole picture as better quality. This is kinda similar to what better 1080p Plasmas already do with 1080i as I described above. Of course, if you pause a fast scene you'll see that the detail isn't there... but the comparisons of paused fast motion scenes miss the point of this compression.
(I have to say I haven't watched this type of compression knowingly. I did once watch a 4Mbps 720p24 film of people walking in a snow storm and it was terrible, it couldn't handle all the detail of the snow... it would have looked far better if they'd dropped the resolution to 320p24 for those scenes!)
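That allocate-by-stillness idea can be caricatured in a few lines. This is purely a toy weighting scheme I made up to show the shape of the trade-off, not any real encoder's rate control:

```python
def allocate_bits(motion_scores, total_bits):
    """Toy rate control: give MORE bits to low-motion frames, on the
    theory that viewers judge overall quality by the still sections.
    motion_scores: one arbitrary 'amount of motion' number per frame."""
    weights = [1.0 / (1.0 + m) for m in motion_scores]
    total_weight = sum(weights)
    return [round(total_bits * w / total_weight) for w in weights]

# Four frames: two nearly still, two fast-moving, sharing a 1000-bit budget.
budget = allocate_bits([0.0, 0.1, 5.0, 8.0], 1000)
print(budget)  # the still frames get the lion's share of the bits
```

Pause on one of the fast frames and you'd see starved detail, which is exactly why paused-frame comparisons miss the point of this style of compression.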