No, upscaling matches the output resolution to that of the display.
I don't see where you are contradicting what I said. I explained what is done, you explained why someone would want it done. There are times that the display has a better scaler than the source, so it's not universally true that the source's upscaler is better than the display's. I imagine that the nVidia chip is probably going to be better than what is in the TV more often than not.
Sorry chap, but I'm not contradicting you. You've talked about algorithms, which implies it's all down to software. The issue is where the upscaling is done. If it's done on the Apple TV and the Apple TV is outputting 720p, that's what we want. The TV, however, doesn't 'blow up' the image per se; it scales each pixel to occupy the space of multiple pixels in hardware on the panel in question (like changing your resolution).
I think this is just a disagreement or a misunderstanding about what the terms mean. I don't really differentiate between scaling and "blowing up"; at least, I wasn't aware of any such distinction. Maybe you interpret "blow up" as simple pixel multiplication, which certainly is a bad way to scale. I also generally don't care whether the scaling is done in software or hardware; those algorithms can be performed in either, and what matters more is the results.
Yep, fair enough. But that's my point: if the device in question doesn't upscale, then we're talking 'pixel multiplication' by the panel. We certainly don't want that.
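To make the distinction concrete, here's a rough numpy sketch of my own (purely illustrative, not what the Apple TV or any particular panel actually runs) contrasting 'pixel multiplication' with a simple interpolating upscaler. The frame sizes are just examples.

import numpy as np

def pixel_multiply(img, factor):
    # "Pixel multiplication": each source pixel is simply repeated to fill a
    # factor x factor block -- the blocky result you get when nothing upstream
    # has done a proper upscale.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def bilinear_upscale(img, out_h, out_w):
    # A basic bilinear upscale: each output pixel is a weighted average of the
    # four nearest source pixels, which avoids the blocky look above.
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# A 480p-ish grayscale frame scaled toward 720p (illustrative sizes only)
frame = np.random.rand(480, 854)
blocky = pixel_multiply(frame, 2)            # 960 x 1708, blocky
smooth = bilinear_upscale(frame, 720, 1280)  # 720p, interpolated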
First, give up on the Netgear. The reviews are out...it's crap.
I'd be impressed if iTunes offered 16:9 video at 854 x 480 to start. As good as the DVD rips I've done at 640 x 360 look, 720p would be overkill for me.
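For anyone wondering where 854 x 480 and 640 x 360 come from, they're just the 16:9 widths for those line counts (853.3 gets bumped up to the even 854). A quick check:

# Just the arithmetic behind those resolutions: 16:9 width for a given line count.
for lines in (360, 480, 720):
    print(lines, round(lines * 16 / 9))  # -> 640, 853 (rounded up to 854), 1280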
3x my bandwidth is about 9 Mbps, and my current bandwidth makes me wait about an hour and a half to two hours for an iTunes-quality movie. So if Apple were to offer HD downloads, say, tomorrow for example, it would take me about 6 hours for the download, which would be slowed down further if I was doing anything else online. Then there are the space requirements...
Sebastian
No dude -- he's talking about Apple's bandwidth & storage requirements. A movie at ghettovision specs runs about 800 MB; a movie at 720p will run 1.5+ GB (starting at your 75-minute movie). When you consider hundreds or thousands of movies hosted by ITS and the dedicated bandwidth it takes to host each download session... well, there you go.
But yeah, sounds like you need FiOS.
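For what it's worth, the wait times being thrown around scale straightforwardly from the figures in this thread. A rough sketch (my own arithmetic, using roughly 1.75 hours for an ~800 MB movie as the baseline):

# Back-of-the-envelope download times, scaled from the numbers above.
def hours(size_gb, speed_factor=1.0, baseline_gb=0.8, baseline_hours=1.75):
    # Time scales linearly with file size and inversely with connection speed.
    return baseline_hours * (size_gb / baseline_gb) / speed_factor

print(f"{hours(1.5):.1f} h")       # ~3.3 h for a 1.5 GB 720p movie on today's line
print(f"{hours(1.5, 3.0):.1f} h")  # ~1.1 h at 3x the bandwidth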