Does anyone know at what bit rate the average file is for the video and the audio?
Someone else can check the files they've downloaded for comparison, but I have 5.16 Mbps.
As someone else mentioned above, I think it would be really interesting to do a serious comparison of the various streaming services (with Blu-ray as a baseline). Xbox and others have had 1080 for a bit, but I've always wondered what compromises they were making just to get that label on there.
And I'm not talking about a technical data rate comparison (though that would be an interesting side note), but image to image.
I used to see that for 720p stuff ... but the SD versions stopped downloading a while ago ... now I just get the HD version from TV Show subscriptions (if I manually select an episode to buy it still gives me both versions though)
I was hoping for an HD720p and HD1080p icon (or something) to differentiate the different resolutions
I thought I remembered seeing somewhere (isn't senility a bitch) that Apple's iTunes SD was actually 480p and their HD was 720p. But they never advertised that, and I guess now they have a problem on their hands with calling 1080p HD, which should probably now be called HD+ or something.
As someone else mentioned above, I think it would be really interesting to do a serious comparison of the various streaming services (with Blu-ray as a baseline). Xbox and others have had 1080 for a bit, but I've always wondered what compromises they were making just to get that label on there.
Quite a bit. I remember reading about Xbox Marketplace 1080p being a lower bit rate than Apple's 720p. Plus, I haven't seen any 1080p from YouTube that bested iTS 720p. Granted, what I've seen could have been the odd man out or Google and MS have moved to higher quality 1080p files since I last checked.
so ... I deleted all the prior episodes of Happy Endings Season 2 that I had already downloaded through my subscription and selected them to re-download from iCloud ...
so far they are all re-downloading in 1080p.
Will have to do the same for all my other iTunes subscription TV shows (at least for this season ... but I wonder if prior seasons that were 720p will be available in 1080p versions as well)
The article isn't saying the data is compressed; it's saying the compression efficiency has improved.
But it's very difficult to know how that efficiency was achieved. The article was written as if the relationship between the number of pixels and the size of the file should be linear, and since it isn't, it concludes the difference is due to the new codec.
While I'm sure that's part, and perhaps most, of it, do we know that other compression variables weren't also altered to obtain the smaller file sizes? The per-pixel level of quality could have also been slightly reduced to get a smaller than expected file size. A lot more pixels with a little less quality per-pixel would still result in an improvement, just not as good as when file size wasn't a concern.
I'm sure the new codec helps a lot, but there may have been other, presumably minor, compromises to get reasonable file sizes.
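One way to put numbers on that is to look at bits per pixel rather than file size: if the encoder spent the same bits on every pixel, a 1080p file would be 2.25x the size of a 720p one. A rough sketch follows; the bit rates in it are made-up illustrative figures, not measured iTunes numbers.
Code:
# Rough bits-per-pixel comparison. The bit rates here are illustrative only,
# not measured iTunes values.
def bits_per_pixel(bitrate_kbps, width, height, fps):
    """Average coded bits spent on each pixel of each frame."""
    return bitrate_kbps * 1000 / (width * height * fps)

old_720p = bits_per_pixel(4000, 1280, 720, 23.976)    # hypothetical 720p file at 4 Mbps
new_1080p = bits_per_pixel(5400, 1920, 1080, 23.976)  # hypothetical 1080p file at 5.4 Mbps

print(f"720p:  {old_720p:.3f} bits/pixel")   # ~0.181
print(f"1080p: {new_1080p:.3f} bits/pixel")  # ~0.109
# Fewer bits per pixel can mean a more efficient encode (e.g. a better H.264 profile),
# a lower per-pixel quality target, or some mix of both; file size alone can't say which.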
I find it interesting that they pushed the compression as low as they did. I would have expected something more in the range of 2-3 GB for a TV show episode and 6-7 GB for a movie.
I suppose the level of compression could be a factor in why they didn't change the price. Certainly the studios will love it because it doesn't threaten their precious Blu-ray sales.
The quality of the video depends on how it is shot. I know the video engineer on "Big Bang". He is using Sony F900 cameras to shoot the series. It's shot in 1080i. So watching it at 1080p will mean nothing. It won't look any better at 1080p since it wasn't shot with that increased definition.
They will use newer cameras next season so you may notice an improvement.
Not true, the show is shot progressive. Purely 1080i would have been totally unacceptable for a show like that. Check this list and look up Big Bang:
http://www.google.nl/url?sa=t&source...wFAoi-STH5PbmQ
Camera:
The highly respected HDW-F900 camcorder has now been refined into the next-generation HDW-F900R, offering a variety of further enhanced functionalities. The HDW-F900R camcorder records images in accordance with the CIF (Common Image Format) standard, which specifies a sampling structure of 1920 x 1080 active pixels (horizontal x vertical). Plus, as well as recording at 24P, the HDW-F900R camcorder is switchable to record at 25P, 29.97P progressive scan, and also at 50 or 59.94 Hz interlaced scan.
Please note that sometimes a progressive stream is embedded in an interlaced signal (PsF, progressive segmented frame). This might be the case with Big Bang, but then the end result is effectively progressive, keeping compatibility with non-progressive devices.
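For anyone curious what "progressive embedded in interlaced" means in practice: each progressive frame is simply split into its odd and even scan lines and carried as two fields, so a plain weave puts the original frame back together exactly. A toy sketch, not tied to any particular show or camera:
Code:
# Toy illustration of PsF: split a progressive frame into two fields, then weave them back.
# A "frame" here is just a list of scan lines.
frame = [f"line {i}" for i in range(1080)]

# Segment into fields: even-numbered lines in one field, odd-numbered lines in the other.
top_field = frame[0::2]
bottom_field = frame[1::2]

# A plain weave deinterlace interleaves the fields again.
rebuilt = [None] * len(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

assert rebuilt == frame  # nothing lost: the stream was progressive all along
# Genuinely interlaced video samples the two fields at different instants in time,
# which is why it can't be reassembled this cleanly.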
The icon is just "HD" regardless of the resolution. Of note, the left side of the icon is clipped off; this wasn't the case in any previous version of iTunes.
And iTunes now automatically assigns this icon to media if it's 720p or above. Previously if you added your own HD content, you had to use Subler to manually add the HD icon.
And I hate that. Because I have a few 720p things, and that's not HD to me. I don't want to see them as such. My intent is to move my entire catalogue to 1080p, and I can't do that as easily when I have to check every time whether it's a 1080 file or not.
Wouldn't that be because the file has both HD and SD versions in it? Not anything to do with the resolution itself?
Do you "hate" watching everything broadcast on ABC, Fox, and ESPN? You are the first person I've come across who would say those networks aren't HD.
Regarding television programs which are broadcast in 1080i (CBS, NBC, PBS), are the iTunes versions truly going to be 1080p? The two aren't the same thing unless it's a still image. Also, I believe ABC/ESPN chose 720p over 1080i because 60 full frames per second handles fast-moving sports better.
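For what it's worth, the raw pixel throughput of the two broadcast formats is in the same ballpark; the real trade is spatial detail versus motion updates. A back-of-the-envelope comparison (nominal figures, ignoring compression and interlacing artifacts):
Code:
# Nominal pixel throughput of the two US HD broadcast formats (back-of-the-envelope only).
formats = {
    "720p60":  (1280, 720, 59.94),   # progressive: ~60 full frames per second
    "1080i30": (1920, 1080, 29.97),  # interlaced: 59.94 fields/s, ~30 full frames/s
}

for name, (width, height, frames_per_sec) in formats.items():
    mpixels_per_sec = width * height * frames_per_sec / 1e6
    print(f"{name}: {mpixels_per_sec:.1f} Mpixels/s")
# 720p60:  ~55.2 Mpixels/s, refreshed ~60 times a second (favours fast motion)
# 1080i30: ~62.1 Mpixels/s, but only ~30 distinct moments per second (favours static detail)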
The quality of the video depends on how it is shot. I know the video engineer on "Big Bang". He is using Sony F900 cameras to shoot the series. It's shot in 1080i. So watching it at 1080p will mean nothing. It won't look any better at 1080p since it wasn't shot with that increased definition.
They will use newer cameras next season so you may notice an improvement.
1080p does not have greater definition than 1080i. 1080p has more frame coherency but less temporal accuracy. On top of that, 1080p normally refers to 24 fps, while 1080i is roughly equivalent to 30 full frames (60 fields) per second.
More importantly though, with video captured originally as 1080i, distributing as 1080p allows the deinterlacing to be done on expensive hardware with carefully managed settings.
With that said, I'm surprised that BBT is shot digitally in 1080i. My guess would have been that real film was being used and then the final cut released at 1080i.
EDIT-------
Oops, I missed this reply before posting my reply. Thanks for the info!
Quote:
And I hate that. Because I have a few 720p things, and that's not HD to me. I don't want to see them as such. My intent is to move my entire catalogue to 1080p, and I can't do that as easily when I have to check every time whether it's a 1080 file or not.
Sounds like you prefer high resolution over greater frame rate. Note that some people prefer sports broadcast at 720p (60 fps).
Though if we're talking about movies and TV shows, 1080 does seem preferable. Frequently these are shot and mastered at 24p so the 720p60 standard offers nothing over 1080(i or p).
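And the reason 24p masters gain nothing from a 60 Hz channel is that the extra time slots just get filled with repeats, classically via 3:2 pulldown. A minimal sketch of the cadence (simplified: the repeated entries stand in for fields):
Code:
# 3:2 pulldown: map four 24p film frames onto ten 59.94i video fields by holding
# each frame for 3 fields, then 2 fields, alternating.
def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

film = ["A", "B", "C", "D"]  # four consecutive 24p frames
print(pulldown_32(film))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# 4 frames become 10 fields (24 frames/s -> 60 fields/s), but no new temporal
# information is created: the extra fields are just copies of existing frames.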
Do you "hate" watching everything broadcast on ABC, Fox, and ESPN? You are the first person I've come across who would say those networks aren't HD.
I don't really watch modern TV.
Quote:
Originally Posted by dfiler
Sounds like you prefer high resolution over greater frame rate. Note that some people prefer sports broadcast at 720p (60 fps).
Sounds like resolution and frame rate have nothing to do with one another.
I'd prefer 1080/60p content. People don't like that because "they move unnaturally". No, you're just idiots used to twenty-suck frames per second. But I digress.
Sounds like resolution and frame rate have nothing to do with one another.
I'd prefer 1080/60p content. People don't like that because "they move unnaturally". No, you're just idiots used to twenty-suck frames per second. But I digress.
This is an interesting topic but your manner of response makes me not want to engage in discussing it.
This is an interesting topic but your manner of response makes me not want to engage in discussing it.
Oh, you know what I mean. People like different things, and after, what, 100 years of 24 frames per second, people think that reality is unrealistic now. They demand slower frame rates because smoother ones look "unnatural". I don't hold any animosity toward people who think that, they're just wrong. Choosing to remain wrong in the face of the truth is idiocy. It's not an insult, just a descriptor.
People also demand movies show distant explosions (and any actions, really) at the exact same time as we hear them. Really takes away the immersion, but that's me.
Oh, you know what I mean. People like different things, and after, what, 100 years of 24 frames per second, people think that reality is unrealistic now. They demand slower frame rates because smoother ones look "unnatural". I don't hold any animosity toward people who think that, they're just wrong. Choosing to remain wrong in the face of the truth is idiocy. It's not an insult, just a descriptor.
People also demand movies show distant explosions (and any actions, really) at the exact same time as we hear them. Really takes away the immersion, but that's me.
It's definitely a perception issue that's been nurtured over nearly a century. Film at 24fps has been the standard since the 1920s, whereas higher frame rates tend to look like video, which people on a base level associate with home video.
Invariably, people who make movies aspire for them to look like "movies" as they've known them.
It will be a tough perception to shake. Peter Jackson's THE HOBBIT is going to be 60fps, though that is mostly to counterbalance the 3D.
Peter Jackson's THE HOBBIT is going to be 60fps, though that is mostly to counterbalance the 3D.
Will the 2D version still be 60, or will it be dropped to 30? And I heard James Cameron wants to only shoot in 60fps in the future, so between those two, we might actually see some change enacted.
Quote:
Originally Posted by dacloo
Not true, the show is shot progressive. Purely 1080i would have been totally unacceptable for a show like that. Check this list and look up Big Bang:
Except that the camera is not capable of shooting 1080p. Notice "1080/59.94i". The available format modes are:
HDCAM 1080/59.94i/29.97P/24P/23.98P
40 min. with BCT-40HD (59.94i Mode)
48 min. with BCT-40HD (50i/25P Mode, 25P: HDW-650P only)
50 min. with BCT-40HD (23.98PsF Mode, HDW-650F only)
http://pro.sony.com/bbsc/ssr/product-HDWF900R/
Watching the show on CBS, they broadcast 1080i.
Quote:
Does anyone know at what bit rate the average file is for the video and the audio?
Easy enough to figure out with a random selection, since iTunes tells you the file size and duration.
Well, for episode 17 of Happy Endings (a 30-minute comedy) it is:
Size: 854.1 MB
Profile: Low Complexity (AAC)
Channels: Stereo, Dolby Digital 5.1
Audio Bit Rate: 150 kbps
Total Bit Rate: 5405 kbps
Video Dimensions: 1920x1080
Awesome, thank you!
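Those figures hang together if you do the division: average bit rate is just file size over running time. A quick check below; the roughly 22-minute runtime is my assumption, since the post doesn't give the exact duration.
Code:
# Average bit rate from file size and duration (roughly what iTunes reports as "Total Bit Rate").
def avg_bitrate_kbps(size_mb, duration_s):
    """Average bit rate in kbit/s for a file of size_mb (binary) megabytes over duration_s seconds."""
    return size_mb * 1024 * 1024 * 8 / duration_s / 1000

size_mb = 854.1        # from the episode above
duration_s = 22 * 60   # assumed ~22-minute runtime for a "30-minute" comedy
print(f"{avg_bitrate_kbps(size_mb, duration_s):.0f} kbps")
# ~5428 kbps with these assumptions -- in line with the reported 5405 kbps total.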
http://mediainfo.sourceforge.net/en