Blu-Ray Technology on the PowerMac?


Comments

  • Reply 21 of 202
    wmf Posts: 1,164 member
    It's hard to say much about picture quality since that is mostly in the hands of the content creators. Both formats will allow super high quality, but they also allow low quality.
  • Reply 22 of 202
    If both formats use the same codec (which seems to be true at this point), then the Blu-Ray theoretically could have better picture-quality because of its higher storage capacity. Blu-Ray can fit a full-length movie with less compression.
  • Reply 23 of 202
    programmer Posts: 3,458 member
    Quote:

    Originally posted by wmf

    It's hard to say much about picture quality since that is mostly in the hands of the content creators. Both formats will allow super high quality, but they also allow low quality.



    Picture quality depends almost entirely on bandwidth. Blu-Ray and HD-DVD both offer just over triple the existing DVD bandwidth, which caps both new technologies at roughly 3 times as good as DVD; of course, nothing compels the content provider to use the full bandwidth available. Sadly, that 3x improvement is roughly what it takes to go from the existing 640x480 to HD's 1280x720, and considerably less than the eventual 1920x1080 (which requires an almost 7x improvement). I had hoped they would set a new standard that would hold up a fair time into the future.
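
    A quick back-of-envelope check of those ratios (a minimal Python sketch; it only compares raw pixel counts, ignoring frame rate and codec differences):

        # Raw pixel-count ratios behind the "3x" and "almost 7x" figures.
        sd = 640 * 480          # DVD-class frame, as cited above
        hd_720 = 1280 * 720     # 720-line HD frame
        hd_1080 = 1920 * 1080   # 1080-line HD frame

        print(f"720p vs SD: {hd_720 / sd:.2f}x")    # 3.00x
        print(f"1080 vs SD: {hd_1080 / sd:.2f}x")   # 6.75x, i.e. "almost 7x"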
  • Reply 24 of 202
    hmurchison Posts: 12,425 member
    Quote:

    I had hoped they would set a new standard that would hold up a fair time into the future.





    They'll milk us for a decade or so and then we'll have to replace our HD-DVD stuff with the new Ultra HD 4K stuff. Of course, by then the average screen size will probably be around 8 feet, with 16-20 feet being the aficionado's choice.
  • Reply 25 of 202
    bunge Posts: 7,329 member
    Quote:

    Originally posted by Programmer

    Picture quality depends almost entirely on bandwidth.



    We have to consider that the MPEG-2 format used in DVDs is greatly inferior to current compression technology. With that in mind, the 3x vs. 7x gap isn't such a great leap.
  • Reply 26 of 202
    eugene Posts: 8,254 member
    On the topic of quality, DTV stations are already demonstrating this by trying to multicast several SD sub-channels + an HD sub-channel on one channel. They just up the compression and think nobody cares.
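
    To see why multicasting squeezes the HD feed, here's a rough bit-budget sketch. The ~19.39 Mbps ATSC payload is real; the subchannel allocations are illustrative guesses, not actual station data:

        # Hypothetical bit budget for one ATSC channel carrying
        # two SD subchannels plus one HD subchannel.
        channel_mbps = 19.39               # fixed ATSC payload
        sd_each_mbps = 3.0                 # plausible MPEG-2 SD rate (assumed)
        hd_mbps = channel_mbps - 2 * sd_each_mbps
        print(f"HD subchannel is left ~{hd_mbps:.1f} Mbps")   # ~13.4 Mbps
        # Good MPEG-2 HD generally wants 15+ Mbps, so compression goes up.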



    What's going to happen is that studios will make even more versions of each movie. You know how there are platinum, limited, special, and deluxe editions of DVDs? Sony also releases "Superbit" DVDs, where they use higher-bitrate MPEG-2 encoding and leave out the extras. That's Sony's prep for doing the same with Blu-Ray Disc. The initial BD release of a film is going to have tons of useless extras and will be encoded in multiple formats for convenience. Then they'll release the higher-bitrate MPEG-2 version, except this time the difference in video quality will probably be noticeable, which is less the case with "Superbit" DVDs right now.



    It's too early for me to worry about this stuff anyway. Home products based on either aren't going to be in the hands of early adopters until probably 2006. Not only that, but the format war is going to cause a total lack of content well beyond that...
  • Reply 27 of 202
    mello Posts: 555 member
    QUAD-LAYER Blu-Ray Discs!!



    Click here for story.
  • Reply 28 of 202
    marzetta7 Posts: 1,323 member
    Given the advent of dual and quad layered Blu-Ray Discs and the H.264 codec, do you guys think 1080p is a feasible HD format relatively soon? Check out this interesting article:



    http://www.cedmagazine.com/ced/2004/0604/06e.htm



    Let's hope Philip Garvin's attention to detail is echoed by others in the industry and that our bandwidth infrastructure can handle it. I am interested in seeing how well H.264 will be able to compress a format like 1080p, and whether broadcasters can manage to push it that way. Anybody have any other insight into 1080p?
  • Reply 29 of 202
    hmurchison Posts: 12,425 member
    Quote:

    Originally posted by marzetta7

    Given the advent of dual and quad layered Blu-Ray Discs and the H.264 codec, do you guys think 1080p is a feasible HD format relatively soon? Check out this interesting article:



    http://www.cedmagazine.com/ced/2004/0604/06e.htm



    Let's hope Philip Garvin's attention to detail is echoed by others in the industry and that our bandwidth infrastructure can handle it. I am interested in seeing how well H.264 will be able to compress a format like 1080p, and whether broadcasters can manage to push it that way. Anybody have any other insight into 1080p?




    I don't think it's just feasible; I think it's essential. Manufacturers know right now that 1080p is the Holy Grail, and it will show up in their top products to woo the people who have to have it all. MPEG-2 1080p/24 likely needs about 40 Mbps of throughput, but once you add in the HE codecs you can squeeze that back down to a broadcastable 19 Mbps. However, as the article states, most STBs aren't ready for the newer codecs. I think 1080p hits recorded formats first; broadcasters might move that way if they want to, but as one guy said, they'd rather multiplex 3 channel$.
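
    As a rough sanity check on those numbers (a hedged sketch; the ~2:1 efficiency gain of H.264 over MPEG-2 is the commonly cited ballpark, not a measurement):

        # Rough check of the bitrates cited in the post above.
        mpeg2_1080p24_mbps = 40.0   # figure from the post
        h264_gain = 2.0             # assumed ~2x efficiency over MPEG-2
        h264_mbps = mpeg2_1080p24_mbps / h264_gain
        print(f"H.264 1080p/24 estimate: ~{h264_mbps:.0f} Mbps")
        # ~20 Mbps -- right around the ~19.39 Mbps ATSC payload,
        # which matches the post's "broadcastable 19 Mbps".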
  • Reply 30 of 202
    wmf Posts: 1,164 member
    At the same frame rate, 1080p uses the same bandwidth as 1080i. It is almost certain that Blu-ray and HD-DVD movies will be 1080p -- there's no downside.
  • Reply 31 of 202
    hmurchison Posts: 12,425 member
    Quote:

    Originally posted by wmf

    At the same frame rate, 1080p uses the same bandwidth as 1080i. It is almost certain that Blu-ray and HD-DVD movies will be 1080p -- there's no downside.



    No, actually, the bandwidth doubles when you move from interlaced to progressive. If the bandwidth stayed the same, broadcasters wouldn't be mulling over the choice of



    Interlaced vs Progressive



    Quote:

    1. Interlace doubles the vertical resolution for a given bandwidth and frame rate.

    2. P requires more bandwidth or channel capacity than I for the same resolution.

    3. We have to have interlace so that we can have cheaper receivers.

    4. P raises costs for broadcasters.

    5. No one knows how to make P cameras with adequate SNR.

    6. Many programs that will be used for SDTV transmission already exist in NTSC format.



    1080p is a must for the next-gen recorders/players, but we may be waiting on broadcasters for some time.
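
    In raw, uncompressed terms the doubling hmurchison describes is easy to see (a minimal sketch; compression changes the picture considerably, as the next replies argue):

        # Raw line rates at a 60 Hz refresh: each 1080i "frame" is two
        # 540-line fields, while 1080p delivers 1080 full lines each time.
        lines_1080i = 540 * 60     # 60 fields/s of 540 lines
        lines_1080p = 1080 * 60    # 60 frames/s of 1080 lines
        print(lines_1080p / lines_1080i)   # 2.0 -- double the raw data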
  • Reply 32 of 202
    If you made a standard VHS tape using dual-layer blue-laser DVD technology, how much information could it store in terms of gigabytes, I wonder?
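
    Purely as a back-of-envelope guess, every figure below is an assumption: ~25 GB per blue-laser layer on a 12 cm disc, a T-120 tape of roughly 246 m x 12.7 mm, and the heroic assumption that disc areal density would transfer to tape:

        import math

        # Back-of-envelope scaling of blue-laser areal density onto VHS tape.
        bd_layer_gb = 25.0                        # one blue-laser layer (assumed)
        disc_area = math.pi * (5.8**2 - 2.4**2)   # usable band ~24-58 mm radius, cm^2
        tape_area = 24600 * 1.27                  # T-120: ~246 m x 12.7 mm, cm^2
        tb = bd_layer_gb * tape_area / disc_area / 1000
        print(f"~{tb:.0f} TB per layer, ~{2 * tb:.0f} TB dual-layer")   # ~9 / ~18 TB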
  • Reply 33 of 202
    northgate Posts: 4,461 member
    Quote:

    Originally posted by wmf

    At the same frame rate, 1080p uses the same bandwidth as 1080i. It is almost certain that Blu-ray and HD-DVD movies will be 1080p -- there's no downside.



    The studios will definitely see a downside. Movies like the new Star Wars, Once Upon a Time in Mexico, Collateral, etc. are MASTERED in 1080P. The studios will not want a master-quality version of their film out in the public; they will always want the public version to be lower resolution than the master.



    Also, many studios are re-mastering their prints at 1080P. All "consumer" versions of that master will always be of lower quality, whether 1080i or 720P.



    Many of the studios already have a hierarchical policy as follows:



    MASTER (2K or 1080P from either 35mm print or HD origination)

    HD Consumer Release (1080i@60 or 720P@24)

    DVD Release (480i and 480P)

    VHS Release
  • Reply 34 of 202
    northgate Posts: 4,461 member
    Also, I'd like to make another point. I'm a filmmaker getting ready to shoot a feature in the 1080P@24 HDCAM-SR format. There's a 30% chance this film will see theatrical distribution outside the US, so I need this much resolution to ensure a high-quality film print. Most people, however, will probably only see the movie on standard-def DVD.



    With that said, I must admit that the idea of a 1080P@24 version of the film out in the public scares the shit out of me. Motion picture exhibition is pretty much a closed circuit: 99.9% of the consumer public can't do a damn thing with a copy of a movie's film print. The worst they can do with a stolen film print is project it in a theater, aim a camcorder at it, and distribute it on DVD, and at least that "copy" will be of the absolute worst quality. But MILLIONS of people can do mischievous things with a pristine 1080P copy. D-VHS and future HD-DVD formats will only amplify this problem.



    As a content producer, I worry about these things. My internal policy is to release only a 720P version of the film in HD (along with standard-def DVD and VHS). Most people don't even have HD sets capable of displaying 1080i, let alone progressive. Also, 720P looks fantastic, so I don't feel like I'm taking much away from film aficionados by down-res'ing the film.
  • Reply 35 of 202
    mr. me Posts: 3,221 member
    Quote:

    Originally posted by hmurchison

    No, actually, the bandwidth doubles when you move from interlaced to progressive. If the bandwidth stayed the same, broadcasters wouldn't be mulling over the choice of



    Interlaced vs Progressive



    ....




    This is not true. You are thinking analog. HD is now digital and compressed: the data transmitted is the difference between successive frames. Unless the camera is moving like a Mexican jumping bean, most HD video will have only small differences between frames. As a result, 1080p will require more bandwidth than 1080i, but it won't be nearly double.



    I see the real trade-off as the processor power required to encode and decode the data stream. Whether your display is 1080p or 1080i, you are displaying the same number of pixels; however, 1080p requires the codec to produce twice as many pixels per second as 1080i. Seen in this light, 1080i requires slightly more codec power than 720p, but 1080p will require more than twice the power required for 1080i. Better codecs may well mitigate the extra bandwidth requirements of 1080p content, but better codecs require even more computing power. If you are waiting for the adoption of 1080p content, I believe you are going to have a very long wait.
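
    Putting that pixels-per-second comparison into numbers (a minimal sketch of raw decoder output; real codec cost also depends on the bitstream):

        # Raw pixels a decoder must produce per second for each format.
        formats = {
            "720p60":  1280 * 720 * 60,    # ~55.3M
            "1080i60": 1920 * 1080 * 30,   # fields pair into 30 frames/s, ~62.2M
            "1080p60": 1920 * 1080 * 60,   # ~124.4M
        }
        for name, pps in formats.items():
            print(f"{name}: {pps / 1e6:.1f} Mpixel/s")
        # 1080i60 is only ~12% above 720p60, while 1080p60 doubles 1080i60,
        # matching the trade-off described above.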
  • Reply 36 of 202
    wmf Posts: 1,164 member
    1920 x 1080 x 30 frames/s = 1920 x 540 x 60 fields/s



    At the same frame rate, 1080p uses the same bandwidth as 1080i. (For some reason, that article was assuming 60 frames/s 1080p, which is indeed a little crazy. But movies are all 24 frames/s anyway.)
  • Reply 37 of 202
    mr. me Posts: 3,221 member
    Quote:

    Originally posted by wmf

    1920 x 1080 x 30 frames/s = 1920 x 540 x 60 fields/s



    At the same frame rate, 1080p uses the same bandwidth as 1080i. (For some reason, that article was assuming 60 frames/s 1080p, which is indeed a little crazy. But movies are all 24 frames/s anyway.)




    hmurchison is correct: 1080p/60fps is the goal. However, current technology limits 1080p to 30fps. Since the frame rate of most commercial filmed content is 24fps, that limitation is of no consequence to the electronic presentation of most commercial films, nor does it impede the substitution of digital production for chemical film in commercial cinema. However, 1080p/60fps will allow high-definition production with a minimum of motion artifacts. It will also allow the most faithful electronic conversion of IMAX films, which are also 60fps.
  • Reply 38 of 202
    mello Posts: 555 member
    I don't know if creating movies at 60 frames per second at 1080p would be financially possible. Can you imagine the extra effort to do ROTK or Star Wars 3 if they recorded at 60 fps instead of 24 fps? They would need a render farm three times the size of what they have now, plus extra artists to erase wires and clean up scenes. I saw a little documentary on Gollum where the artists painstakingly erased the actor who played Gollum from each frame of a fight scene with Frodo in order to replace him with the digital Gollum. It would add 50 to 100 million bucks to the fx budget.
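
    For what it's worth, the raw frame-count math comes out to 2.5x rather than 3x (a trivial sketch; how costs actually scale is anyone's guess):

        # Frames to render for a 2-hour film at 24 fps vs 60 fps.
        frames_24 = 24 * 60 * 60 * 2    # 172,800 frames
        frames_60 = 60 * 60 * 60 * 2    # 432,000 frames
        print(frames_60 / frames_24)    # 2.5x the frames to render and clean up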
  • Reply 39 of 202
    onlooker Posts: 5,252 member
    Why they want 60 fps in movies is the real question; I've never even heard of this. You can only see 30 fps (evenly timed) max anyway. It's not like gaming, where fps goes up and down and there are bog-down periods, so the higher your fps, the better the chance you won't drop below 30. It seems totally unnecessary for movies.





    IMAX SUCKS.
  • Reply 40 of 202
    northgate Posts: 4,461 member
    "Oklahoma" was shot twice. A 65mm/70mm Todd-AO 30fps version and a 24fps version for 35mm release prints.



    Rather than film two significantly different versions, as was necessary in Oklahoma!, Todd's director of photography, Lionel Linden, A.S.C., shot both versions in 65mm. In some cases he used two identical Todd-AO cameras and lenses side by side, one running at 30fps for the 70mm version, and the other running at 24fps for the 35mm reduction print, in other cases the same camera was used by changing the camera speed, and in some cases, the material photographed by a single camera was used in both versions, though this could not be done with dialog sequences.



    Also, numerous studies have shown that 30fps looks exceptionally better on the big screen, mainly because of the reduction in flicker, especially on anamorphic releases. The studies even showed that changing the governors on the projectors would be a nominal expense for theater owners, but the industry pretty much ignored them. The industry's chief complaint was the increased expense of film stock: roughly 30% more film for shooting, 30% more for prints, and 30% more reels to ship per film.