Can Macs play 1080p and 720p video?

Posted in General Discussion, edited January 2014
I'm not trolling, just curious. The reason I ask is that I was browsing for some HD-quality clips on my PC and came across this.



I was shocked to notice this:



Quote:

...Although other system configurations may be able to playback this content, for an optimal experience we recommend systems of at least a 2.53 GHz Intel or AMD Athlon XP 2200+ or higher processor for 720p and 3.0GHz or greater processor for 1080p. For either scenario, an AGP4x based NVIDIA or ATI video adapter card with at least 32 MB of RAM and the most recent OEM driver updates is recommended.



The higher the data rate (in Mbps), the higher the resource requirement. For optimal playback of 1080p content, an AGP8x 128MB video card such as the ATI Radeon 9800 Pro series is required. Learn more about Windows Media High Definition.



I am curious because I want to know how this translates to QuickTime on the Mac and high-definition viewing. The Cinema HD display is nice, but is there a Mac capable of driving it to its full potential (i.e. 1080p HD video)? We all know that any G4 is lacking and bandwidth-starved and cannot compare to the power of a 3.0GHz P4. I understand QuickTime has some AltiVec optimizations, but is it enough to decode high-quality MPEG-4 or Sorenson 3 content? Is a G5 required? How does Pixlet play into this?

Comments

  • Reply 1 of 12
    applenut Posts: 5,768
    I know Pixlet requires a 1GHz G4, and that's like half HD, right?



    So I'd say it'd probably require a G5 to do HD.



    I don't know, though.
  • Reply 2 of 12
    I have this trailer for Matrix Reloaded that is 1000x540, encoded in Sorenson Video 3 with MPEG-4 audio. It plays fine (well, mostly -- it skips very briefly in a couple of spots) on my dual 500 G4.



    What exactly are the specifics on HD video? It's not too much larger than that, is it?



    If video of this size plays okay on my three-year-old Mac, I doubt you'd need a much more powerful computer for HD.
  • Reply 3 of 12
    applenut Posts: 5,768
    Well, are they talking uncompressed HD?





    I'm confused... is HD just a resolution, or is it a format/specification?
  • Reply 4 of 12
    existence Posts: 991
    Quote:

    Originally posted by applenut

    Well, are they talking uncompressed HD?





    I'm confused... is HD just a resolution, or is it a format/specification?




    It's just HD resolution, in their case compressed with WME 9.



    1080p HD resolution is 1920x1280, which is about 2.46 million pixels. That Matrix trailer is 1000x540, which is 540,000 pixels, or roughly 4.5 times less detailed. I'm guessing that if it barely plays on a dual 500, you'd need five times that power for 1080p to play well. Serious stuff.



    What about MPEG-4? Since Apple has adopted MPEG-4, its playback might be better optimized for AltiVec, etc...
  • Reply 5 of 12
    OK, stop.



    1) 1080p content isn't even being produced currently. There are no 1080p cameras on the market right now. It's a format that has been discussed for years; there was a lot of talk about it around 1997.



    2) 1080p, in theory, would have a progressively scanned resolution of 1920x1080... that's where the "1080" in "1080p" comes from. But like I said above, we don't have 1080p. And don't hold your breath for it anytime soon.



    What we do have are 720p and 1080i. 720p is progressively scanned with a resolution of 1280x720. 1080i is an interlaced signal with a resolution of 1920x1080.



    You have to understand that 1080i is 60 fields a second, equaling 30 full frames per second. An interlaced device draws all the odd horizontal lines, then all the even horizontal lines. So you are never, at any given instant, seeing both the odd and even fields of video at the same time. Since they are scanned so quickly, they appear as a single frame.
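
    If a sketch helps, here's a toy illustration in Python of how the two fields interleave (purely illustrative; `weave` is a made-up helper, not any real API, and real deinterlacers also have to cope with motion between fields captured 1/60s apart):

        def weave(odd_field, even_field):
            """Interleave two 540-line fields into one 1080-line frame."""
            frame = []
            for odd_line, even_line in zip(odd_field, even_field):
                frame.append(odd_line)   # frame lines 1, 3, 5, ...
                frame.append(even_line)  # frame lines 2, 4, 6, ...
            return frame

        odd = [("odd", i) for i in range(540)]
        even = [("even", i) for i in range(540)]
        assert len(weave(odd, even)) == 1080  # one full 1080-line frame

        # 60 fields per second pair up into 30 full frames per second.
        assert 60 // 2 == 30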



    Progressively scanned images are displayed one whole frame at a time. This is why, depending on who you ask, people will say 720p is a better format than 1080i. When I was working at Turner, it was the early days of professional HD kit rolling out. I have seen a LOT of HD content on professional monitors, and to be quite honest, I felt the 1080i looked better than the 720p... but sometimes it was the other way around.



    I digress...



    I have played back compressed 720p content on my Mac. I run an SGI 1600SW, which has a native resolution of 1600x1024, so the content could be displayed natively.



    To play back 1080i content, you have to have a display with a resolution of 1920x1080 or higher. The Apple Cinema HD Display will show it natively. You can play it back compressed, no problem. Uncompressed, or even with a lossless codec, you're going to want some special hardware and some wicked-fast disk arrays to play the content back. You should look at the kit used by Weta for LotR. They play back 2K film on their workstations in real time, and it's a hell of a lot more data than HD.



    Anyhow, I'm hoping this answers your questions...
  • Reply 6 of 12
    Quote:

    Originally posted by LoCash

    I have played back compressed 720p content on my Mac.



    ...and it should be noted that Jack has a dual 450 G4.
  • Reply 7 of 12
    eugene Posts: 8,254
    Quote:

    Originally posted by Brad

    ...and it should be noted that Jack has a dual 450 G4.



    I've played 1280x720x30fps encapsulated MPEG-2 files with no skips too.



    A while back there were 720p T3, Matrix, ESPN-HD and Charlie's Angels clips floating around.
  • Reply 8 of 12
    xmoger Posts: 242
    I know 1080p isn't going to be broadcast over the air or through satellite anytime soon. But since we're talking about computer playback, it will have to be converted to progressive for display. A lot of content will need to be deinterlaced or inverse-telecined, but the T2 HD DVD had a progressive encoding, I believe, and it would have been 1920x1080p if the movie weren't in 2.35:1 format.
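
    For what it's worth, the inverse-telecine step is just undoing the 3:2 pulldown that turned 24fps film into 60 fields/s. A toy Python sketch of the pattern (illustrative only, nothing tool-specific):

        # 3:2 pulldown: every 4 film frames (A, B, C, D) become 10 video
        # fields; inverse telecine detects and undoes this pattern to
        # recover the original progressive frames.
        film = ["A", "B", "C", "D"]
        fields_per_frame = [3, 2, 3, 2]
        fields = [f for frame, n in zip(film, fields_per_frame)
                  for f in [frame] * n]
        assert fields == ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
        assert len(fields) == 10  # 24 frames/s x 10/4 = 60 fields/s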



    Here's some sample 720p and 1080i content. It has a much higher bitrate than the Windows Media samples but doesn't require as much CPU power (it's only MPEG-2). Although the comparison isn't entirely fair, because MPEG-2 is hardware-accelerated on a lot of cards.
  • Reply 9 of 12
    matsu Posts: 6,558
    Just a note: 1080p is not 1920x1280, it's 1920x1080: about 2.07MP per frame, times 24, 30, or 60 frames, depending.



    But will any Macs play UltraHD? 4000x2000!!!



    See, this is silly; the equipment to do it (just plain old HDTV) won't reach reasonable prices for a few years yet, and even G5s and dual Xeons will be old hat by then.



    As for playback, I'm pretty sure that DP G4s (about dual 800 and up) and G5s will be fine for MPEG-2 playback in real time.



    NOTHING short of a 4-drive RAID can deliver the speed needed to treat raw 1080p data (from a telecine or a Raptor ONLY, at this point) in real time. So, since the drives can't keep up, there's no use talking about the CPU in that case. Think of the math for just 24fps.



    1920x1080 = 2,073,600 pixels.

    3CCD "film/video" camera, 8 bits per channel = 24-bit color.

    2,073,600 x 24 = 49,766,400 bits per frame,
    times 24 frames = 1,194,393,600 bits per second,
    = 149,299,200 bytes per second,
    or about 150MB per second!!! (depending on how you divide your MBs).



    I can't even think of a 2-drive setup that can deliver that sustained data rate. Four striped drives, minimum, and they'd probably wilt under the strain after a day's worth of work. Ouch. And the Raptor camera I mentioned earlier can even double that rate by recording to a full 48-bit color space.
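
    If you want to check my arithmetic, here it is as a few lines of Python (nothing assumed beyond the figures above):

        pixels_per_frame = 1920 * 1080              # 2,073,600 pixels
        bits_per_frame = pixels_per_frame * 24      # 24-bit color: 49,766,400
        bits_per_second = bits_per_frame * 24       # 24fps: 1,194,393,600
        bytes_per_second = bits_per_second // 8     # 149,299,200, ~150MB/s

        # The Raptor's 48-bit color space doubles the rate:
        print(bytes_per_second, bytes_per_second * 2)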



    I know that's not what you're thinking of, but still, dealing with raw footage in real time remains an EXPENSIVE proposition.



    As for compression schemes, MPEG-2 gets help from the video card in most cases, and Apple has been very adept at getting MPEG-4/QT codecs to deal with high res. Most of the QT video I see out there looks brilliant (when it comes from a good source), and though Apple seems uninterested in it, even the DivX/WMP/alphabet soup of warez Windows codecs can be played back nicely with VLC, which keeps getting updated.



    Pixlet, I believe, is NOT a sharing/streaming/distribution technology. It's strictly for editing: it brings HD content down to MiniDV data rates and makes it manageable, if still demanding.
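
    A back-of-the-envelope check of what "MiniDV data rates" buys you, using my ~149MB/s raw 1080p24 figure from above and MiniDV's nominal 25Mbit/s video stream (both ballpark assumptions, not Apple's published numbers for Pixlet):

        raw_rate = 149_299_200                # bytes/s, raw 1080p24 from above
        minidv_rate = 25_000_000 // 8         # 25Mbit/s video ~ 3.1MB/s
        print(round(raw_rate / minidv_rate))  # implies roughly 48:1 compression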
  • Reply 10 of 12
    709 Posts: 2,016
    I remember reading a while back about a codec Apple was developing with AJA called 'Alpine'. Anyone know if this was/is Pixlet? Or might it be a different codec to work with the AJA Io?
  • Reply 11 of 12
    wmf Posts: 1,164
    Quote:

    Originally posted by LoCash

    1080p content isn't even being produced currently. There are no 1080p cameras on the market right now.



    George Lucas shoots 1080p with Sony cameras.



    http://www.henninger.com/library/hdtvfilm24/
  • Reply 12 of 12
    I didn't realize that camera was actually available to anybody with the money to rent it (at least, it wasn't yet when I read about it years ago). I stand corrected.