AMD is planning to add this right onto their graphics cards, I hear. Maybe that's what this guy is talking about, since Apple is a big fan of ATI's cards?
Rather than spending $50 on a dedicated H.264 chip, I think it would make more sense to just use a $50 more expensive GPU. MacBook and mini customers would probably much prefer something like an 8300 over the IGP.
Quote:
Rather than spending $50 on a dedicated H.264 chip, I think it would make more sense to just use a $50 more expensive GPU. MacBook and mini customers would probably much prefer something like an 8300 over the IGP.
I read over on Doom9 that they came to the conclusion that the CPU is still a better processor for video decoding/encoding than the GPU.
I guess this makes sense when you think about the functions a GPU accelerates well: functions that can be processed in parallel. Sadly, video encoding is more serial, and with GOP (group of pictures) encoding you kind of need to know what the frames ahead look like. I'm sure it could be done (improved encoding on a GPU), but the work involved may not justify the effort.
The Ambarella chip sounds nice. I know the mini HD camcorders coming out have all but standardized on its A2 chip. The 3Dlabs DMS-02 should be another nice chip for portable applications.
Quote:
I read over on Doom9 that they came to the conclusion that the CPU is still a better processor for video decoding/encoding than the GPU.
I think he was just saying that a better video card would be more useful to the average consumer (games and so on), not that the video card would be better at processing video.
I'm not sure I agree, but I'm 99% sure that, as every other time, Dvorak is full of shit here anyway.
Quote:
I guess this makes sense when you think about the functions a GPU accelerates well: functions that can be processed in parallel. Sadly, video encoding is more serial, and with GOP (group of pictures) encoding you kind of need to know what the frames ahead look like. I'm sure it could be done (improved encoding on a GPU), but the work involved may not justify the effort.
I think you're a bit optimistic about video encoding on the GPU:
http://www.cs.ust.hk/~gchan/papers/ICME06_Motion.pdf
Perhaps, but knowing how GOP encoding is done, I don't see a GPU adding that much benefit over a CPU thus far. Even the ATI AVIVO card doesn't use the GPU for encoding. Tom's Hardware showed the CPU load was affected very little; it seems AVIVO just has a fast encoder that merely uses the GPU as a "dongle".
Actually, I do think that GPUs could accelerate video encoding. Don't think about encoding multiple frames in parallel; think about encoding all the macroblocks within a frame in parallel.
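To make the macroblock point concrete, here's a toy Python sketch (all names, the 4x4 block size, and the thread pool standing in for GPU threads are mine for illustration, not from any real encoder): motion estimation for each block depends only on that block, so the per-block searches are an embarrassingly parallel map.

```python
# Toy illustration of per-macroblock parallelism in motion estimation.
# Real encoders use 16x16 macroblocks and far smarter searches; 4x4 and
# an exhaustive one-pixel window keep the sketch small.
from concurrent.futures import ThreadPoolExecutor

MB = 4  # toy "macroblock" size

def sad(ref, cur, bx, by, ox, oy):
    """Sum of absolute differences between the current block at (bx, by)
    and the reference block shifted by candidate vector (ox, oy)."""
    return sum(
        abs(cur[by + y][bx + x] - ref[by + y + oy][bx + x + ox])
        for y in range(MB) for x in range(MB)
    )

def best_vector(ref, cur, bx, by, search=1):
    """Exhaustive search in a small window; reads only this block's data,
    so every block's search is independent of the others."""
    h, w = len(ref), len(ref[0])
    candidates = [
        (sad(ref, cur, bx, by, ox, oy), (ox, oy))
        for oy in range(-search, search + 1)
        for ox in range(-search, search + 1)
        if 0 <= by + oy and by + oy + MB <= h
        and 0 <= bx + ox and bx + ox + MB <= w
    ]
    return min(candidates)[1]

def motion_field(ref, cur):
    """One motion vector per macroblock; the map() below is the part that
    would fan out across GPU threads."""
    blocks = [(bx, by)
              for by in range(0, len(cur), MB)
              for bx in range(0, len(cur[0]), MB)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda b: best_vector(ref, cur, *b), blocks))
```

For a frame whose content shifts one pixel to the right, interior blocks come back with vector (-1, 0), i.e. "this block came from one pixel to the left in the reference frame".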
According to some of the video freaks over at Doom9, the CPU is still best for decoding H.264, based on tests among members. Sorry, I don't have a link; maybe I'll have time to search this week.
There are some nice encoding/decoding chips coming, but currently they're a bit expensive. Fujitsu is putting some work into H.264 products. It looks like they want to become a fixture in the STB and DVR market.
http://www.engadget.com/2007/05/21/f...-a-worlds-fir/

Quote:
Fujitsu just announced a world's-first H.264 chip capable of encoding/decoding 1920 x 1080 (60i/50i) video in real time. The chip features 256MB of onboard FCRAM and an ultra-low 750mW power draw when encoding video. That means lickety-quick, MPEG-2-quality processing with only a third to half the required storage. The ¥30,000 ($247) MB86H51 chip is available to OEMs starting July 1st, after which you'll find it bunged into the latest up-scale, consumer-class video recorders.

http://www.fujitsu.com/global/news/p...070521-01.html
$$$. When this performance drops to under $100, things get fun.
I think for Apple to include something like this, the chip would have to be no more than $25 in quantity. You also have to wonder whether a separate chip will make sense even in two years, when the base computer will be quad-core and high-end units will be 16-core SMP systems.
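The "a third to half the required storage" claim is easy to ball-park. A quick sketch, assuming an illustrative 12 Mbit/s for HD MPEG-2 (my number, not the thread's or Fujitsu's):

```python
# Back-of-the-envelope storage comparison: H.264 vs MPEG-2 for one hour
# of HD video. The 12 Mbit/s MPEG-2 bitrate is an illustrative assumption.
def gb_per_hour(mbit_per_s):
    # Mbit/s -> megabits/hour -> megabytes/hour -> decimal GB/hour
    return mbit_per_s * 3600 / 8 / 1000

mpeg2 = gb_per_hour(12)      # roughly 5.4 GB for an hour of HD MPEG-2
h264_half = gb_per_hour(6)   # similar quality at about half the bitrate
h264_third = gb_per_hour(4)  # ...or about a third

print(mpeg2, h264_half, h264_third)
```

So an hour that needs around 5.4 GB as MPEG-2 fits in roughly 1.8 to 2.7 GB as H.264, which is the whole appeal for DVRs and camcorders.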
They probably aren't the best solution, but even good software can do wonders. Compare VLC and QT on HD H.264: on my older PC, 720p H.264 would struggle in QT but play fine in VLC. I've never tried 1080p, as I don't have a monitor that goes that high.
Having H.264 decoding and nice DSP capabilities in all Macs would be a nice feature.
Decoding 1080p H.264 barely makes a Mac notebook sweat. I've had a 1.83GHz MBP decode and play two 1080p streams simultaneously, in real time, without issue; it only stutters when I try to play a third. They aren't Cells, but the Core Duo offers decent vector units. A low-power encoder would be nice, but the premise of this thread, H.264 decoding in a dedicated chip, seems pointless to me now, unless it's a very small, low-power chip to help extend video playback time in a notebook, and I don't know if that trade-off is worth it.
Hardware ENCODING would be much more useful and time-saving than hardware decoding. I can play DVDs just fine on my Mac, it's encoding video content to MP4 that takes forever and burns up my CPU.
Quote:
AMD is planning to add this right onto their graphics cards, I hear.
I thought it already has it. That's why, a while back, Mac users were wondering whether this capability is even enabled under Mac OS X.
OTOH, http://www.ambarella.com/
(Although we haven't seen the new iMacs yet... I doubt it.)
Doesn't the new MBP with the 8600M graphics card offer hardware H.264 decoding?
Yes. And the ATI Radeon HD 2xxx series does as well.
http://www.anandtech.com/video/showdoc.aspx?i=2977&p=8
Quote:
Doesn't the new MBP with the 8600M graphics card offer hardware H.264 decoding?
Yes, but Apple probably doesn't use it.