
Blu-Ray Vs Red-Ray

post #1 of 22
Thread Starter 
This is Red Ray...


This is a digital disk (and solid state) media player.

And despite the title of the post, it is not in direct competition with Blu-Ray. It's not a consumer format. It's not even a fancy disk format. It's simply a compact high-definition digital-cinema player.

RedRay will do something that BluRay cannot. It will play movies at 4K from a box about the same size as a DVD drive - to a suitably equipped 4K digital projector.

By way of comparison, Blu-Ray's image size is 1920 x 1080 pixels.
"2K" resolution is a bit higher than that (Full Aperture Native 2K is defined as 2048 x 1556). Most experts consider a good cinema resolution to be around 2K.

4K has twice that linear resolution. So we are talking about an image size of roughly 12 megapixels versus Blu-Ray's 2-megapixel image.
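A quick sanity check on those pixel counts (assuming full-aperture 4K is 4096 x 3112, i.e. twice 2048 x 1556 in each dimension):

```python
# Pixel counts: Blu-Ray vs full-aperture 4K
bluray = 1920 * 1080              # 2,073,600 pixels, ~2 MP
four_k = 4096 * 3112              # twice 2048x1556 linearly, ~12.7 MP
print(bluray, four_k)             # 2073600 12746752
print(round(four_k / bluray, 1))  # 6.1 -- roughly 6x the pixels of Blu-Ray
```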

So a sensible question is: what's coming off the disk? RedRay uses a codec with a high compression ratio - and apparently very few compression artifacts. In a recent demo, Red showed an audience a 4K showreel and then revealed that the data stream was just 10 megabits per second. Not much more than a standard-def MPEG-2 DVD, although it can use much higher bit rates.
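For scale, here's what that 10-megabit stream works out to over a two-hour feature (a rough sketch; the 10 Mbps figure is from the demo as reported):

```python
bitrate_bps = 10e6          # 10 megabits per second, as demoed
runtime_s = 2 * 60 * 60     # a two-hour feature
size_gb = bitrate_bps * runtime_s / 8 / 1e9
print(size_gb)              # 9.0 -- roughly dual-layer-DVD territory for a 4K movie
```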

I am not suggesting this is some kind of nail in Blu-Ray's coffin ... but it does suggest that better technology is arriving all the time. So it's going to be tough to pick a standard and expect it to last for 10 or 20 years.

C.
post #2 of 22
Let's put this in its proper context: when RED showed the demo reel at 4K and then told people it was 10 megabits per second, it was reported that they got a few claps.

I think people may have assumed that they meant 10 megabytes per second. Ten megabits is nothing short of astonishing.

Not only was the RED Ray playing back content far more dense than today's consumer HD formats, it was also playing that content in 10-bit video, which offers a marked improvement in color.
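Two extra bits per channel is a bigger jump than it sounds: four times the levels per channel, and 64 times the total colors.

```python
# Levels per channel and total RGB colors at 8-bit vs 10-bit depth
for bits in (8, 10):
    levels = 2 ** bits      # discrete levels per color channel
    colors = levels ** 3    # total RGB combinations
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")
# 8-bit: 256 levels/channel, 16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors
```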

This is gobsmacking, fall-on-the-floor-and-giggle good stuff. The RED Ray is going to be about a thousand dollars, and frankly it's going to be one hell of a DSP or processing architecture bolted to a red-laser drive. Though the proof will be in the unpacking of the amazing codec they're using to play back glitchless video.

I'm assuming that this technology could probably do 1080p video at a 5 Mbps data rate and look fantastic.

I think the next decade will have similar playback devices and computers crunching out this data so that movie downloads are tiny for distribution yet still look great.

The revolution continues.
post #3 of 22
There are the Blu-Ray disc and Blu-Ray video formats, though, and they aren't tied together. So you could, for example, use the Red-Ray compression technology on a Blu-Ray disc and mail it out to a cinema so that it can play 4K video - the cinema would just need a proper decoder box with a Blu-Ray drive.

As for the consumers, it should be possible to cram a full 1080p movie onto a standard DVD disc and that could threaten Blu-Ray as it would mean that manufacturers could presumably sell 1080p DVD boxes with some updated firmware at a fraction of the cost of Blu-Ray drives.
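The arithmetic backs this up, assuming a codec good enough for 1080p at around 5 Mbps (the figure hmurchison floated above - an assumption, not a spec):

```python
dvd_capacity_bytes = 4.7e9    # single-layer DVD capacity
bitrate_bps = 5e6             # assumed 1080p bitrate with a Red-Ray-class codec
runtime_s = 2 * 60 * 60       # two-hour movie
movie_bytes = bitrate_bps * runtime_s / 8
print(movie_bytes / 1e9)                  # 4.5 -- just squeezes onto one disc
print(movie_bytes <= dvd_capacity_bytes)  # True
```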

I would doubt the movie industry will support something like this. I think Blu-Ray is here to stay at this stage. One area that may relegate it to archival use, though, is Video on Demand, again using lower-bitrate transmissions to deliver HD-quality movies.

720p over a 2 Mbit connection at higher-than-DVD quality (720p alone doesn't make it better than DVD if the quality per bitrate is bad) would make it much stronger competition for Blu-Ray.
post #4 of 22
Quote:
Originally Posted by Carniphage View Post


RedRay will do something that BluRay cannot. It will play movies at 4K from a box about the same size as a DVD drive - to a suitably equipped 4K digital projector.

By way of comparison, Blu-Ray's image size is 1920 x 1080 pixels.
"2K" resolution is a bit higher than that (Full Aperture Native 2K is defined as 2048 x 1556). Most experts consider a good cinema resolution to be around 2K.

4K has twice that linear resolution. So we are talking about an image size of roughly 12 megapixels versus Blu-Ray's 2-megapixel image.


You're assuming that 4K will be the next big thing in home cinema. I'd put money on the table that it won't. You'd literally have to have a 100" or greater screen (with a seating distance of less than 5 feet) to see any benefit over 1080p (which is nearly 2K). 4K offers zero benefit for home theaters.

Quote:
Originally Posted by Marvin View Post

As for the consumers, it should be possible to cram a full 1080p movie onto a standard DVD disc and that could threaten Blu-Ray as it would mean that manufacturers could presumably sell 1080p DVD boxes with some updated firmware at a fraction of the cost of Blu-Ray drives.

A DVD player would require significantly more than a firmware update to play the format. And as for cost, I paid $99 for a second BD player last fall. This fall, the cheap Chinese-branded players will be out and push the price envelope even lower. This isn't 2006, when the $600 PS3 was the cheapest BD player. How much cheaper do you want it?
post #5 of 22
Thread Starter 
Quote:
Originally Posted by infinitespecter View Post

You'd literally have to have a 100" or greater screen (with a seating distance of less than 5 feet) to see any benefit over 1080p (which is nearly 2K). 4K offers zero benefit for home theaters.

I sort of agree. And sort of disagree.

I have a 46" screen - and I use a closer-than-normal viewing distance - and yet I struggle to tell 720p from 1080p. It's possible that RedRay's 10-bit depth might make a more discernible difference, but that would be asking a great deal of the playback device.

Even if your home theater happens to be a full-sized movie theater, you'll struggle to tell the difference between 2K and 4K.

But the point of my OP was to show that the pace of innovation in video technology keeps accelerating. The very idea of a "standard" codec tied to a physical hardware standard is a mark against Blu-Ray.

The notion that any one standard will stick around for 10 or even 5 years seems to me to have much more to do with wishful thinking about Sony's business model than with delivering the very best in video technology.

So while I agree that 1080p is a "good enough" resolution for the majority of video applications, and we might still be happy with 1080p content in ten years time, it is the other part of the BluRay format that I have a problem with.

Placing each movie on a 12 cm disk of plastic is nowhere near good enough. Plastic digital disks were very cool when they were introduced in 1982, but a quarter of a century later, in the age of the iPod, these silver mini gramophone disks seem bizarrely anachronistic.

C.
post #6 of 22
The next jump would best be to 10-bit video.

We're only now seeing the final maturation of LCD technology utilizing 8-bit panels. I'm ready to see what can be done with 10-bit media and 10-bit panels.
post #7 of 22
Thread Starter 
Quote:
Originally Posted by hmurchison View Post

The next jump would best be to 10-bit video.

We're only now seeing the final maturation of LCD technology utilizing 8-bit panels. I'm ready to see what can be done with 10-bit media and 10-bit panels.

That would be an interesting direction - I'd certainly love to see displays (and movies) with increased dynamic range.

That's a much more interesting direction than stereoscopic 3D.

But in terms of consumer technology, I'd really like to see movies liberated from physical disks.
And if I never see a half-assed DVD navigation menu again, it will be too soon.

C.
post #8 of 22
Quote:
Originally Posted by infinitespecter View Post

You're assuming that 4K will be the next big thing in home cinema. I'd put money on the table that it won't. You'd literally have to have a 100" or greater screen (with a seating distance of less than 5 feet) to see any benefit over 1080p (which is nearly 2K). 4K offers zero benefit for home theaters.

It does not really matter what the screen size is, because you just sit closer to a smaller screen until it occupies 30 degrees of your vision. The only exception would be screens so small that they sit closer than the minimum focal distance of your eye, which gets longer as you age.

20/20 vision can discern line pairs 1/60th of a degree apart, which corresponds to 120 pixels per degree (one pixel for the line, one for the space between lines - a total horizontal resolution of 3600 pixels across 30 degrees of vision). So you could probably notice the difference between 1920x1080 and 4000x2000 no matter how big your screen was, if you sat close enough to get a 30-degree view. Higher than 4000x2000 would be a total waste unless we get bionic eyes, though.
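The numbers in that argument check out - a sketch of the same acuity math:

```python
line_pairs_per_deg = 60              # 20/20 vision: line pairs 1/60 degree apart
px_per_deg = 2 * line_pairs_per_deg  # one pixel for the line, one for the gap
fov_deg = 30                         # screen filling 30 degrees of vision
needed_px = px_per_deg * fov_deg
print(needed_px)  # 3600 -- between 1920 (1080p) and ~4000 (4K) horizontal pixels
```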

After we get to 4000x2000, the real improvements will be in higher frame rates and fewer compression artifacts, until we get to 200 fps or so - then we are totally done, pending the holodeck. Once we have a holodeck version of Chuck, I am never leaving the house.
post #9 of 22
Quote:
Originally Posted by e1618978 View Post


After we get to 4000x2000, the real improvements will be in higher frame rates and fewer compression artifacts, until we get to 200 fps or so - then we are totally done, pending the holodeck. Once we have a holodeck version of Chuck, I am never leaving the house.

It's debatable whether people want higher frame rates. There's some kind of magic to 24 fps movies. I don't know if it's because everyone is used to seeing movies at 24 fps, but seeing more fluid motion in movies is kinda creepy.

I'm sure it's just psychological and that once people get used to a more fluid 200 fps motion, they will find 24 fps choppy and weird.

The question is: is Red-Ray just a bunch of hyperbole? It seems almost too good to be true... and you know what they say about things that are too good to be true.
post #10 of 22
Hasn't it long been established that 66 fps is the sweet spot for moving pictures?

Why add to the data count with 200 fps?
post #11 of 22
Thread Starter 
Quote:
Originally Posted by kim kap sol View Post

It's debatable whether people want higher frame rates. There's some kind of magic to 24 fps movies. I don't know if it's because everyone is used to seeing movies at 24 fps, but seeing more fluid motion in movies is kinda creepy.

I think higher frame rates are tolerable for sports - but I hate drama at higher frame rates. I'm not sure if it's just conditioning or something more. For a long time, TV drama was shot on video, and so I associate smooth motion with cheap.

My TV has a feature to interpolate frames - to get smoother motion. The result is that every Hollywood movie looks like a movie of the week.

Quote:
The question is, is Red-Ray just a bunch of hyperbole? Seems almost too good to be true...and you know what they say about things that are too good to be true.

Red promised a camera - and then went and delivered a camera. So they have some credibility.

Red Ray is not essential to their line-up. My guess is that in the course of developing the (Epic/Scarlet) camera they needed some custom chips. Red Ray is just something you can easily build if those chips are at hand.

C.
post #12 of 22
Quote:
Originally Posted by Walter Slocombe View Post

Hasn't it long been established that 66 fps is the sweet spot for moving pictures?

Why add to the data count with 200 fps?

For fluid slow motion?
post #13 of 22
Quote:
Originally Posted by kim kap sol View Post

For fluid slow motion?

Yes... hell, they have cameras that do much higher than 200 fps to capture
hummingbirds in flight, and being able to overcrank is a must for high-quality
slow motion. I think our eyes are more sensitive to motion than we give them credit for.

Every time I walk into Fry's here in WA, I see a crowd around the new Samsung Luxia LED TV. Not only is the contrast eye-popping, but it runs at 120 or 240 Hz depending on the model. You can see it in the picture... there's a smoothness that is evident and, to my eyes, kind of shocking. I think watching this TV and others at 120/240 Hz and then going back to a 60 Hz TV will make the 60 Hz set feel "jerky" by comparison.
post #14 of 22
Quote:
Originally Posted by kim kap sol View Post

It's debatable whether people want higher frame rates. There's some kind of magic to 24 fps movies. I don't know if it's because everyone is used to seeing movies at 24 fps, but seeing more fluid motion in movies is kinda creepy.

I think that 24 fps is way too slow - I watched an IMAX movie once that had a girl doing Irish dancing, and she blurred out so much that you could no longer see her feet or face clearly. Where I differ is on film noise: I actually prefer to watch a movie that has the ticks and scratches you see on film and not on digital projection.

I don't miss the "stop and melt" that happened all the time when I was a kid, though.
post #15 of 22
Quote:
Originally Posted by e1618978 View Post

I think that 24 fps is way too slow - I watched an IMAX movie once that had a girl doing Irish dancing, and she blurred out so much that you could no longer see her feet or face clearly. Where I differ is on film noise: I actually prefer to watch a movie that has the ticks and scratches you see on film and not on digital projection.

I don't miss the "stop and melt" that happened all the time when I was a kid, though.

24p, or rather the stubborn adherence to it, is quickly becoming an American pathology. Frankly, 24p doesn't make something "filmic" to me, nor do I feel it's "dreamy". I think what best enhances a film is good cinematography and depth of field (bokeh).

I won't lie: I get tired of the blurred pans and judder inherent to 24p.
post #16 of 22
Quote:
Originally Posted by hmurchison View Post

I'm assuming that this technology could probably do 1080p video at 5mbps datarate and look fantastic.

Be careful with that assumption. All that their being able to stuff a compressed data stream into 10 megabits really means is that, to the human brain, there is approximately 10 megabits per second of useful visual information in that stream.

It may be that the Blu-Ray codec, given 25% or so more data rate, would look just as good. Or that giving the Red-Ray codec additional bandwidth doesn't make it look better. Or that its primary 'advantage' over Blu-Ray is simply throwing away the higher-resolution data. And this is all contingent on content and viewing distance - how much bitrate would be required to maintain that quality if the viewer were closer?

I don't know which of these statements is true, but all of them are more likely than being able to magically compress an HD stream into 5 megabits without loss of fidelity. I can't absolutely rule the latter out, but codecs aren't magic, and the existing ones are already quite good, so a large improvement isn't to be expected. There is some real amount of information that a codec must preserve for the observer at the end of the process, and even an ideal codec cannot compress below that amount without compromising visual quality.
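One way to see how aggressive the Red-Ray claim is: compare average bits spent per pixel. (A sketch only - the demo's exact frame size and rate weren't stated, so 4096x2160 at 24 fps is an assumption, as is using Blu-Ray's 40 Mbps maximum video bitrate for the comparison.)

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average coded bits spent on each displayed pixel per frame."""
    return bitrate_bps / (width * height * fps)

red = bits_per_pixel(10e6, 4096, 2160, 24)  # assumed Red-Ray demo geometry
blu = bits_per_pixel(40e6, 1920, 1080, 24)  # Blu-Ray at its maximum bitrate
print(round(red, 3), round(blu, 3))         # 0.047 0.804 -- ~17x fewer bits/pixel
```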
post #17 of 22
Quote:
Originally Posted by hmurchison View Post

We're only seeing the final maturation of LCD technology utilizing 8-bit panels. I'm ready to see what can be done with 10-bit media and 10-bit panels.

This is not correct. Sony (edit: see this page; note the publishing date: April 2007) and Samsung (and probably Sharp too; I'll check... edit: found this) have been using 10-bit panels for a couple of years already. Not sure why, given that there's no 10-bit material to feed them with, but there you go.
post #18 of 22
For anything other than sports, 100 Hz and 200 Hz modes are absolute rubbish, and I have no bloody idea why people watch movies or drama with them. It's weird, artificial, and gimmicky, IMO.

Quote:
Originally Posted by Carniphage View Post

I think higher frame rates are tolerable for sports - but I hate drama in higher frame rates. Not sure if it just conditioning or something more. For a long time TV drama was shot on video - And so I associate smooth motion with cheap.

My TV has a feature to interpolate frames - to get smoother motion. The result is that every Hollywood movie looks like a movie of the week.



post #19 of 22
Quote:
Originally Posted by Mr. H View Post

This is not correct. Sony (edit: see this page; note the publishing date: April 2007) and Samsung (and probably Sharp too; I'll check... edit: found this) have been using 10-bit panels for a couple of years already. Not sure why, given that there's no 10-bit material to feed them with, but there you go.

Excellente! There's plenty of 10-bit video in the prosumer/professional realm, but sadly no consumer playback devices that I know of. By all means, vendors: deliver high-quality 10-bit video monitors and we'll find a use for them.


Quote:
Originally Posted by nvidia2008 View Post

For anything other than sports, 100hz and 200hz is absolute rubbish and I have no bloody idea why people are watching movies or drama with it. It's weird, artificial, and gimmicky, IMO.

I think the idea of 120 Hz or 240 Hz scanning is to fix an inherent problem with LCD displays, which aren't quite as fast as CRT technology and thus ghost or blur fast-moving images.
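There's also a cadence reason for those particular refresh rates: 24 fps film divides evenly into 120 and 240 Hz but not into 60 Hz, which is why 60 Hz sets need uneven 3:2 pulldown. A quick check:

```python
FILM_FPS = 24
for refresh_hz in (60, 120, 240):
    repeats = refresh_hz / FILM_FPS
    cadence = "even cadence" if repeats.is_integer() else "3:2 pulldown judder"
    print(refresh_hz, repeats, cadence)
# 60  -> 2.5  (uneven: frames alternate between 3 and 2 refreshes)
# 120 -> 5.0  (every film frame held exactly 5 refreshes)
# 240 -> 10.0 (every film frame held exactly 10 refreshes)
```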
post #20 of 22
A few years ago I was seriously thinking about buying a TV. While that desire faded away, I was taken aback by all the marketing crap, lies, and half-truths associated with 1080p and 720p. It didn't matter which store I walked into or which screen I looked at: I could immediately tell which was the higher-resolution screen. Even today, large screens look pixelated to me.

Now the big joke was: hey, if you sit at X distance from the screen, you won't notice. Maybe for some that's the case, but it isn't for me. Large pixelated screens are just hard to view; I find them mentally tiring. Maybe there is more going on here than resolving pixels, but in the end I just gave up, as I couldn't see the value in the larger screens. Admittedly it has been a long time, and maybe the 1080p screens are better now, but my perception was that there were far too few pixels on these screens for acceptable viewing.

It might be worthwhile to study what people actually experience with these screens. I suspect too much of the received wisdom about what counts as acceptable TV resolution was made up to market the then state-of-the-art hardware.

Quote:
Originally Posted by e1618978 View Post

It does not really matter what the screen size is, because you just sit closer to a smaller screen until it occupies 30 degrees of your vision. The only exception would be screens so small that they sit closer than the minimum focal distance of your eye, which gets longer as you age.

20/20 vision can discern line pairs 1/60th of a degree apart, which corresponds to 120 pixels per degree (one pixel for the line, one for the space between lines - a total horizontal resolution of 3600 pixels across 30 degrees of vision). So you could probably notice the difference between 1920x1080 and 4000x2000 no matter how big your screen was, if you sat close enough to get a 30-degree view. Higher than 4000x2000 would be a total waste unless we get bionic eyes, though.

After we get to 4000x2000, the real improvements will be in higher frame rates and fewer compression artifacts, until we get to 200 fps or so - then we are totally done, pending the holodeck. Once we have a holodeck version of Chuck, I am never leaving the house.
post #21 of 22
Quote:
Originally Posted by wizard69 View Post

A few years ago I was seriously thinking about buying a TV. ...

So, wizard69 is into necroposting? Interesting.
post #22 of 22
Quote:
Originally Posted by Mr. Me View Post

So, wizard69 is into necroposting? Interesting.

There was a spambot that brought the thread back up (which I subsequently deleted). wizard69 must not have checked the original post's date is all.
