
Apple unveils redesigned, thinner iPhone 4 with two cameras - Page 9

post #321 of 508
That depends... how do you say "hijacked" in British?

Just kiddin, no worries from me, and glad to hear you are happy with the new dictionary!
post #322 of 508
Quote:
Originally Posted by melgross View Post

I'd like to see a 3:1 optical zoom and image stabilization, along with a stronger flash.

There were various apps that implemented it via the accelerometer, but I don't recall if Apple actually added it via the OS. Anyone know for sure?
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
post #323 of 508
Quote:
Originally Posted by melgross View Post

I'd like to see a 3:1 optical zoom and image stabilization, along with a stronger flash.

Mmm... image stabilization. FCS has that capability after the fact (post-processing). I have a video cam with special circuitry to do that.

Question: Shouldn't a [relatively powerful] general purpose CPU along with a GPU and some DSP and FFT APIs be able to handle that in real time? As part of the video capture/encoding process?

I would like to see an iPad version of FCS Motion.

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #324 of 508
Quote:
Originally Posted by mstone View Post

No it was your apparent lack of understanding of these things that was making your comments nonsensical.

Cut the snark and explain what you mean and why melgross is wrong.

Or move along. Because, at this point, you're just coming across as passive-aggressive and not particularly interested (or helpful) in advancing the discussion.
post #325 of 508
Quote:
Originally Posted by mstone View Post

When I go outside and look up at the sky, I never see any banding, but in many 24 bit digital photos, the sky shows banding because there aren't enough pixels on the screen to cover the colors required to reproduce the image accurately.

Wait, you don't see banding in the sky?
post #326 of 508
Quote:
Originally Posted by Fake_William_Shatner View Post

That does come to mind -- sure. But you'd need a super fast video compressor-decompressor and lose a lot of quality to MASTER video in 17 Gigs.

Apple's Intermediate CODEC is good for editing video. But more or less, I'd say that in REAL WORLD terms, you need about 3X the storage space of the Time of the video on a FINAL HD CODEC. When you layer graphics you've got the source, and then you've got the rendering of the layers in a much less compressed codec.

On top of that, you MIGHT be doing something more than just video editing. My 16 Gig iPod Touch is full to the brim, and that's without ANY video editing going on. So maybe I slim it down to 8 Gigs, and then add all the newer LARGER apps that will be designed for the new HD platform...

... So, by my rough estimate -- a REAL HD video will require about a Gig a Minute -- but for people doing that home movie who can't tell a compressed SD from an HD signal -- maybe they can do an hour with 18 Gigs of free space.

I have a Panasonic Lumix camera that records compressed 720p HD video with stereo. It will record 4 hours in 32 gigs of memory at its highest quality setting. It uses AVCHD Lite compression. I would assume Apple would have some comparable codec that would give similar results.
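To put rough numbers on that: 32 GB over four hours works out to roughly an 18 Mbps stream, which is in the ballpark of a high-quality 720p AVCHD Lite recording, and well below the "Gig a Minute" (roughly 130 Mbps) figure mentioned above for editing-grade HD. A quick sketch of the arithmetic:

```python
# Back-of-the-envelope bitrate implied by "4 hours in 32 gigs"
gigabytes = 32
hours = 4
bits = gigabytes * 8e9            # treating 1 GB as 10^9 bytes
seconds = hours * 3600
print(bits / seconds / 1e6)       # ~17.8 Mbps for the AVCHD Lite clip

# Compare with the "Gig a Minute" estimate for editing-grade HD above
print(1 * 8e9 / 60 / 1e6)         # ~133 Mbps
```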
post #327 of 508
Quote:
Originally Posted by cameronj View Post

Wait, you don't see banding in the sky?

Sky?

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #328 of 508
Quote:
Originally Posted by DJRumpy View Post

There were various apps that implemented it via the accelerometer, but I don't recall if Apple actually added it via the OS. Anyone know for sure?

Quote:
Originally Posted by Dick Applebaum View Post

Mmm... image stabilization. FCS has that capability after the fact (post-processing). I have a video cam with special circuitry to do that.

Question: Shouldn't a [relatively powerful] general purpose CPU along with a GPU and some DSP and FFT APIs be able to handle that in real time? As part of the video capture/encoding process?

I would like to see an iPad version of FCS Motion.

.

I haven't seen the apps that say they give stabilization. Any names I could try out?

I suppose they could do it in a number of ways. They first need a sensor with more pixels than they use for the image, so that they have enough around the edges to move the image around to counter the shake. After that, it's a matter of getting to the info before it's actually recorded so they can work with it. As long as they can get to the hardware, it would be OK. The big thing is the chip, though.
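To make that oversized-sensor scheme concrete, here is a minimal sketch; the motion offset is assumed to come from a gyro/accelerometer or from frame matching, and nothing here reflects Apple's actual implementation:

```python
import numpy as np

def stabilize_frame(sensor_frame, offset_xy, out_w, out_h):
    """Crop an output window from an oversized sensor frame, shifted
    opposite to the measured camera motion (a conceptual sketch only)."""
    h, w = sensor_frame.shape[:2]
    margin_x = (w - out_w) // 2
    margin_y = (h - out_h) // 2
    # Shift the crop against the motion, clamped to the spare border pixels
    dx = int(np.clip(-offset_xy[0], -margin_x, margin_x))
    dy = int(np.clip(-offset_xy[1], -margin_y, margin_y))
    x0 = margin_x + dx
    y0 = margin_y + dy
    return sensor_frame[y0:y0 + out_h, x0:x0 + out_w]
```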
post #329 of 508
Quote:
Originally Posted by melgross View Post

I haven't seen the apps that say they give stabilization. Any names I could try out?

I suppose they could do it in a number of ways. They first need a sensor with more pixels than they use for the image, so that they have enough around the edges to move the image around to counter the shake. After that, it's a matter of getting to the info before it's actually recorded so they can work with it. As long as they can get to the hardware, it would be OK. The big thing is the chip, though.

ProCamera comes to mind. Just found another called 'Night Camera'. They use the accelerometer to detect relatively 'quiet' moments when the phone isn't moving around much. I haven't tried any of them however.
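For what it's worth, the accelerometer trick is easy to sketch; read_accelerometer() below is a hypothetical stand-in for whatever motion API these apps actually call:

```python
import time

MOTION_THRESHOLD = 0.02   # assumed units of g; purely illustrative

def read_accelerometer():
    """Hypothetical: returns (x, y, z) acceleration from the device."""
    raise NotImplementedError

def wait_for_quiet_moment(window=8, timeout=2.0):
    # Trigger the shot when recent samples vary less than the threshold,
    # or give up after `timeout` seconds so the moment isn't missed entirely.
    samples, start = [], time.time()
    while time.time() - start < timeout:
        samples.append(read_accelerometer())
        samples = samples[-window:]
        if len(samples) == window:
            spread = max(max(axis) - min(axis) for axis in zip(*samples))
            if spread < MOTION_THRESHOLD:
                return True   # device is steady: capture now
        time.sleep(0.01)
    return False              # timed out: capture anyway or warn the user
```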
iMac 27" 2.8 Quad i7 / 24" Dual Core 3.06 / 17" Macbook Pro Unibody / Mac Mini HTPC / iPhone 4
post #330 of 508
Quote:
Originally Posted by DJRumpy View Post

ProCamera comes to mind. Just found another called 'Night Camera'. They use the accelerometer to detect relatively 'quiet' moments when the phone isn't moving around much. I haven't tried any of them however.

I just opened iTunes and typed "camera" into the app store. BOY! That's a lot of camera apps. I lost track of most of them long ago.

Night Camera uses the accelerometer to find when the camera is moving the least, and then takes the pic. Possibly useful. The problem is that it's not actually stabilizing the image, and you may miss the pic if it waits too long.

Pro Camera says "Anti-Shake", but doesn't say how it works, so I just popped for the big $1.99 to try it out. I'm syncing now.
post #331 of 508
Quote:
Originally Posted by Haggar View Post

Probably in the same place as those "Who cares about a flash with a camera phone?" naysayers. And the "Who cares about multitasking?" naysayers. And the "Who cares about copy and paste?" naysayers. But those naysayers will try to backtrack and claim that they never dismissed those features, and that they have always liked Apple to add them.

I cannot remember many people saying that these features were absolutely useless. Of course they are useful. It's really more a matter of their overall importance. Many of us argued that Apple would slowly evolve the iPhone to include these features, and that is exactly what they've done.

Quote:
Originally Posted by melgross View Post

As far as flash with a camera phone, well, it's marginally useful. I would have preferred if Apple used two diodes. One, even if it is the Philips one, isn't terribly strong. I would really have loved to see a ring flash using several.


I'm not that big a fan of flash photography, as most people don't really know how to use it properly. Often they end up washing out skin tones or providing too harsh a light on a person's face. All in all, it can easily make a bad picture.
post #332 of 508
Quote:
Originally Posted by melgross View Post

I haven't seen the apps that say they give stabilization. Any names I could try out?

I suppose they could do it in a number of ways. They first need a sensor with more pixels than they use for the image, so that they have enough around the edges to move the image around to counter the shake. After that, it's a matter of getting to the info before it's actually recorded so they can work with it. As long as they can get to the hardware, it would be OK. The big thing is the chip, though.

I wasn't clear... I don't know of any apps that provide image stabilization... I was asking if it's possible on an iPad with a video captured on an iPhone 4-- when both have iOS 4.

Here's a couple of quick and dirty iMac iMovie examples (not stabilized) that I want to do (or have one of the grandkids do) on the iPhone / iPad when we're out and about on those long Saturdays from Jul - Nov.

Be sure and choose the HD versions.

http://www.youtube.com/watch?v=cOtZ0fSfQPo


http://www.youtube.com/watch?v=OzAft7b7z4I

Some other vids are here:

http://www.youtube.com/dicklacara#p/u

IF iPhone & iPad can do this on site... its over!

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #333 of 508
Quote:
Originally Posted by TenoBell View Post


I'm not that big a fan of flash photography, as most people don't really know how to use it properly. Often they end up washing out skin tones or providing too harsh a light on a person's face. All in all, it can easily make a bad picture.

No argument there. It's just that between the poor sensitivity of these tiny sensors and the really slow lenses they're coupled with, a flash will help. They're really just intended for a portrait shot at close distance. Anything further away than about three feet won't get much benefit anyway. I'm assuming that the flash is automatic, or that the software will compensate for the distance. Two LEDs will give maybe 4.5 feet of effective light. It's better than nothing with these things.
post #334 of 508
Please tell me iOS 4 finally has a DELETE ALL EMAIL (at once) button. It's 2010 and we still have to manually select 200 messages one by one to delete them?
post #335 of 508
Quote:
Originally Posted by Dick Applebaum View Post

I wasn't clear... I don't know of any apps that provide image stabilization... I was asking if it's possible on an iPad with a video captured on an iPhone 4-- when both have iOS 4.

Here's a couple of quick and dirty iMac iMovie examples (not stabilized) that I want to do (or have one of the grandkids do) on the iPhone / iPad when we're out and about on those long Saturdays from Jul - Nov.

Be sure and choose the HD versions.

http://www.youtube.com/watch?v=cOtZ0fSfQPo


http://www.youtube.com/watch?v=OzAft7b7z4I

Some other vids are here:

http://www.youtube.com/dicklacara#p/u

IF iPhone & iPad can do this on site... its over!

.

They were interesting, but I was having some problems with them, and I don't know if it was from my end or not. They were jerky at times, but not the kind of jerkiness that comes from lack of stabilization. At times, they were rock solid. But some looked to be shot in slo-mo.

As for the question about anti-shake, stabilization, or whatever we call it on an iPad: the answer is that I don't know yet. Maybe some of the guys here who are developers and who have seen the new OS may have some idea. But, while there are schemes to post-process video so as to eliminate shake from the exposure, it's fairly sophisticated stuff.

What is done is to crop slightly so that the image can be moved around so as to eliminate the shake. The frame is moved up or down by whatever amount is needed, and sometimes sideways too. But it requires a fair amount of processing power and sophistication from the software. It has to know what's shake, and what's movement, and separate out the differences, and correct for the proper errors. Often you can click on an object that shouldn't be moving, so that the software can compensate for it. But if it moves out of the frame, another has to be used, and there may be a jerk as the software moves from one reference to another.

I don't think this stuff will be seen on an iPad for some time.

It's really much easier to do it in the camera.
post #336 of 508
Quote:
Originally Posted by success View Post

Please tell me iOS 4 finally has a DELETE ALL EMAIL (at once) button. It's 2010 and we still have to manually select 200 messages one by one to delete them?

It does have a way to delete a thread.
post #337 of 508
I am very excited about the new iPhone 4. It is a substantial upgrade from my two-year-old iPhone 3G. I often read posts about how the Android phones are better than the iPhone, and sometimes it seems like a huge rivalry. I am glad that there is healthy competition in the current smartphone market. If Palm or Android phones had not come out with some nice features, we might still be missing "cut and paste" and other features that Apple has added in the last year or so. Apple is doing a great job of staying ahead of the competition. Sometimes a leader must feel the follower's breath on their neck to cause the leader to pick up the pace in a race. Imagine if the only other smartphones were BlackBerrys or WinMobile phones. Would the current iPhone be as impressive? Hard to say, but I like what I am seeing in the iPhone 4 and am personally thinking the next 19 days are going to move very slowly.

Mike :)
post #338 of 508
Quote:
Originally Posted by success View Post

Please tell me iOS 4 finally has a DELETE ALL EMAIL (at once) button. It's 2010 and we still have to manually select 200 messages one by one to delete them?

Noo... there's an iMac for that....

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #339 of 508
Quote:
Originally Posted by anantksundaram View Post

Cut the snark and explain what you mean and why melgross is wrong.

Or move along. Because, at this point, you're just coming across as passive-aggressive and not particularly interested (or helpful) in advancing the discussion.

Sorry, but when a guy who thinks he knows everything wants to argue about a subject that I have been involved with professionally for over 25 years... AND is so far off target that I can't even begin to start a dialog without hours of preparation in order to be instructive and not sound argumentative, I just have to pass.

Life is too short to drink bad coffee.

post #340 of 508
Quote:
Originally Posted by mstone View Post

Sorry, but when a guy who thinks he knows everything wants to argue about a subject that I have been involved with professionally for over 25 years... AND is so far off target that I can't even begin to start a dialog without hours of preparation in order to be instructive and not sound argumentative, I just have to pass.

Oh please. Don't think you're so smart. I've been doing this even longer than you have.

Either explain what you mean, or go away, and leave the rest of us alone.
post #341 of 508
Quote:
Originally Posted by melgross View Post

It does have a way to delete a thread.

Meaning 'Google-like' conversations? That's BS. Many people have been requesting this feature for ages. The reason it's needed is because many people use POP accounts. When you first set up a POP account on the iPhone, it imports, say, all the messages that account has, which could easily be 1,000 archived messages. So not only do you have to wait while the iPhone imports those 1,000 messages, but after you're done those messages sit on the iPhone. They aren't needed on the iPhone because you can view them on the desktop/webmail. So many of us manually delete those POP emails because we're just interested in the new messages that come in after we set up the account. Right now you have to go in and manually delete 1,000 messages. There is no delete-all button. There is a delete-all button for the messages in the trash box, but...
post #342 of 508
Quote:
Originally Posted by success View Post

Meaning 'Google-like' conversations? That's BS. Many people have been requesting this feature for ages. The reason it's needed is because many people use POP accounts. When you first set up a POP account on the iPhone, it imports, say, all the messages that account has, which could easily be 1,000 archived messages. So not only do you have to wait while the iPhone imports those 1,000 messages, but after you're done those messages sit on the iPhone. They aren't needed on the iPhone because you can view them on the desktop/webmail. So many of us manually delete those POP emails because we're just interested in the new messages that come in after we set up the account. Right now you have to go in and manually delete 1,000 messages. There is no delete-all button. There is a delete-all button for the messages in the trash box, but...

I just know what they talked about in the address given today. Don't blame me.
post #343 of 508
Quote:
Originally Posted by twoslick View Post

For someone who claims to teach this stuff, it honestly saddens me that you've tried to equate the cell density of the retina with the density of a display device held 10 to 12 inches away. Your nitpicking assumes you smashed the display up against your retina (of course, then you couldn't actually see the whole display). Your vision really does lose "resolution" the farther away an object is. Otherwise, we'd all be able to see the stripes on the American flag on the Moon.

Maybe you'd like to source some research that says the retina can pick out detail higher than 326 ppi at 10 inches away before making nonsensical arguments.

You know, I'm sorry... I forgot the old internet forum posting rule warning that when you mention you have an advanced education, it will always provoke a comment from some jealous, insecure hater trying to feel better about himself by insulting and criticizing whoever made the post.

This is all out there; a few minutes of looking and you could have saved yourself from looking like a complete asshole.

This is one of the first modern studies showing the limits of visual acuity.

Vernier acuity, crowding, and cortical magnification.
Levi, Klein, and Aitsebaomo, 1985.
Vision Research, Vol. 25, Issue 7, pp 963 - 977

It is considered a seminal paper in the field of vision research on visual acuity.

And to provide you with a source closer to your level:
"Eye, Human," in the 2006 Britannica.

Also, a quick search of Google Scholar will return literally tens of thousands of hits on human visual acuity. In fact, in the top five hits for "Vernier Acuity", is the first paper I mention above, plus others investigating and describing such visual acuity.

The fact is we are able to distinguish details down to 8 arcminutes at 16 feet. And just so you know, that is 40% of the size of a human cone photoreceptor.

Now STFU and go back to your troll cave.
post #344 of 508
Sorry if I missed it, but we're on page 9 here.

Did Apple confirm that iPhone 4 will be able to use the Bluetooth Keyboard?
I didn't see it anywhere on the specs page.
The evil that we fight is but the shadow of the evil that we do.
post #345 of 508
Quote:
Originally Posted by melgross View Post

They were interesting, but I was having some problems with them, and I don't know if it was from my end or not. They were jerky at times, but not the kind of jerkiness that comes from lack of stabilization.

Prolly at my end... cameraman (me) error. Again, I am usually at midfield... always jerking around looking for the ball and zooming in and out. One problem is the Panny has only an LCD viewfinder that is totally useless in the sun... so I use a tripod and dead reckoning. Several clips were slowed down to sync with the music. I could have spent more time, used FCS (and image stabilization), still frames instead of jerks.... But my goal was to examine 3-4 hours of footage every Sunday and have highlights published on Monday, for Tuesday practice... and to have a little fun.

Quote:


At times, they were rock solid. But some looked to be shot in slo mo.

Yeah, quick and dirty... one video clip, "Zack Takes Control", was about 7 seconds long... but Zack faked out the opposing player twice in those seven seconds... The only way I could figure to show this was to convert the vid to an image sequence at 30 FPS... then play that as a herky-jerky slide-show.

Quote:

As for the question about anti-shake, stabilization, or whatever we call it on an iPad: the answer is that I don't know yet. Maybe some of the guys here who are developers and who have seen the new OS may have some idea. But, while there are schemes to post-process video so as to eliminate shake from the exposure, it's fairly sophisticated stuff.

I am a developer... the 4.0 beta has all this stuff (DSP, etc.) that seems designed for video processing. Unfortunately, you can only use it on the iPhone at this time (iPad support for 4.0 is not available yet, even for developers).

I am not knowledgeable enough to write code for image stabilization... so I have to depend on others.

That's why I ask!

Quote:


What is done is to crop slightly so that the image can be moved around so as to eliminate the shake. The frame is moved up or down by whatever amount is needed, and sometimes sideways too. But it requires a fair amount of processing power and sophistication from the software. It has to know what's shake, and what's movement, and separate out the differences, and correct for the proper errors. Often you can click on an object that shouldn't be moving, so that the software can compensate for it. But if it moves out of the frame, another has to be used, and there may be a jerk as the software moves from one reference to another.

There is at least one iPhone app that can stitch a series of overlapping still images into a panorama... I am surprised by the results and the performance.

http://itunes.apple.com/us/app/autos...318944927?mt=8

It doesn't do real time, but the processing should be similar to stabilization, no?
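The core step shared by stitching and stabilization is estimating how far one frame is shifted relative to another. Here is a minimal sketch of that alignment step using FFT phase correlation (grayscale NumPy frames assumed; real apps add rotation, lens correction, and blending on top):

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the (dy, dx) translation between two same-sized grayscale
    frames via phase correlation: the kind of alignment step shared by
    panorama stitching and post-hoc stabilization (a sketch, not production code)."""
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.abs(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts
    shifts = [p - s if p > s // 2 else p for p, s in zip(peak, corr.shape)]
    return tuple(shifts)                     # (dy, dx) of frame_a relative to frame_b
```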

Quote:

I don't think this stuff will be seen on an iPad for some time.

It's really much easier to do it in the camera.

I am interested in why... is it easier to do in hardwired circuitry on the cam?

TIA

I am learning something!

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #346 of 508
I am curious about FaceTime. Did they say it was an open standard? I read that somewhere. Would that mean it could open up video calling to other phones? Right now, if it's just iPhone 4 to iPhone 4, that'd be of very limited usefulness.
post #347 of 508
I am actually liking the design. It almost reminds me of the Prada phone. It's not as "bling" as iPhones past. But I kinda like that. It seems more suitable for business. A more professional look, if you will.

Kinda looks like this:

http://i.imgur.com/DHP8p.jpg
post #348 of 508
Quote:
Originally Posted by melgross View Post

There's no such thing as the "RGB" gamut. RGB can have a small gamut, or a very large gamut. Seeing detail has nothing to do with the gamut per se.

Explain yourself out of this comment with links, documentation, evidence, or anything relevant, and maybe we have a discussion.

Life is too short to drink bad coffee.

post #349 of 508
Quote:
Originally Posted by MFago View Post

What is the angular resolution of the human eye, and what linear distance does this angle correspond to at 12" away? I believe Steve was claiming that it is 300 dpi.

I don't have the time to do a conversion, and it is also distance dependent, but we are capable of perceiving a difference in two lines as small as 8 arcminutes of angle. This is typically measured with the Snellen methodology which is usually at 12, 16, or 20 feet. This size (8 arcmin.) is about 40% the size of a cone receptor at 16 ft.

Quote:
Originally Posted by masternav View Post

Yeah - get the receptor thingy, BUT how does the visual center process all that data? We are talking perceptivity here, not receptivity; processing, not physical capacity. Again, don't get distracted by the label - consider the function - unless you are in fact a hopeless pedant. Go back to what the actual Apple website says about the display and actually watch what Steve Jobs says about it - not the translated, transformed, transliterated abomination that passes for the article here.

The vast majority of what the brain processes is done unconsciously. We don't have to wake up in the morning and say to ourselves, "I'm going to be extra vigilant today and pay more attention to what my brain is doing." The brain processes everything it takes in. Awareness is another thing entirely. You are making the mistake that many first-time Perception students make, which is assuming we're aware of everything being processed by the brain. We are not; in fact, we are only aware of less than about 1% of all perceptual activity performed by the brain.

Quote:
Originally Posted by melgross View Post

You certainly didn't understand what Jobs was saying. And what you are saying isn't entirely correct.

Actually I do, and much of what you say here demonstrates that you really don't understand anything about the optics of the eye or the physical relationship between locations in the visual field and how we perceive those objects.

Quote:
Originally Posted by melgross View Post

I've been in the photo industry since 1969, and have degrees in biology, as I assume you do, and 300 dpi, ppi, spi, etc., are standards for good reasons. Our visual acuity is limited. Under VERY special circumstances we can see lines, and dots that are finer than we can normally see. But that's under circumstances that are unusual. Under conditions of extreme contrast, we can see details we otherwise can't see. Those conditions don't normally exist for us when reading newspapers, magazines, and books. You might remember that he mentions, specifically, printed matter as the standard there.

My claim about the degree of human visual acuity is based on the standardized Snellen procedure used by every optometrist in the world. Such acuity is usually measured at distances of 12, 16, or 20 feet.

Yes, there are things that can be finer than the resolution of our visual acuity. But we can see things that are extremely small. As I replied to other posters above, details less than one half the size of a cone photoreceptor.

Quote:
Originally Posted by melgross View Post

The lower the contrast, the lower our ability to discriminate detail. This is pretty well known and understood.

See above.

Quote:
Originally Posted by melgross View Post

You are also making a major error in talking about the density of the retina. The retina is small. There's no point in saying that there are millions of rods and cones per inch. Do you know the size of the retina?

In addition to that small retina size is the fact that the iPhone screen is vastly larger than that retina. Can you figure out how many retinas would be needed to cover the iPhone screen? The point is that the retina has just so many sensing cells, and the iPhone screen has just so many pixels. Talking about the number of cells per inch is a useless statement because that small "sensor" is looking at a much larger screen.

In addition to that is the other fact that we aren't using more than a small part of our retina to image the iPhone screen, so we are just using a fraction of the cells in it. And, in addition to that, is the other fact that while the screen has the same sensitivity from edge to edge, our eye has very poor vision outside a fairly small area in the middle.

The situation isn't as clear as you make it out to be.

I can continue with this, but enough is enough.

No, I'm not making a mistake. Just look it up. The area of the retina has nothing to do with resolving power. By your argument, eagles, with retinae smaller than ours, should have poorer vision than ours. It is well documented that the visual acuity of birds of prey is superior to humans'. What you're saying here is that simply having a bigger retina increases acuity. That is simply false.

The only thing you say here that is even close to correct is that we have better acuity at the fovea than anywhere else in our field of vision. That is true, and my statements above are assuming foveal vision.

Other than that you do not know what you're talking about here.

Quote:
Originally Posted by cgc0202 View Post

Are you sure you are actually familiar with the biology of vision itself?
And, for that matter, the psychology -- what people see, and how they perceive what they saw?

Please do enlighten us with your expertise. Your initial lecture does not suggest you do.

Equating precision with the size of a retinal cell? But then again, if it cannot display anything, how can it be the arbiter of precision, and more important, perception?

I heard a team of scientists (one of them a professor at Harvard) won the Nobel prize for their seminal work on the biology of vision. You may want to brush up on their work and that of those who followed on their research. There is also a great body of work on the psychology of vision.

CGC

Yes, I am familiar with that work. I earned a PhD in Perceptual Psychology with an emphasis on visual attention, psychophysics, and physiological psychology at UC Santa Cruz, and I now teach Perception at another UC campus.

Sorry if not listing references here doesn't meet with your approval; I have a lot of other things to do in a day. I forgot the natural way people tend to respond on the internet when you say something simple such as "I have an advanced degree in ..."

I'm not going to waste my time here with these insults, or even with more politely worded posts such as yours. In an earlier response I referenced an excellent Britannica article that backs up my claims. I would also recommend, as starters, "Sensation and Perception," 5th ed., by Harvey Richard Schiffman; "Vision Science: Photons to Phenomenology" by Stephen Palmer; "Foundations of Vision" by Brian A. Wandell; and "Visual Perception: A Clinical Orientation" by Steven H. Schwartz. Any of these works, especially the first two, will be helpful. The latter two are much more technical, assume a background in psychology and neurophysiology, and also require first-year calculus to understand the mathematics of the computer modeling discussed.

I point other posters to these references, and the ones made in my previous reply here.
post #350 of 508
Quote:
Originally Posted by Dick Applebaum View Post

Prolly at my end... cameraman (me) error. Again, I am usually at midfield... always jerking around looking for the ball and zooming in and out. One problem is the Panny has only an LCD viewfinder that is totally useless in the sun... so I use a tripod and dead reckoning. Several clips were slowed down to sync with the music. I could have spent more time, used FCS (and image stabilization), still frames instead of jerks.... But my goal was to examine 3-4 hours of footage every Sunday and have highlights published on Monday, for Tuesday practice... and to have a little fun.



Yeah, quick and dirty... one video clip, "Zack Takes Control", was about 7 seconds long... but Zack faked out the opposing player twice in those seven seconds... The only way I could figure to show this was to convert the vid to an image sequence at 30 FPS... then play that as a herky-jerky slide-show.

Ok, that's good, what I saw was what you were doing.

Quote:
I am a developer... the 4.0 beta has all this stuff (DSP, etc.) that seems designed for video processing. Unfortunately, you can only use it on the iPhone at this time (iPad support for 4.0 is not available yet, even for developers).

I am not knowledgeable enough to write code for image stabilization... so I have to depend on others.

That's why I ask!



There is at least one iPhone app that can stitch a series of overlapping still images into a panorama... I am surprised by the results and the performance.

http://itunes.apple.com/us/app/autos...318944927?mt=8

It doesn't do real time, but the processing should be similar to stabilization, no?



I am interested in why... is it easier to do in hardwired circuitry on the cam?

TIA

I am learning something!

.

Stitching does do something similar to what stabilization software would do in post. It must straighten out the images so that they can be combined. That's some of it.

I can use an example from the real world. A magnet has field lines: force that is there because of the properties of the elements in the magnet. They just are. But in order to calculate those force lines, we must devise, and then use, the math. That's more complex. When we fall, gravity pulls us down at a steady rate depending on the mass of the planet. It just happens. It takes calculus to figure out the actual speeds that occur.

With in-camera image stabilization, the sensor tells the software where the movement is, and the software can move the image around to compensate fairly easily within the larger sensor area.

When we already have the shaky video, it's different. Then the object that must remain positioned in the same place, say, the left upright of the goal, must usually be selected. A point there must be measured by the software from one side and the top or bottom of the frame. Then the frame must be cropped so as to keep that point in the same place. If the camera points downward, then the top must be cropped. If the camera points upward, then the bottom must be cropped. Same thing with side to side shake. Each frame would have a different amount cropped out. This must be done field by field, or frame by frame. That takes a bit of work that doesn't have to be done in-camera.

One of the problems is cropping the right amount, as every frame must be the same size. I've never seen software do this in a way that I was happy with. It doesn't work well for a lot of shake.
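Here is a minimal sketch of that post-process cropping, assuming the anchor point (say, the left upright of the goal) has already been tracked in every frame; the tracking itself is the compute-heavy, error-prone part described above:

```python
import numpy as np

def crop_to_anchor(frames, anchor_points, margin):
    """Given per-frame (x, y) positions of an object that should stay put,
    crop each frame so that point lands in the same place.
    All output frames end up the same size (a conceptual sketch only)."""
    ref_x, ref_y = anchor_points[0]
    h, w = frames[0].shape[:2]
    out_w, out_h = w - 2 * margin, h - 2 * margin
    stabilized = []
    for frame, (x, y) in zip(frames, anchor_points):
        # Shift the crop window by however far the anchor has drifted,
        # clamped so we never read outside the frame.
        dx = int(np.clip(x - ref_x, -margin, margin))
        dy = int(np.clip(y - ref_y, -margin, margin))
        x0, y0 = margin + dx, margin + dy
        stabilized.append(frame[y0:y0 + out_h, x0:x0 + out_w])
    return stabilized
```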
post #351 of 508
Quote:
Originally Posted by mstone View Post

Explain yourself out of this comment with links, documentation, evidence, or anything relevant, and maybe we have a discussion.

You're the one who has to do that here. I already explained the situation. You've explained nothing. I don't even know what you disagree with, as you've declined the opportunity to explain it.

Are you disagreeing with what a pixel is? Are you disagreeing with what a sub-pixel is? I don't know.

Are you disagreeing that you can't simply say "RGB gamut," as opposed to sRGB gamut, or Adobe RGB (1998) gamut, or ProPhoto RGB gamut, to give a few examples?

I don't know. You refuse to explain yourself.
post #352 of 508
I haven't read a single post in this thread, but will go back and do so, so this may already have been mentioned.

On the Apple.com website, it reads:

"
While most phones have only one microphone, iPhone 4 has two. The main mic, located on the bottom next to the speakers, is for phone and FaceTime calls, voice commands, and memos. The second mic, built into the top near the headphone jack, is for making your phone and video calls better. It works with the main mic to suppress unwanted and distracting background sounds, such as music and loud conversations. This dual-mic noise suppression helps make every conversation a quiet one."

Notice it says the main mic is placed near the "speakers".

Does anyone know if another speaker has been added, either near where the mic is, or internally similar to the way the iPod touch has it?

Also, has anyone found out how much RAM it has?

Cheers!

Greg
post #353 of 508
Quote:
Originally Posted by melgross View Post

You're the one who has to do that here. I already explained the situation. You've explained nothing. I don't even know what you disagree with, as you've declined the opportunity to explain it.

Are you disagreeing with what a pixel is? Are you disagreeing with what a sub-pixel is? I don't know.

Are you disagreeing that you can't simply say "RGB gamut," as opposed to sRGB gamut, or Adobe RGB (1998) gamut, or ProPhoto RGB gamut, to give a few examples?

I don't know. You refuse to explain yourself.

it really doesn't have to be any more complicated than 255^3

Life is too short to drink bad coffee.

post #354 of 508
Am I the last person to realize the iPhone 4 has 5.8 Mbps HSUPA? How did I miss this all day?
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #355 of 508
Quote:
Originally Posted by melgross View Post

Ok, that's good, what I saw was what you were doing.



Stitching does do something similar to what stabilization software would do in post. It must straighten out the images so that they can be combined. That's some of it.

I can use an example from the real world. A magnet has field lines: force that is there because of the properties of the elements in the magnet. They just are. But in order to calculate those force lines, we must devise, and then use, the math. That's more complex. When we fall, gravity pulls us down at a steady rate depending on the mass of the planet. It just happens. It takes calculus to figure out the actual speeds that occur.

With in-camera image stabilization, the sensor tells the software where the movement is, and the software can move the image around to compensate fairly easily within the larger sensor area.

When we already have the shaky video, it's different. Then the object that must remain positioned in the same place, say, the left upright of the goal, must usually be selected. A point there must be measured by the software from one side and the top or bottom of the frame. Then the frame must be cropped so as to keep that point in the same place. If the camera points downward, then the top must be cropped. If the camera points upward, then the bottom must be cropped. Same thing with side to side shake. Each frame would have a different amount cropped out. This must be done field by field, or frame by frame. That takes a bit of work that doesn't have to be done in-camera.

One of the problems is cropping the right amount, as every frame must be the same size. I've never seen software do this in a way that I was happy with. It doesn't work well for a lot of shake.


Thank you! I understand... "steady as you go" is a lot easier than first defining what "steady" and "go" mean, then trying to adjust some frames to match those definitions.

I am installing the SDK with the 4.0 GM.

It's past my bedtime, here, in CA... must be way past yours.

Long day... let's see what tomorrow brings!

I appreciate the discussion on this thread.

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -
post #356 of 508
This whole argument/discussion started because SJ said that the new iPhone's display equaled human vision, as if nothing could ever surpass the quality of the iPhone in practical terms, which was clearly just marketing speak. Now we are simply debating the technical accuracy of that statement, which is irrelevant to the actual quality of the screen compared to any other product currently on the market. iPhone wins for now. No argument there.

Life is too short to drink bad coffee.

post #357 of 508
Quote:
Originally Posted by Bowser View Post

Actually I do, and much of what you say here demonstrates that you really don't understand anything about the optics of the eye or the physical relationship between locations in the visual field and how we perceive those objects.

I have to differ. You can discuss perception, and that's fine. But what conditions are you relating this to? What contrast levels are you using for your statements? I do understand this, and I know very well that at about 12", at 300 dpi or ppi, the normal human eye can't distinguish individual lines or dots unless the contrast is very high. It's generally accepted that 20/20 vision allows about a one-minute-wide line to be observed in a high-contrast line pair.

Are you saying differently?

Quote:
My claim about the degree of human visual acuity is based on the standardized Snellen procedure used by every optometrist in the world. Such acuity is usually measured at distances of 12, 16, or 20 feet.

I think we are all familiar with the Snellen charts. At 20 feet it's similar to infinity, and we can see (if 20/20) about a 1.75mm line.

I understand that 20/20 isn't really the "normal" eye, just the standard he set up.

But even the Snellen charts aren't always accurate. Different lighting levels in the office can determine how well patients see them. My doctor agrees with that. In addition, they now very often use computers projecting letters onto an LCD. Some have higher contrast than others. We also went through this at my doctor's office, where he has several.

Quote:
Yes, there are things that can be finer than the resolution of our visual acuity. But we can see things that are extremely small. As I replied to other posters above, details less than one half the size of a cone photoreceptor.

Theta is 1/60th of a degree, as I've said. We can see a single line of a line pair at 1/2 theta. It's the line pairs that are normally spoken about. From what I remember, at about 12" we can see black and white lines just a bit bigger than 0.0035 inch. Correct me if that's wrong.
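For what it's worth, a quick back-of-the-envelope check of that figure, assuming only a 12" viewing distance and one arcminute of acuity:

```python
import math

viewing_distance_in = 12.0
one_arcminute_rad = math.radians(1.0 / 60.0)

# Linear size subtended by one arcminute at 12 inches
line_width_in = viewing_distance_in * math.tan(one_arcminute_rad)
print(round(line_width_in, 4))        # ~0.0035 inch, matching the figure above
print(round(1.0 / line_width_in))     # ~286 per inch, in the neighborhood of 300 dpi / 326 ppi
```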

Quote:
No, I'm not making a mistake. Just look it up. The area of the retina has nothing to do with resolving power. By your argument, eagles, with retinae smaller than ours, should have poorer vision than ours. It is well documented that the visual acuity of birds of prey is superior to humans'. What you're saying here is that simply having a bigger retina increases acuity. That is simply false.

You misunderstood what I was saying. I'm not saying that having a bigger retina would give us better vision. I didn't say that once. I'm also not comparing our retinas with those of an eagle, or a fish, for that matter. What I'm saying is that the image of the screen will take up just a small part of the image on the retina, depending on the distance. It will just be using a portion of those rods and cones. The fact that there are so many per inch is fine, but it will be just a fraction of an inch on the retina that the screen will be impinging upon. Compare that with the size of the screen itself and the number of pixels, and just as importantly, the number of sub-pixels, and we get to a point where the eye can't resolve more information than what is on the screen. Don't forget that each pixel is made up of three sub-pixels, one of each color. Those sub-pixels are smaller than the entire pixel. So as we consider that the screen is 326 ppi, that's for entire pixels.

Quote:
The only thing you say here that is even close to correct is that we have better acuity at the fovea than anywhere else in our field of vision. That is true, and my statements above are assuming foveal vision.

Other than that you do not know what you're talking about here.

Well, I admit that I got off the point somewhat, and didn't state it as well as I should have. But the point is that we are only using a small part of the retina, and only a fraction of the rods and cones. Considering that we lose acuity quickly as contrast goes down (even the Levi paper shows that), we have to know what contrast levels we are talking about. This is well known photographically, which is why lenses are characterized at differing contrast levels.

You're making it out to seem as though we can see a particular amount of detail under any condition. That's one of the things I disagree about.
post #358 of 508
Quote:
Originally Posted by mstone View Post

it really doesn't have to be any more complicated than 255^3

Yes, but you know that that's too general. And you're talking about 8-bit, if you're indicating what I think you are.
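For reference, the count of encodable values is set by bit depth alone and says nothing about which real-world colors those values map onto (that's the gamut); a small illustration of the distinction:

```python
# 24-bit color: 8 bits per channel -> number of encodable RGB triplets
values_per_channel = 2 ** 8          # 256 levels per channel (0-255)
total_values = values_per_channel ** 3
print(total_values)                  # 16,777,216

# The same 16.7M code values can be mapped onto a small gamut (e.g. sRGB)
# or a much larger one (e.g. Adobe RGB, ProPhoto RGB); with a wide gamut the
# steps between adjacent values are bigger, which is one place visible
# banding in smooth gradients like a sky can come from.
```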
post #359 of 508
Quote:
Originally Posted by mstone View Post

This whole argument/discussion started because SJ said that the new iPhone's display equaled human vision, as if nothing could ever surpass the quality of the iPhone in practical terms, which was clearly just marketing speak. Now we are simply debating the technical accuracy of that statement, which is irrelevant to the actual quality of the screen compared to any other product currently on the market. iPhone wins for now. No argument there.

I really don't think Jobs meant that nothing could ever surpass this in terms of usefulness (the screen, that is).

But several of us are having an argument, where the argument seems to revolve around the idea that we see black and white lines, or dots, only. But that's not true. Even text on a screen is continuous tone, because there is bleed from one pixel to the next. In continuous tone images, particularly color images, it's much more difficult to pick out a line at the edge of visual acuity.

While one person here seems to think that the phone's resolution is too coarse for the human eye to perceive it as continuous tone, industry standards in the photo industry, which certainly has plenty of experience in this area, say otherwise for the average person. I can second that from my own experience in that industry since 1969.
post #360 of 508
Quote:
Originally Posted by mstone View Post

This whole argument/discussion started because SJ said that the new iPhone's display equaled human vision, as if nothing could ever surpass the quality of the iPhone in practical terms, which was clearly just marketing speak. Now we are simply debating the technical accuracy of that statement, which is irrelevant to the actual quality of the screen compared to any other product currently on the market. iPhone wins for now. No argument there.

Yes! I was watching a live blog (the keynote video is not yet available).

SJ said something like: the iPhone 4 screen had a higher resolution than the human eye could detect (under "normal" conditions)... whatever those are.

So, in typical market-speak they coined a phrase to describe the capability: "Retinal Display", or somesuch...

We can't take these things too literally! After all, Ringling Bros., Barnum & Bailey wasn't really "the Greatest Show On Earth!"

Give the Ringmaster his due!

.
"Swift generally gets you to the right way much quicker." - auxio -

"The perfect [birth]day -- A little playtime, a good poop, and a long nap." - Tomato Greeting Cards -