The last thing I want to do is shove a smartphone against my eyeball every time I want to use the smartphone. I am sure Apple considered and discarded the idea.
If the implementation were unobtrusive and it actually worked, it's another option, I suppose. But that would mean capturing enough iris detail at a normal viewing distance (maybe 18 inches or so?) to give a secure pattern ID for unlocking, and that strikes me as a significant challenge. Basically the thing is going to have to grab an image of my entire face, locate an eye and then zoom in for the ID details.... tough nut to crack in something stuffed into a phone.
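The pipeline described above (grab the face, locate an eye, zoom in) also hits a raw resolution limit before any recognition even starts. A back-of-the-envelope sketch, with assumed numbers: an 8 MP sensor (~3264 px wide), a ~60° horizontal field of view, a ~12 mm iris, and the 18-inch (~457 mm) viewing distance mentioned above. All of these figures are illustrative, not specs of any actual phone:

```python
import math

def iris_pixels(sensor_px_width, fov_deg, distance_mm, iris_mm=12.0):
    """Pixels spanning the iris at a given distance (simple pinhole model)."""
    # Width of the scene captured at that distance
    field_mm = 2 * distance_mm * math.tan(math.radians(fov_deg / 2))
    # Fraction of the frame the iris occupies, in pixels
    return sensor_px_width * iris_mm / field_mm

# 8 MP sensor, ~60 deg FOV, phone held at ~18 in (457 mm)
px = iris_pixels(3264, 60.0, 457.0)
print(f"{px:.0f} px across the iris")  # roughly 74 px
```

Iris-recognition literature generally wants on the order of 100–200 pixels across the iris, so at arm's length an ordinary front camera comes up short; hence the "zoom in for the ID details" step, or dedicated optics.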
No, you don't have to touch it, but you do have to wave your finger over the screen of the phone without touching it to take/reject the call using Air Gesture... yep.
I'm trying to understand your point in context of either my message or the one I was responding to. On a Samsung phone, which I don't have, can't you just answer the phone while driving using one of three (easier) techniques: (1) press a hands-free button on your steering wheel using Bluetooth to answer the call, or (if your car is not so equipped) (2) answer by just raising the phone to your ear against your cheek, or (if your phone is not so equipped) just (3) press a button somewhere on your phone without looking at it and raise it to your ear?
If any of these three capabilities exist on your phone, then the person's comment that I was responding to made no sense. If none of these capabilities exist on your phone, then your phone is lacking basic functionality. If (worse) your phone wants to be so cool that the only way you can answer a call is by waving a finger over it with Air Gesture, then your phone is too stupid for words, bypassing basic functionality in favor of a "wow feature" that presents trouble in ordinary situations.
@jfc1138 For now, in some applications, the Galaxy Note 3 already detects eye movement and position and scrolls the page up or down as needed. That's some of the work already done, and it does this in all lighting conditions.
Thanks, I did expect the real challenge would be the level of detail that needs to be captured.
Having it unlock when you looked at it would be amusing...
I don't believe for even one second that this will be present in their next Galaxy phone.
1- Samsung is incapable of doing something like that. You'd need a very high resolution camera capable of quickly performing a complex analysis. And it would need a light source or it wouldn't work in low light, which takes me to
2- it's totally impractical. People don't always stare at their phone when they unlock it. They may have glasses, hair in front of their eyes, they may be looking somewhere else. They may just want to unlock it to hand the phone to someone.
Another terrible idea. I'm pretty sure that not even Samsung will dare show something poised to fail like that.
The take home message from this is that Samsung mobile executives are so desperate to emulate the Apple rumour mill that they need to seed it themselves.
Right now my phone is sitting on my desk next to my keyboard.
It's within arm's reach... but nowhere near my head. And it's certainly not in a suitable location to scan my iris.
How would Samsung handle that situation? Would I have to pick up my phone in order for it to scan my eyes? I'm sure I could put in my passcode... but that's exactly what I'd be trying to avoid.
Meanwhile... a gentle press-n-hold on the home button of my iPhone 5S and it opens lickety-split... as it remains on my desk.
Maybe you bump it against your eyeball like S Beam.
@jfc1138 I should have known: you're used to stealing technology, so you know nothing about the creation process.
A competitor is developing a new technology, so it has to start from one point and let the process evolve until the goal is attained. What we see on the Galaxy Note is a creation in progress at one stage of development. In the end you can steal it, add it to an iPhone and call it innovation.
Well, they need to do something. But Apple's cornered the tech on the finger print scanner. And although iris scanning is a possibility, it might be too cumbersome. That Samsung exec's comments are just sooo... trying hard to make it seem like something big and relevant.
I wear glasses and it is a useless gimmick on my Galaxy S4, it didn't take long to turn that sh*t off.
That was an agreement. Try a different translation app? Or ask Dennis Rodman for help.
Did you try taking it back to Apple? Maybe you've got a faulty unit.
Oh, and if there's someone from MacRumors reading this, let me tell you for once: **** your website that uses Samsung fans as moderators.
We've got a real comedian here.
Back it up and take it back to an Apple store, you obviously have a faulty phone and they will replace it on the spot.
Mine works well with five fingers, left and right thumbs and index fingers, right pinky for when I'm eating a hamburger.
The only time I've had issues is when my fingers are wet.
Quote:
I am including an excerpt of the article ("Closing the Door on Iris Recognition Vulnerabilities") from http://www.afcea.org/content/?q=node/11607.
Interesting read, but read carefully it looks like it requires an additional light source to induce dilation of the pupil (flickering light, pen light, etc.).
Now wouldn't it be amusing if each time you wanted to unlock the phone it caused a flash in your eyes? It'd be flashy for sure :-D
Not sure exactly how the eye works, but it might be possible to detect smaller levels of movement, I guess.
Another thing I wonder is whether it would work consistently across different lighting levels, particularly low light. Most of these iris recognition technologies are used in relatively bright areas with consistent lighting (airports, security checkpoints, etc.). New algorithms, and possibly new technologies, would have to be developed to adapt to a less controlled situation such as that experienced by a typical phone user.
If I were Apple I would probably not go with iris recognition yet for unlocking the phone, but would rather look to build some type of live-tissue recognition into the TouchID process. While the RF sensing electrode would in theory permit imaging of the live tissue just below the surface, maybe they decided not to incorporate it yet due to too many inaccurate denials. If they could overcome this, then the security strength of TouchID would increase dramatically.
Thinking a bit more pie-in-the-sky, I wonder if there is any way to increase the security of the iPhone with the iWatch. Maybe some type of BLE connection that causes the iPhone to require password access if the iWatch is not nearby. On a side note, I would love it if the rumoured iWatch also had some type of "locate my iPhone" feature built in for when I forget where I put the bloody thing in my own house...
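That BLE idea boils down to a simple policy: demand the passcode whenever the watch link is absent, weak, or stale. A minimal sketch of that logic, with every name (`WatchLink`, `require_passcode`) and threshold (RSSI floor of -75 dBm, 30 s timeout) invented here for illustration, not anything Apple has announced:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WatchLink:
    """State of a hypothetical BLE link to the watch."""
    rssi_dbm: Optional[int]   # None means no connection at all
    last_seen_s: float        # timestamp of the last advertisement heard

def require_passcode(link: WatchLink, now_s: float,
                     rssi_floor: int = -75, timeout_s: float = 30.0) -> bool:
    """Fall back to the passcode when the watch is absent, far away
    (weak signal), or hasn't been heard from recently."""
    if link.rssi_dbm is None:
        return True                                  # watch not connected
    if link.rssi_dbm < rssi_floor:
        return True                                  # signal too weak: probably not nearby
    return (now_s - link.last_seen_s) > timeout_s    # link gone stale

# Watch on the wrist, heard 2 s ago with a strong signal: no passcode needed.
print(require_passcode(WatchLink(rssi_dbm=-50, last_seen_s=98.0), now_s=100.0))  # False
# Watch left in another room: weak signal, passcode required.
print(require_passcode(WatchLink(rssi_dbm=-90, last_seen_s=98.0), now_s=100.0))  # True
```

RSSI is a noisy proxy for distance, so a real implementation would smooth it over several readings; but as a fallback policy (weaker signal just means you type the passcode more often) the failure mode is merely inconvenient, not insecure.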