Apple expected to replace Touch ID with two-step facial, fingerprint bio-recognition tech

Posted in iPhone, edited June 2020
Apple is developing advanced biometric security technologies, including facial recognition and optical fingerprint sensing designs, to replace the vaunted Touch ID module implemented in all iPhones and iPads since the release of iPhone 5s.




In a note sent out to investors on Friday, and subsequently obtained by AppleInsider, well-connected KGI analyst Ming-Chi Kuo says he believes Apple is developing a new class of bio-recognition technologies that play nice with "full-face," or zero-bezel, displays. Specifically, Kuo foresees Apple replacing existing Touch ID technology with optical fingerprint readers, a change that could arrive as soon as this year, as Apple is widely rumored to introduce a full-screen OLED iPhone model this fall.

Introduced with iPhone 5s, Touch ID is a capacitive-type fingerprint sensing module based on technology acquired through Apple's purchase of biometric security specialist AuthenTec in 2012. Initial iterations of the system, built into iPhone and iPad home buttons, incorporated a 500ppi sensor capable of scanning sub-dermal layers of skin to obtain a three-dimensional map of a fingerprint.

Available on iOS devices since 2013, the technology most recently made its way to Mac with the MacBook Pro with Touch Bar models in October 2016.

A capacitive solution, Touch ID sends a small electrical charge through a user's finger by way of a stainless steel ring. While the fingerprint sensing module is an "under glass" design, the ring must be accessible to the user at all times, making the solution unsuitable for inclusion in devices with full-face screens.

Moving forward, Kuo predicts Apple will turn to optical-type fingerprint sensing technology capable of accepting readings through OLED panels without the need for capacitive charge components. These "under panel" systems allow for a completely uniform screen surface, an aesthetic toward which the smartphone industry is trending.

Apple has, in fact, been working on fingerprint sensors that work through displays, as evidenced by recent patent filings. The IP, as well as the current state of the art, suggests optical fingerprint modules are most likely to see inclusion under flexible OLED panels as compared to rigid OLED or TFT-LCD screens.

Flexible OLED displays feature less signal interference, lower pixel densities and a thinner form factor than competing technologies, Kuo notes. As optical fingerprint sensor development is still in its infancy, however, OLED display manufacturers are under increased pressure to provide customized designs capable of incorporating the tech.

While the barriers to entry are high, companies like Apple and Samsung are among the few that have the bargaining power to implement such designs, Kuo says.




As for alternative bio-recognition technologies, Kuo believes Apple is looking to completely replace fingerprint sensors with facial or iris recognition systems. Of the two, the analyst predicts facial recognition to win out, citing a growing stack of patent filings for such solutions, many of which AppleInsider uncovered over the past few years (1, 2, 3, 4, 5). The company is also rumored to have eye scanning technology in development, but Kuo sees Apple leaning toward facial recognition for both hardware security and mobile transaction authentication services.

Being a hands-off, no-touch security solution, facial recognition is preferable to technology that requires user interaction, like fingerprint sensors. However, Kuo points out that certain barriers stand in the way of implementation, such as software design, hardware component development and the creation of a verification database, among other backend bottlenecks.

Considering the onerous task of deploying a standalone face-scanning solution, Kuo suggests Apple might first deploy a hybrid two-step bio-recognition system that requires a user to verify their identity with both a fingerprint and a facial scan.
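The logic of such a hybrid system is simple to sketch: both biometric checks must pass before the device unlocks, so a forged fingerprint or a photo of a face alone is not enough. The following sketch is purely illustrative; the function names and template strings are hypothetical stand-ins, not Apple APIs.

```python
# Hypothetical sketch of the hybrid two-step bio-recognition flow the
# article describes: an unlock is accepted only when BOTH the fingerprint
# and the face scan match their enrolled templates.

def match_fingerprint(scan: str, enrolled: str) -> bool:
    """Stand-in for the optical fingerprint matcher."""
    return scan == enrolled

def match_face(scan: str, enrolled: str) -> bool:
    """Stand-in for the facial-recognition matcher."""
    return scan == enrolled

def two_step_unlock(finger_scan, face_scan, enrolled_finger, enrolled_face):
    # Both factors must pass; a single failure rejects the attempt.
    return (match_fingerprint(finger_scan, enrolled_finger)
            and match_face(face_scan, enrolled_face))

print(two_step_unlock("whorl-7", "face-A", "whorl-7", "face-A"))  # True
print(two_step_unlock("whorl-7", "face-B", "whorl-7", "face-A"))  # False
```

The AND-combination is what makes this stricter than either factor alone, at the cost of the extra interaction step commenters below object to.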

It is unclear when Apple might first integrate facial recognition hardware in its expansive product lineup, but optical fingerprint modules are ripe for inclusion in an OLED iPhone rumored to launch later this year. According to recent rumblings, Apple's first OLED iPhone model will sport a stainless steel "glass sandwich" design and incorporate advanced features like wireless charging. Kuo himself added to the "iPhone 8" rumor pile this week, saying Apple is primed to transition to next-generation 3D Touch tech with the forthcoming handset.

Comments

  • Reply 1 of 39
    SoliSoli Posts: 10,035member
    1) Two-step or dual-biometric authentication? The former is a step down, unless there's evidence that Touch ID is currently too insecure to be a good convenience feature.

    2) No physical Home button indentation makes this rumour sound fake to me.
  • Reply 2 of 39
    radarthekatradarthekat Posts: 3,842moderator
    Just to be clear on terms...

    Face recognition is the term used to describe the process of identifying a specific person, such as from a database of known persons (no fly list, for example).  This is face recognition.

    There's also face detection, which is the process of detecting the elements of a human face within a scene.  This is typically a precursor to application of face recognition algorithms, used to identify the owner of a face in a scene.

    Then there's facial recognition, which is the process of detecting specific facial expressions (smiling, frowning, sadness, etc).  This term is often used in the medical world to characterize specific inabilities of patients to recognize meaning in human faces.  Or, I suppose, one could use the term facial recognition to mean the detection of someone who has recently come from a spa treatment appointment.  (Kidding.)

    Folks who incorrectly use the term facial recognition will find themselves finally corrected once Apple introduces some form of face detection and face recognition on stage in a product introduction.  Until that day I'm afraid folks will continue to use the wrong term to describe face recognition.  Those who want to be certain of the correct term to use now should do a google search of the huge body of research on the topic, where they will find that all technical papers refer to face recognition as defined above.

    for example:  http://www.face-rec.org/interesting-papers/
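The two-stage pipeline described above (detection first, then recognition) can be sketched in a few lines. This is a toy illustration of the terminology only; the functions, "signature" field and identity database are invented for the example.

```python
# Illustrative sketch of the distinction drawn above: face *detection*
# locates faces within a scene; face *recognition* then matches a detected
# face against a database of known identities.

def detect_faces(scene):
    """Detection: return the regions of a scene that contain a face."""
    return [obj for obj in scene if obj.get("kind") == "face"]

def recognize_face(face, known_people):
    """Recognition: identify which known person a detected face belongs to."""
    for name, signature in known_people.items():
        if face["signature"] == signature:
            return name
    return None  # face detected, but not a known person

scene = [{"kind": "tree"}, {"kind": "face", "signature": "f1"}]
known = {"alice": "f1", "bob": "f2"}
faces = detect_faces(scene)                        # detection runs first
print([recognize_face(f, known) for f in faces])   # ['alice']
```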
    edited January 2017
  • Reply 3 of 39
    larryalarrya Posts: 606member
    Just to be clear on terms...

    Face recognition is the term used to describe the process of identifying a specific person, such as from a database of known persons (no fly list, for example).  This is face recognition.

    There's also face detection, which is the process of detecting the elements of a human face within a scene.  This is typically a precursor to application of face recognition algorithms, used to identify the owner of a face in a scene.

    Then there's facial recognition, which is the process of detecting specific facial expressions (smiling, frowning, sadness, etc).  This term is often used in the medical world to characterize specific inabilities of patients to recognize meaning in human faces.  Or, I suppose, one could use the term facial recognition to mean the detection of someone who has recently come from a spa treatment appointment.  (Kidding.)

    Folks who incorrectly use the term facial recognition will find themselves finally corrected once Apple introduces some form of face detection and face recognition on stage in a product introduction.  Until that day I'm afraid folks will continue to use the wrong term to describe face recognition.  Those who want to be certain of the correct term to use now should do a google search of the huge body of research on the topic, where they will find that all technical papers refer to face recognition as defined above.

    for example:  http://www.face-rec.org/interesting-papers/
    Interesting and informative (thank you), but the terms are so similar that I have absolutely no expectation that anyone will notice the differences or correct their usage. We can't even get basic grammar right in serious publications and journalism.  If I could go one day without hearing or reading, "different than" or abuses of "me" and "I", there might be some hope of differentiating "facial" and "face" recognition. 😀 
  • Reply 4 of 39
    ...is this yet another incremental by design privacy creep in mac/ios? Is all fine as long as the 'good guys' are happy...? Why does Zuckerberg have tape on his webcam? Did it start with the iPhone & app store, with every purchase of interest and iPhone contact photo (contacts unknowing) buried somewhere in some Apple server database for future reference? Turning off iPhoto auto tagging or instant on (2016MBP) now requires terminal, and has secure erase been dropped from Disk First Aid, if one wants to part with a mac ?
  • Reply 5 of 39
    rezwitsrezwits Posts: 879member
    This would be like 3-step, Face, Finger, Device!
  • Reply 6 of 39
    Rayz2016Rayz2016 Posts: 6,957member
    rezwits said:
    This would be like 3-step, Face, Finger, Device!
    Yes, this is why I don't see this happening. TouchID hasn't proven insecure (unless you have access to a chemistry lab), so I'm not sure why they'd introduce another step, which would cause delays at the checkout.
  • Reply 7 of 39
    And how would this work in the dark and without disturbing others in e.g. a cinema, or in the bedroom?
  • Reply 8 of 39
    anomeanome Posts: 1,533member
    Consider the situation where you're trying to use TouchID to pay for something with Apple Pay. At the moment, you wave the phone over the sensor, and touch your finger to the TouchID sensor.

    Introduce Face Recognition as another layer, and now you have to hold the phone so it can check your face, wave the phone at the sensor, and touch your finger to the TouchID sensor. If it wants the Face Recognition while you're on the PayWave sensor, then that's going to be awkward in situations where the sensor is on a low counter. Or do you manually initiate Apple Pay (e.g. double tap the home button, virtual or otherwise), scan your face and finger, then wave it on the PayWave sensor?

    Basically, I can't see a way to introduce Face Recognition specifically without making Apple Pay harder to use. Without a damn good security reason, and a good implementation, I don't see this as anything more than an experiment. If anyone can make it work, it would be Apple, but I don't think it's likely this year, if at all.
  • Reply 9 of 39
    mattinozmattinoz Posts: 2,316member
    So if we are getting closer to TouchID under the screen, are we at all getting closer to a full camera under the screen?

    Similar to the patent from many years ago about getting rid of the front camera and replacing it with a field array sensor, with focusing done in software.
    Holes in the flexible OLED panel could act like an array of pinhole cameras for the field array.
    A field array would also let them create basic 3D models for face detection checking, so it can't be fooled by a photo.

    If they get it close to the centre of the screen, you could use FaceTime in landscape or portrait and look the other person in the eyes.
  • Reply 10 of 39
    misamisa Posts: 827member
    Soli said:
    1) Two-step or dual-biometric authentication? The former is a step down, unless there's evidence that Touch ID is currently too insecure to be a good convenience feature.

    2) No physical Home button indentation makes this rumour sound fake to me.

    The gist of it seems to be that Apple is trying to put in actual two-factor authentication (e.g. finger + face), since the finger alone has a few weaknesses (I'm not sure if the "gummy finger" works on the iPhone, but that's how optical fingerprint readers are defeated). The face alone is also weak, since you can only use the parts of the face that don't change with age, which gives you two problems:
    1) Glasses/contacts - A "retina scan" has to actually look at the retina, which means the scan would have to be taken at very close range; an iris scan would be defeated by contact lenses, which leaves a face scan. Typically what you would have is a prompt like "make a smile", and the phone would check that the 60 or so points on a face that correspond to your smile match. That makes it more difficult to defeat than a simple photo-based match.

    2) Makeup/complexion/tattoos/piercings - If someone breaks out with acne, or has applied makeup to cover moles, then the detection mechanism no longer has the right reference points, especially if the recognition works by looking for unique identifying marks.

    To some extent, people with long hair would also confuse face recognition, since some people change their hair frequently.

    As far as identifying characteristics go, however, the retina and the iris are the ones that can't be changed or fooled, but since they are biological markers, it is possible for those parts of an eye to be damaged through surgery or accident, thus locking one out of the device. There are several DNA markers responsible for the colour and unique patterns in the iris (would an identical twin be able to fool it?).

    What I think is going to happen, assuming they remove the physical home button (which I wouldn't expect, because it makes the phone harder to orient for everything from taking pictures and movies to visually-impaired users being unable to use Siri because they can no longer find the part of the screen to double-tap), is that there will still be a part of the screen with a dedicated space for the fingerprint sensor. Perhaps the screen will "wrap" around this sensor: software that isn't aware of the wrap will just assume the previous screen size and aspect ratio, whereas sensor-aware apps will be able to say "place finger here to pay".
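The "60 or so points" idea mentioned above amounts to landmark matching: compare a live scan's facial feature points against an enrolled template and accept only if enough points line up. The sketch below is a toy version of that idea; the point counts, coordinates and thresholds are made up for illustration.

```python
# Toy sketch of landmark-based face matching: accept the scan only if at
# least `required_ratio` of the landmark points lie within `tol` units of
# the enrolled template's corresponding points.

def points_match(live, template, tol=2.0, required_ratio=0.9):
    """Return True if >= required_ratio of landmark points are within tol."""
    assert len(live) == len(template)
    hits = sum(
        1 for (x1, y1), (x2, y2) in zip(live, template)
        if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= tol
    )
    return hits / len(template) >= required_ratio

template = [(0, 0), (10, 0), (5, 8), (5, 3)]        # enrolled landmarks
live_ok  = [(0.5, 0.2), (10.1, 0.4), (5.2, 7.9), (4.8, 3.1)]  # small jitter
live_bad = [(3, 3), (14, 2), (9, 12), (1, 7)]       # a different face
print(points_match(live_ok, template))   # True
print(points_match(live_bad, template))  # False
```

Checking relative geometry across many points, rather than a single image comparison, is what makes this harder to defeat with a flat photo.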



  • Reply 11 of 39
    irelandireland Posts: 17,798member
    And when it doesn't happen, we'll see the "Kuo was wrong" story.
  • Reply 12 of 39
    mobiusmobius Posts: 380member
    And how would this work in the dark and without disturbing others in e.g. a cinema, or in the bedroom?
    If you're using your phone in those situations then you are already disturbing others with the light from the display. But to answer your question, I'm guessing the display light or flash could be used to illuminate the face enough for the recognition to function.
  • Reply 13 of 39
    danvmdanvm Posts: 1,409member
    And how would this work in the dark and without disturbing others in e.g. a cinema, or in the bedroom?
    The Surface Pro 4 camera is capable of working in the dark. 



    I'm not sure the technology behind it can be used in a device as small as an iPhone. 
  • Reply 14 of 39
    evilutionevilution Posts: 1,399member
    danvm said:
    I'm not sure the technology behind it can be used in a device as small as an iPhone. 
    It's just lighting your face with an infra-red LED. You can see the glow of the LED in that video; it's not visible except through a camera.
  • Reply 15 of 39
    calicali Posts: 3,494member
    ...is this yet another incremental by design privacy creep in mac/ios? Is all fine as long as the 'good guys' are happy...? Why does Zuckerberg have tape on his webcam? Did it start with the iPhone & app store, with every purchase of interest and iPhone contact photo (contacts unknowing) buried somewhere in some Apple server database for future reference? Turning off iPhoto auto tagging or instant on (2016MBP) now requires terminal, and has secure erase been dropped from Disk First Aid, if one wants to part with a mac ?
    It's ironic how paranoid Zuckerberg is and how much he hates having his privacy disturbed.

    someone once posted a photo of him on Facebook and he freaked out. 
    edited January 2017
  • Reply 16 of 39
    Soli said:
    1) Two-step or dual-biometric authentication? The former is a step down, unless there's evidence that Touch ID is currently to insecure to be good convenience feature.

    2) No physical Home Button indentation makes this story rumour sounds fake to me.
    The narrative they are pushing here is "bezel free" and "just like Microsoft."
  • Reply 17 of 39
    This idea is all fine and dandy, but how will it work when:
    - the user is on the ski slopes wearing a ski mask
    - the user is a motorcyclist wearing a crash helmet
    - the user is a Muslim woman wearing a niqab
    - the user is a doctor wearing a scrub mask
    etc.
    And will it work without re-calibration if you shave your thick bushy beard off?

    There are good reasons why this will only ever be a niche solution.
    I can't see it ever replacing Touch ID.

  • Reply 18 of 39
    mobius said:
    And how would this work in the dark and without disturbing others in e.g. a cinema, or in the bedroom?
    If you're using your phone in those situations then you are already disturbing others with the light from the display. But to answer your question, I'm guessing the display light or flash could be used to illuminate the face enough for the recognition to function.
    That's why I mentioned these two situations. For example, I regularly read in the evening in the bedroom, in the dark. Using a flash would definitely disturb others every time I unlock the phone. Same when you walk in the evening in the streets, the forest, wherever. So far I don't think I've ever disturbed anyone in the cinema either, since the phone is on low brightness and silent; a flash to identify me would change this. I'm not keen on this FR solution if it depends on something like a flashlight. 
  • Reply 19 of 39
    And how would this work in the dark and without disturbing others in e.g. a cinema, or in the bedroom?
    If you're wearing sunglasses, or a Halloween costume, you have to enter your security code.
  • Reply 20 of 39
    cali said:
    ...is this yet another incremental by design privacy creep in mac/ios? Is all fine as long as the 'good guys' are happy...? Why does Zuckerberg have tape on his webcam? Did it start with the iPhone & app store, with every purchase of interest and iPhone contact photo (contacts unknowing) buried somewhere in some Apple server database for future reference? Turning off iPhoto auto tagging or instant on (2016MBP) now requires terminal, and has secure erase been dropped from Disk First Aid, if one wants to part with a mac ?
    It's ironic how paranoid Zuckerberg is and how much he hates having his privacy disturbed.

    someone once posted a photo of him on Facebook and he freaked out. 
    Ironic? No. Inequality is the normal and expected condition for the wealthy elite.