Apple could have used pinholes in the display to keep Touch ID on the iPhone X

Before the introduction of the iPhone X and Face ID, Apple was considering ways to implement Touch ID on a smartphone without an externally visible fingerprint reader. One technique involved a series of pinholes in the display panel that would allow a fingerprint to be captured through the screen.




The introduction of the iPhone X in 2017 brought with it a fundamental change in the way Apple designed the iPhone, eliminating the famous home button in favor of an edge-to-edge display. By removing the home button, Apple also had to reconsider how it secured the iPhone, as Touch ID was previously housed in the now-eliminated component.

Apple's ultimate answer was to replace it with Face ID, using the TrueDepth camera array to authenticate the user instead of their fingerprint. While other device makers simply moved the fingerprint reader elsewhere on the device, typically to the rear, Apple opted to fundamentally change its security process instead.

However, a patent application published by the US Patent and Trademark Office on Thursday reveals Apple was still considering how to retain Touch ID while using a larger display with seemingly no available space for a reader. The application, "Electronic device including pin hole array mask above optical image sensor and laterally adjacent light source and related methods," was filed on May 23, 2016, more than a year before the iPhone X's launch, suggesting the approach was still under consideration at that point.

In short, the filing proposes many small holes in the display panel that allow light to pass through to an optical image sensor below. Light shone onto the user's finger reflects back through the holes to the sensor, where it can be used to produce a fingerprint image.
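As a rough illustration of that optical path, the toy simulation below (plain Python with NumPy, every dimension invented for the example rather than taken from the filing) gates a high-resolution "finger" pattern through a sparse pinhole mask, which is roughly what the optical sensor beneath the display would receive.

```python
import numpy as np

# Toy stand-in for light reflected off a finger: a high-resolution pattern
# of ridges (values near 1) and valleys (values near 0). Purely illustrative.
x = np.linspace(0, 40 * np.pi, 512)
finger = ((np.sin(x)[:, None] * np.sin(x)[None, :]) > 0).astype(float)

# Hypothetical pinhole array mask: one open hole every `pitch` positions,
# everything else opaque.
pitch = 8
mask = np.zeros_like(finger)
mask[::pitch, ::pitch] = 1.0

# What reaches the optical image sensor is the reflected light gated by the
# mask -- a sparsely sampled version of the finger pattern.
sensor_image = finger * mask
print("pinhole samples captured:", int(mask.sum()), "of", finger.size, "pixels")
```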

A cross-section of the display, showing pinholes allowing light to pass through the display for fingerprint reading


A large number of holes would be needed to cover a wide enough area, spaced evenly between the pixels of the display panel so they are not easily visible to the user. A light source laterally adjacent to the imaging sensor shines light through the holes onto the user's finger. While light from the display panel itself could do a similar job, a separate light source leaves the display free for other tasks, and allows the use of infrared or ultraviolet light for fingerprint reading.
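To get a feel for how such holes could hide between pixels, here is a back-of-the-envelope sketch; the pixel density and hole count are assumptions for illustration, not figures from the filing.

```python
# Rough pinhole-spacing arithmetic (illustrative assumptions only).
ppi = 458                          # assumed display density, pixels per inch
pixel_pitch_um = 25400 / ppi       # 25,400 micrometres per inch
holes_per_gap = 1                  # assume one hole per inter-pixel gap

hole_spacing_um = pixel_pitch_um / holes_per_gap
print(f"pixel pitch: {pixel_pitch_um:.1f} um")
print(f"pinhole spacing on that grid: {hole_spacing_um:.1f} um")
# At roughly 55 um apart, holes a few micrometres wide sit in the gaps
# between emitters, far below what the eye resolves at viewing distance.
```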

A transparent layer may also sit between the display panel and the pin hole array mask layer, providing room for light to reflect off the user's finger and travel back through the holes to the sensor.

Although each pinhole is tiny, a sufficiently large array of them gives the sensor enough data to assemble an image of the user's finger. Prototype test images show the concept resolving text and lines down to the micron level, more than enough detail for fingerprints.
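A quick sampling-theorem check suggests why micron-level detail would be ample: adult fingerprint ridges repeat on the order of a few hundred micrometres. The figures below are typical values rather than numbers from the patent.

```python
# Nyquist-style margin check (typical values, not figures from the filing).
ridge_period_um = 450.0        # approximate spacing between fingerprint ridges
prototype_detail_um = 1.0      # "micron level" detail reported for the prototype

# Resolving a periodic ridge pattern needs samples at least every half period.
required_sampling_um = ridge_period_um / 2
margin = required_sampling_um / prototype_detail_um
print(f"need a sample roughly every {required_sampling_um:.0f} um; "
      f"micron-level imaging beats that by about {margin:.0f}x")
```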

Apple suggests the system could also save users time, as it could eliminate a separate authentication step by reading the finger at the moment it touches the display.
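In practice that would mean folding authentication into the touch itself. The sketch below is purely hypothetical, using invented function names rather than any real Apple API, to show how a confirming tap could double as the fingerprint read instead of triggering a separate prompt.

```python
# Hypothetical flow only: none of these functions correspond to a real API.
def read_fingerprint_at(touch_point):
    """Pretend to capture a print through the pinhole mask under the touch."""
    return {"location": touch_point, "template": b"captured-template"}

def matches_enrolled(template):
    """Pretend to compare the capture against the enrolled template."""
    return True  # placeholder result for the sketch

def on_confirm_touched(touch_point):
    # The separate "now authenticate" step disappears: the same touch that
    # confirms the action also supplies the fingerprint.
    capture = read_fingerprint_at(touch_point)
    if matches_enrolled(capture["template"]):
        print("action authorized in a single touch")
    else:
        print("authentication failed; fall back to passcode")

on_confirm_touched((120, 480))
```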

An example of the images that could be captured by the technique on prototype hardware


Apple files numerous patents and applications with the USPTO on a weekly basis. While there is no guarantee the concepts described will make their way into a future product or service, they do indicate areas of interest for the company's research and development efforts.

Holes in the display are not the only approach Apple has devised for reading a fingerprint. Patents granted to the company in April describe the use of acoustic transducers to vibrate the surface of the display and monitor for waves altered by contact with fingerprint ridges.

If adopted, the technique could effectively turn the entire display into a fingerprint reader, capturing the biometric element regardless of where the finger touches the display, and at any angle.
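For a loose sense of how the acoustic approach differs, the one-dimensional toy below assumes ridges touching the glass damp the vibration more than valleys do, so the returned amplitude traces out the ridge pattern; the damping figure and ridge spacing are invented for the example.

```python
import numpy as np

# 1-D toy of the acoustic idea (illustrative only): scan a line across the
# display and model how much of the driven vibration comes back at each point.
positions_um = np.arange(0, 2000, 10)                  # scan positions
ridges = np.sin(2 * np.pi * positions_um / 450) > 0    # ~450 um ridge period

drive_amplitude = 1.0
ridge_damping = 0.35          # assumed extra acoustic loss where a ridge touches
returned = drive_amplitude * np.where(ridges, 1 - ridge_damping, 1.0)

# Thresholding the returned amplitude recovers where the ridges were.
recovered = returned < 0.8
print("ridge map recovered:", bool(np.all(recovered == ridges)))
```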

Comments

  • Reply 1 of 25
    DAalseth Posts: 2,783 member
    And if they had, everyone would have been complaining about 'the holes', how it destroyed the appearance of the screen, how it made watching videos impossible, how Samsung did it without holes.
  • Reply 2 of 25
    jgreg728 Posts: 100 member
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
  • Reply 3 of 25
    Mike Wuerthele Posts: 6,858 administrator
    jgreg728 said:
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
    Minus the fact that a 3D print fools it.


    Capacitive sensors, like the ones Samsung used before and Apple used in Touch ID, prevent this kind of attack.
  • Reply 4 of 25
    mike1 Posts: 3,275 member
    DAalseth said:
    And if they had, everyone would have been complaining about 'the holes', how it destroyed the appearance of the screen, how it made watching videos impossible, how Samsung did it without holes.
    Not to mention how it may have been more difficult and therefore impractical to manufacture in quantity.
  • Reply 5 of 25
    techsavy Posts: 34 member
    The idea is interesting, but it seems it would cost quite a lot, and Apple would probably charge a ludicrous amount of money for the option.
  • Reply 6 of 25
    jgreg728 said:
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
    Minus the fact that a 3D print fools it.


    Capacitive sensors, like the ones Samsung used before and Apple used in Touch ID, prevent this kind of attack.
    Meh, it’s a convenience feature, not a security feature. /s
  • Reply 7 of 25
    MplsP Posts: 3,911 member
    jgreg728 said:
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
    Minus the fact that a 3D print fools it.


    Capacitive sensors, like the ones Samsung used before and Apple used in Touch ID, prevent this kind of attack.
    I’m guessing capacitive sensors could be fooled in the same way; just use a conductive material for the fingerprint mold.

    FaceID is not perfect, but it’s pretty damned good, and on balance has fewer flaws than TouchID did. After using an iPhone XS for 6 months, my only complaint with the notch is that I can’t see the battery percentage without swiping down (and reachability is not nearly as convenient as it was with the home button).
  • Reply 8 of 25
    esummers Posts: 953 member
    I like Face ID.  I just wish it had a wider field of view or at least detected that it was too close to your face and waited for you to move the phone away. 
  • Reply 9 of 25
    13485 Posts: 343 member
    Technically feasible I suppose, but the holes would fill with lint, epithelials, and other junk in short order.
  • Reply 10 of 25
    elijahg Posts: 2,753 member
    I'd imagine this wouldn't work for long since it relies on light, and those pinholes would quickly become clogged with light-blocking gunk. Acoustic methods seem much more robust.
  • Reply 11 of 25
    iOS_Guy80 Posts: 810 member
    Face ID is so slick, Touch ID seems so outdated. 
  • Reply 12 of 25
    Eric_WVGG Posts: 966 member
    Six months in and I kinda hate FaceID.

    Turns out that a very common use scenario is for my phone to be on the table next to my laptop, and for me to reach over and use it sort of "calculator style" without picking it up. So I have to pick it up and look at the stupid thing and then put it back down and blah blah… also I'll sometimes try to use it while walking or standing around on the subway and use it at hip-height instead of pulling it up to my face. 

    Apple made the right call, they shouldn't be designing around the target audience of Me, but… le sigh.

    Also it's gonna be too hot for jackets soon… I may switch back to my old SE for the rest of the summer.
  • Reply 13 of 25
    Been using FaceID with iPhone XS for months as well.  It is far superior to TouchID.  Apple made the correct choice.  As to Samsung's sensor being fooled?  Same happened with TouchID when it was announced.  Security researchers captured a finger print off the touch screen, extracted the print, created a mold, formed a model and used heat transfer from another finger through the mold.  It unlocked the iPhone.  You want real security, set a strong passcode and either do not use TouchID/FaceID or be prepared to disable it quickly.  
  • Reply 14 of 25
    StrangeDays Posts: 12,844 member
    Been using FaceID with iPhone XS for months as well.  It is far superior to TouchID.  Apple made the correct choice.  As to Samsung's sensor being fooled?  Same happened with TouchID when it was announced.  Security researchers captured a finger print off the touch screen, extracted the print, created a mold, formed a model and used heat transfer from another finger through the mold.  It unlocked the iPhone.  You want real security, set a strong passcode and either do not use TouchID/FaceID or be prepared to disable it quickly.  
    That isn’t the same thing as the 3D print in the Verge article.
  • Reply 15 of 25
    genovelle Posts: 1,480 member
    jgreg728 said:
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
    Minus the fact that a 3D print fools it.


    Capacitive sensors, like the ones Samsung used before and Apple used in Touch ID, prevent this kind of attack.
    Meh, it’s a convenience feature, not a security feature. /s
    Lol
  • Reply 16 of 25
    melliott Posts: 3 member
    13485 said:
    Technically feasible I suppose, but the holes would fill with lint, epithelials, and other junk in short order.
    The holes being proposed were not in the cover glass. Basically tiny holes in the space between pixels in the LCD/OLED layer.
  • Reply 17 of 25
    SpamSandwich Posts: 33,407 member
    jgreg728 said:
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
    Except it doesn’t work well or consistently.
  • Reply 18 of 25
    TheCire Posts: 1 unconfirmed, member
    Also, wouldn't those holes have been susceptible to getting dirt/dust in them?

    Oh, after reading additional comments, others have pointed out what I said..... that'll learn me to sign up and repeat others lol
  • Reply 19 of 25
    chasm Posts: 3,275 member
    jgreg728 said:
    Honestly Samsung nailed it with the "pulse" idea they came up with for fingerprint sensing under the screen.
    In addition to the “little problem” Mike pointed out, real-world use reveals that it is slower and less accurate than Touch ID. What is it with Samsung and untested, non-mature technology?
  • Reply 20 of 25
    thrang Posts: 1,007 member
    I prefer Touch ID and hope Apple does find a way to bring it back as an option. FaceID is a bit finicky and a bit slow, and requires more deliberate alignment to unlock. With Touch ID, I could pick up a locked phone from the table in any orientation and it would already be unlocked by my thumb well before the comparable FaceID action would have finished.

    Also, Apple's decision to position the sensors on the iPad in the portrait orientation was not well thought out at all. I suspect 95% of iPad use is in landscape mode, yet the portrait placement often results in the sensors being obscured by your fingers, hands, or arm as you naturally hold the sides of the iPad. The sensors need to be on the long axis. I'm not sure how this was not considered, as it seems obvious after a day of using the iPad.