Largan confirms it will ship facial recognition 3D sensors this year, in time for Apple's ...

in iPhone · edited June 2017
Though the company stopped short of saying who the components are for, Apple supplier Largan has affirmed that it will ship 3D sensors capable of scanning a user's face and unique iris, capabilities rumored to be a part of the anticipated "iPhone 8."

Largan's 3D sensing module for smartphones will ship in the second half of the year, CEO Adam Lin said in a press conference this week, according to Nikkei.

The report went further in citing Yuanta Investment Consulting analyst Jeff Pu, who believes that Apple will be the only smartphone maker to ship an iris scanner this year. Rumors have claimed Apple plans to include game-changing facial recognition technology in this year's flagship iPhone, known colloquially as the "iPhone 8."

According to analyst Ming-Chi Kuo of KGI Securities, who has a strong track record in predicting Apple's future product plans, the so-called "iPhone 8" will include an infrared transmitting module and infrared receiving module with its forward-facing FaceTime camera. According to Kuo, the new system will enable the camera to accomplish 3D sensing and modeling, as well as advanced biometrics.

Notably, Kuo said he expects Largan to provide lenses for both the front camera and infrared receiver on the "iPhone 8," along with competing suppliers Foxconn, Genius, and Kantatsu. Together with image sensors from Sony, they are rumored to play a crucial role in Apple's new advanced front 3D camera system.

While Largan didn't mention Apple, the company did reveal it plans to nearly double its workforce in order to meet demand for products, adding some 4,500 employees and building a new $661 million facility in Taiwan.

Apple's "iPhone 8" is expected to include a customized 1.4-megapixel image sensor in the infrared receiving module that will detect changes in light signals. The system is believed to use technology that Apple acquired from its $345 million purchase of PrimeSense back in 2013.

Combined with facial recognition technology from Apple's acquisition of augmented reality firm Metaio in 2015, Apple's system is expected to be far beyond anything currently available on the market.

Leading up to the announcement of ARKit in iOS 11 last week, Apple made a number of key augmented reality and facial recognition acquisitions this year, including the purchase of RealFace for $2 million, and Emotient for an unknown amount.

Beyond the front-facing camera, Apple's so-called "iPhone 8" is expected to feature an edge-to-edge OLED display and a glass back. It is expected that key components -- including the Touch ID fingerprint sensor -- could be embedded within the display.


  • Reply 1 of 10
    gatorguy Posts: 23,300 member
    Anyone who listened to Forbes and their Largan VR/AR expectations a year ago will do well this month in the stock market. 
  • Reply 2 of 10
    So the news about the iPhone 8 entering production last week is incorrect then? I guess November shipments might be about right. If they announce it in September with a November delivery, then I get a feeling that the AAPL shorters will be making hay again.
  • Reply 3 of 10
    fallenjt Posts: 4,040 member
    What is this sensor for, second layer of security for financial transactions on top of Touch ID or just to unlock the phone?
  • Reply 4 of 10
    Maybe the iris scanning will replace the Touch ID fingerprint and home button. Now, if I lift the iPhone 6s or 7, the phone already 'wakes up'. And instead of touching the home button, the iris scanner can directly unlock the iPhone. This solves the whole home-button problem. Also, the dock and multitasking on the iPad have been changed in iOS 11. Maybe this is more in line with how an iPhone without a home button will function. Maybe swiping from the bottom or on the side of the phone?
  • Reply 5 of 10
    sockrolid Posts: 2,789 member
    Will I need to take off my Ray-Bans when unlocking my iPhone?
    Will it work in total darkness when I'm trying to use the iPhone's flashlight feature?
    Will I need to look directly into the FaceTime camera when unlocking?
  • Reply 6 of 10
    sflocalsflocal Posts: 5,816member
    I'm not buying this.  It doesn't make sense to me that Apple would implement this on an iPhone.  It really sounds like something Samsung would do, but I don't get what Apple would be doing with this for their iPhone.  As far as I'm concerned, facial recognition is a gimmick.
  • Reply 7 of 10
    SpamSandwich Posts: 33,408 member
    It's the end of the world for Apple! Or it's the biggest news in the history of Apple! If this is Wednesday, are we anti- or pro-Apple?
  • Reply 8 of 10
    melgross Posts: 33,117 member
    I thought that Samsung already has a (poorly performing) iris scanner in its Galaxy S8?

    Even if this one works well, it can't replace Touch ID. As an additional measure, then yes.
  • Reply 9 of 10
    Marvin Posts: 14,612 moderator
    fallenjt said:
    What is this sensor for, second layer of security for financial transactions on top of Touch ID or just to unlock the phone?
    It could be. It can also be used for 3D scanning for FaceTime and third-party apps. ARKit is really impressive in what it can already do without depth sensors:

    The first video takes about 10 seconds to become stable. Once the tracking is stable, the user can walk around completely virtual characters as though they were standing there. If someone was using FaceTime, a front depth sensor could build a real-time model of the face, so to the other user it could look as though the person is in front of them rather than on a 2D screen, although some data would be missing, like the back of the head, which would just be filled in. It would also be useful to have a depth sensor on the back of the device to help make the tracking stable more quickly. On the back, parents could film their kids, and other family members could see the children as though they were playing on the floor in front of them.

    For third-party apps like Snapchat, it can do better tracking of face orientation so that the filter overlays don't disappear when people turn to the side:

    That would be a trivial thing to add a hardware feature for, but developers can use depth sensors for a whole range of things.

    It can be used for portrait mode on the front camera, and third-party photography apps get a lot more control over the scene, so they can do advanced adjustments like tuning lighting and focus accurately to the surface data.
  • Reply 10 of 10
    radarthekat Posts: 3,402 moderator
    This whole misunderstanding of the terms is what's causing the debate over biometric identification/authentication.  FACIAL recognition is not FACE recognition.  It can be used as a step in a face recognition process, but facial recognition is simply the process of identifying each feature of a face: eyes, nose, mouth, smiling, frowning, etc.  It is not used to determine whose face it is.

    Facial recognition returns the result, 'here is the mouth and it is smiling,' which is useful to map that feature onto the face of an avatar or game character, or useful in mapping an overlay onto the person's actual face, a la Snapchat.  But facial recognition does NOT return the result, 'this is Phil Schiller's face.'  That biometric identification step is done by a method called face recognition.  I know, I know, they sound the same.  But facial recognition, face recognition, and face detection are all three different things.  Apple's acquisition of Faceshift and others suggests Apple will be employing facial recognition for AR apps, NOT for biometric identification.  It'll remain Touch ID for authentication.  Trust me.
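    The three-way split described above can be sketched in toy form. To be clear, every function and value below is a hypothetical stand-in for illustration only, not Apple's API or any real computer-vision library:

    ```python
    # Toy sketch of the three distinct tasks: detection (where is a face?),
    # facial recognition (what are its features doing?), and face
    # recognition (whose face is it?). All values are illustrative.

    def face_detection(image):
        """Find faces: returns bounding boxes only -- no identity, no features."""
        return [(40, 40, 120, 120)]  # one face at (x, y, width, height)

    def facial_recognition(face_box):
        """Describe features: landmarks and expression, never *whose* face."""
        return {"mouth": "smiling", "eyes": "open"}

    def face_recognition(face_box, enrolled):
        """Identify: match a (stand-in) template against enrolled users."""
        template = sum(face_box)  # placeholder for a real biometric embedding
        for name, stored in enrolled.items():
            if stored == template:
                return name
        return None  # no enrolled identity matched

    boxes = face_detection("camera_frame")
    features = facial_recognition(boxes[0])            # e.g. a Snapchat-style overlay
    identity = face_recognition(boxes[0], {"Phil": 320})  # e.g. unlocking a device
    ```

    The point of keeping them as separate functions is that the first two never touch identity data, which is exactly why AR-style feature mapping and biometric authentication are different problems.
    
    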