Apple's acoustic imaging technology might replace Touch ID in future iPhones
Apple is rumored to replace iPhone's Touch ID home button with alternative bio-recognition hardware when it launches an OLED version of the handset later this year, but the technology enabling that transition is so far an unknown. A patent application published Thursday, however, indicates where the company's research is headed.
Source: USPTO
In its filing for "Acoustic imaging system architecture," Apple proposes a method by which a conventional capacitive fingerprint sensor like Touch ID might be replaced by an array of acoustic transducers laid out beneath a device display or its protective housing.
As described in some embodiments, the transducers in a first mode generate acoustic waves, or pulses, capable of propagating through a variety of substrates, including an iPhone's coverglass. Operation, or driving, of the transducers is managed by a controller.
The same transducer hardware then enters a second sensing mode to monitor reflections, attenuations and diffractions of the acoustic waves caused by a foreign object in contact with the input-responsive substrate. The resulting scan data, in the form of electrical signals, is read by an onboard image resolver, which creates an approximated two-dimensional map.
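The filing describes that flow in hardware terms, but the two-mode drive/sense cycle is easy to picture in code. Below is a minimal, purely illustrative sketch of the loop, assuming a small rectangular transducer grid; the grid size, function names and stubbed sensor readings are invented for illustration and are not drawn from the application.

```python
# Illustrative sketch only -- not Apple's implementation. Each transducer is
# first driven to emit a pulse (first mode), then read in sensing mode
# (second mode), and an "image resolver" folds the per-transducer reflection
# strengths into a rough two-dimensional map.

import numpy as np

GRID_ROWS, GRID_COLS = 32, 32  # hypothetical transducer grid


def drive_pulse(row: int, col: int) -> None:
    """Drive mode: fire an acoustic pulse from one transducer (stubbed)."""
    pass  # real hardware would excite the element through a drive chip here


def sense_reflection(row: int, col: int) -> float:
    """Sense mode: return a reflection amplitude (faked here).

    A real controller would digitize the echo received after the pulse
    propagates through the cover glass and bounces off whatever touches it.
    """
    return float(np.random.uniform(0.0, 1.0))


def resolve_image() -> np.ndarray:
    """Image resolver: scan every transducer and build a 2D amplitude map."""
    image = np.zeros((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            drive_pulse(r, c)                     # first mode: emit
            image[r, c] = sense_reflection(r, c)  # second mode: listen
    return image


if __name__ == "__main__":
    acoustic_map = resolve_image()
    print(acoustic_map.shape)  # (32, 32) approximated two-dimensional map
```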
Applied to biometric recognition solutions, Apple's acoustic imaging system might be configured to read a user's fingerprint. According to the filing, ridges in a finger pad introduce an acoustic impedance mismatch that causes the mechanical waves generated by a transducer to reflect or diffract in a known manner.
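As a rough illustration of why ridges and valleys look different acoustically (not taken from the filing), the standard normal-incidence reflection formula R = (Z2 - Z1) / (Z2 + Z1) can be applied with ballpark acoustic impedance values for glass, skin and air; the figures below are approximate textbook numbers chosen for illustration.

```python
# Back-of-envelope illustration of the impedance-mismatch idea.
# Approximate acoustic impedances in MRayl.

Z_GLASS = 13.0   # cover glass (approximate)
Z_SKIN = 1.6     # fingerprint ridge in contact with the glass (approximate)
Z_AIR = 0.0004   # air gap beneath a fingerprint valley (approximate)


def reflection_coefficient(z1: float, z2: float) -> float:
    """Normal-incidence reflection coefficient at a z1 -> z2 interface."""
    return (z2 - z1) / (z2 + z1)


ridge = abs(reflection_coefficient(Z_GLASS, Z_SKIN))   # ~0.78
valley = abs(reflection_coefficient(Z_GLASS, Z_AIR))   # ~1.00, near-total reflection
print(f"ridge |R| = {ridge:.2f}, valley |R| = {valley:.2f}")
```

The much larger mismatch at an air gap means a valley reflects nearly all of the pulse back, while a ridge in contact with the glass absorbs more of it; that contrast is what the resolver can map into an image.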
Like other biometric security solutions, the digital maps obtained by an acoustic imaging system are ultimately compared against a database of known assets to authenticate a user.
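A minimal sketch of that matching step might look like the following, assuming the resolved map arrives as a 2D array; the normalized-correlation matcher, threshold and function names are stand-ins chosen for illustration, not Apple's actual algorithm.

```python
# Hypothetical matching step: compare a fresh acoustic map against enrolled
# templates and accept if any match is close enough.

import numpy as np


def normalized_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson-style correlation between two same-shaped maps, in [-1, 1]."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())


def authenticate(scan: np.ndarray, enrolled: list, threshold: float = 0.8) -> bool:
    """Return True if the scan matches any enrolled template closely enough."""
    return any(normalized_similarity(scan, template) >= threshold
               for template in enrolled)
```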
Thanks to its design, the acoustic system can be installed practically anywhere in a device chassis, including directly under a display. Other potential points of integration include a screen's perimeter or bezel, around buttons and in non-input areas of a housing, like a rear chassis. The latter coupling method would allow an example system to sample a user's entire handprint.
Indeed, in some cases the system might be configured to scan for particular body parts like a user's ear or a skin pattern in order to determine how the device is being held. Depending on the implementation, an acoustic imaging system might also serve as a robust replacement for other legacy components like an iPhone's proximity sensors.
The patent application goes on to cover system details like ideal materials, transducer placement, controllers, drive chips, component layouts and more.
Whether Apple plans to incorporate acoustic imaging technology in this year's "iPhone 8" is unknown. Noted KGI analyst Ming-Chi Kuo predicts the company will ditch existing Touch ID technology in favor of a dual -- or perhaps two-step -- bio-recognition solution utilizing an optical fingerprint reader and facial recognition hardware.
Kuo further elaborated on Apple's face-recognizing technology in a note to investors this week, saying Apple plans to incorporate a "revolutionary" front-facing camera that integrates infrared emitters and receivers to enable 3D sensing and modeling.
Apple's acoustic imaging patent application was first filed in August 2016 and credits Mohammad Yeke Yazandoost, Giovanni Gozzini, Brian Michael King, Marcus Yip, Marduke Yousefpor, Ehsan Khajeh and Aaron Tucker as its inventors.
Comments
If there was a requirement that patents be backed by working models, even of just the core claims, these far-reaching abstract-concept patents would have more credibility and perhaps we'd see real results in less than a decade or two. I'd love to see the patent office require that abstract concepts presented in patents be brought to market or implemented in a real product or working model within a reasonable time frame, say 5 years max, or else the patent is revoked. IMHO, the vagueness of abstract-concept patents and the lack of required implementation is one of the root causes of so many of the patent-related lawsuits we've been seeing for the past couple of decades. The patent owner ends up sitting on the concept and when someone actually builds something that vaguely resembles the concept they get sued for infringement. That doesn't make sense if one of the purposes of patents being enforced at the societal level is to incite innovation that benefits both the inventors and society at large.
A research team discovers, intentionally or otherwise, some great new process, but -- as you pointed out -- the process happens to be expensive and/or difficult to execute -- maybe the tech to accomplish it economically is 10 years out (small / fast enough chips, for example). The team is unable to convince their university / company / etc. to plow enough money into the discovery to get it from concept to prototype -- with your requirement, they would be unable to obtain a patent. So ...
A ) Would the team even reveal the process to the world at all? This would be essentially a philanthropic act b/c they have no legal claim to it. If they do not, then the knowledge would be buried.
B ) If the team does share the knowledge, a company (or otherwise wealthy entity) could bring the new process "to market" (e.g. Apple), making money off it while the research team makes nothing.
The patent system has a lot of flaws, but not requiring a model doesn't seem to be one.
How about a simulation? Wouldn't that be as useful until the technology catches up? This patent doesn't seem all that vague, and certainly could be proved out at low resolution. As for Apple sitting on it, they aren't going to do that if there is potential innovation as the end result. After all, all of that innovation is why Apple has some $200 B in hand.
A favourite scene from Blade Runner: www.youtube.com/watch?v=g-DkoGvcEBw
As far as requiring models/prototypes goes, I know this is not required by the patent agencies, but my hope is that putting a time limit on converting ideas and concepts into reality could potentially limit the weaponization focus of patent portfolios. If patent protection laws afforded by governments are actually intended to protect and reward invention and spur innovation for the greater good, then the need to translate ideas into reality within a specific time frame seems like a reasonable compromise.
Of course there are patent trolls, but not many compared to the number of patents issued each year. And the difficulty is that troll patents are still issued patents that have value, like any other property or thing. You can't just claim that because some one-person firm bought a pile of patents it's somehow not entitled to get anything for them, or that only the original patent holder can get paid for his/her invention. It's not an easy situation to resolve unless you can invalidate the patents, which is also not easy.
As much as there are flaws in the system, basically the whole world uses the same standards for patents, so it's a system that largely works as intended. Innovation is protected (there are many thousands of new patented inventions every single day), not stifled.
First, you assume that Apple doesn't have working models of this technology. That is a big mistake all on its own.
Second, you really don't have a technology unless you can describe the science behind it. Beyond that, you describe this as abstract when there is nothing abstract about it. They have apparently researched and developed this technology to the point that they have something to patent. If anything, this is a very concrete example of a new tech for user verification.
Third, this is also a big point many people simply don't grasp. A lot of new tech comes from very small companies, often fewer than ten people and sometimes a single individual. They don't always have the resources to bring a product forward. With a patent they protect their developments and allow established industry to produce and market them. I have been involved in a company that in fact bought out such small companies to market their developments. In doing so they took a product that was initially very expensive to manufacture and turned it into a mass-market product that almost anybody can afford. Sometimes all you need to do good is a lot of capital, and one way to get that capital is to sell to a company that can leverage your tech. In case you are wondering, I'm just a technician at this company, a company that, by the way, still employs people in the USA to manufacture stuff.
Fourth, while Apple patents a lot of stuff, it also licenses a lot of stuff, thus paying royalties. Paying royalties on a product is a way to reward innovators, and frankly it rewards people who have even somewhat abstract patents. Most of those royalties, though, go to people who develop real technologies. There is a great deal of IP in the iPhone that Apple has had developed by other companies or bought through a licensing agreement, everything from the ARM technology to the DSP used for voice processing. Interestingly, the licensed ARM technology is very abstract, as Apple doesn't use an ARM-designed core. If it wasn't for the reality of abstract IP, we wouldn't even have Apple's ARM cores today.
Fifth, the patent system has been very good at inciting innovation. As you know, you can't manufacture a patented device for commercial use without some sort of legal agreement with the patent holder. No big deal there. However, that is exactly what spurs innovation, as it creates a need to come up with competing technologies. This is why we see a variety of GPU designs that all accomplish similar goals. Frankly, the patent system is what is responsible for innovation within the USA and why many other economies have become copycat industries.
Sixth, a patent is limited-lifetime property. "The patent owner ends up sitting on the concept and when someone actually builds something that vaguely resembles the concept they get sued for infringement". What this whining seems to dispute is that people should have rights to their property. It is like telling someone who buys 100 acres of woodland that they have to develop that property or else. There should be limits on what they can do with the property, forest fires are out of the question, but if they want to use that property to enjoy wildlife then they are free to do so. As a patent holder you are likewise free to do as you choose with your property, with the full understanding that your rights are limited to a short period of time. If I developed a tech for a fusion power reactor that could be placed in everybody's basement tomorrow, I'm free to sit on that patent until it expires. That may be entirely stupid, as a fusion reactor would make for a business bigger than Apple is right now. After all, I developed the tech and have the patent, so I can be as stupid with it as I'm permitted to be.
You do remember how large ultrasound machines were in the past? The tech to do this is not as overwhelming as one might imagine, especially considering some of the efforts underway within various institutions to extend ARM processors for supercomputer-like operation. With a coming process shrink, Apple could easily have four very high-performance ARM cores available to it.
Though I acknowledge the issue of computational requirements, I suspect that people are overestimating just how much of a problem this would be in a custom SoC like the ones Apple builds, especially considering that the acoustic image results only have to be good enough to "see" the fingerprints. Somebody someplace made an estimate of the number of processors already in an iPhone, and I believe the total came to seven. This includes DSPs, dedicated ARM processors, the main processors and the GPU processors. Any one of these could have extra CPU cycles available to aid in this acoustic image interpretation. What's more, it is a given that all of these processors will be more powerful in the next iPhone. Since Apple is free to alter the architecture of the device any way it likes, it could extend the GPU to handle recognition of the fingerprint. In a nutshell, the power of an ultrasound machine isn't needed, but then again we aren't that far from that sort of performance level anyway.
I think people forget just how powerful an iPhone is.
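For what it's worth, a quick back-of-envelope calculation supports the commenter's point about compute; every figure below (scan size, resolution, operation budget, core speed) is an assumption picked for illustration rather than an Apple specification.

```python
# Rough sketch: even a generous acoustic fingerprint scan is tiny work for a
# modern phone SoC. All numbers are illustrative assumptions.

SCAN_MM = 8.0                    # assumed contact patch, 8 mm x 8 mm
PPI = 500                        # assumed resolution, ~500 pixels per inch
pixels_per_side = int(SCAN_MM / 25.4 * PPI)       # ~157 pixels
pixels_per_frame = pixels_per_side ** 2           # ~24,600 samples per frame

OPS_PER_PIXEL = 50               # generous budget for filtering plus matching
ops_per_frame = pixels_per_frame * OPS_PER_PIXEL  # ~1.2 million operations

SOC_OPS_PER_SEC = 2e9            # one conservative ~2 GHz core, 1 op per cycle
print(f"frame: {pixels_per_frame} px, "
      f"~{ops_per_frame / 1e6:.1f} M ops, "
      f"~{ops_per_frame / SOC_OPS_PER_SEC * 1e3:.2f} ms on one core")
```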