New Apple touch patents show body part sensing, fingernail input
In what may be the most comprehensive look yet at Apple's attempts to patent its multi-touch interface, a deluge of newly public filings shows Apple considering technology that hasn't yet appeared in the iPhone or the Mac -- including telling body parts apart, teaching gestures through guided exercises, and even responding differently to input from fingernails.
A majority of the patents, just published on Thursday, not only relate to multi-touch in the iPhone, iPod touch, and Macs, but were all originally filed on January 3rd, 2007 -- a week before the iPhone's first public showing, when Apple chief Steve Jobs touted over a hundred patents for the handset.
Many also credit the inventions to either John Elias or Wayne Westerman, both of whom founded FingerWorks. The company was one of the pioneers of multi-touch input and was subsumed into Apple in early 2005 along with Elias and Westerman.
While some of these patents relate to the basic operation of Apple's multi-touch devices as users already know them, others reveal directions that Cupertino, Calif.-based Apple has never taken with its current touch hardware.
In some cases, this includes advances in what the touch panel itself can recognize. One filing for input discrimination would compare the pixel values in images captured by the touchscreen and intelligently distinguish different parts of the body. A device could change its behavior based not just on its proximity to a user's face (by recognizing the shape of an ear) but on nearly any event: an iPhone could automatically wake itself when a finger clasp pulls it out of a pocket, or ignore its owner's accidental palm presses.
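As a rough illustration of the kind of contact classification the filing describes, a sensor could label each contact patch by its size and shape. This is a minimal sketch only; the function name, thresholds, and categories below are invented for illustration and do not come from Apple's patent.

```python
# Hypothetical sketch of input discrimination: classify a touch contact
# from the size and shape of the "pixel" blob the sensor reports.
# All thresholds here are invented, not taken from the filing.

def classify_contact(area_mm2: float, eccentricity: float) -> str:
    """Guess what produced a capacitive contact patch.

    area_mm2:      total area of the contact blob
    eccentricity:  0.0 = circular blob, 1.0 = very elongated
    """
    if area_mm2 < 15:
        return "fingernail"   # tiny, hard contact
    if area_mm2 < 120:
        return "fingertip"    # typical finger press
    if eccentricity > 0.7:
        return "ear"          # large, elongated blob near the earpiece
    return "palm"             # large, roundish -> likely accidental

print(classify_contact(8, 0.1))    # fingernail
print(classify_contact(60, 0.3))   # fingertip
print(classify_contact(300, 0.8))  # ear
print(classify_contact(300, 0.2))  # palm
```

A real implementation would work on the raw sensor image rather than summary statistics, but the decision structure -- map blob geometry to a body part, then let software react accordingly -- is the idea the filing describes.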
A second approach, described as a multi-event system, would also let Apple add context-sensitive menus, guides, and other information by detecting whether input comes from skin or an inanimate object. A user could tap items with the flesh of a finger for normal input, but flip over to a fingernail and apply pressure to call up a separate menu, according to Apple.
Opening a menu using a fingernail.
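In effect, the fingernail behavior amounts to dispatching the same screen location to different actions depending on what touched it. A toy sketch, with invented contact types and actions:

```python
# Hypothetical sketch of the "multi-event" idea: the same on-screen item
# triggers different actions depending on the kind of contact.
# Contact types and actions are invented for illustration.

def handle_touch(contact_type: str, item: str) -> str:
    if contact_type == "fingernail":
        return f"show context menu for {item}"   # the separate fingernail menu
    if contact_type == "fingertip":
        return f"activate {item}"                # normal tap
    return "ignore input"                        # palms and inanimate objects

print(handle_touch("fingernail", "photo"))  # show context menu for photo
print(handle_touch("fingertip", "photo"))   # activate photo
```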
The electronics firm has also hedged against the possibility that multi-touch gestures may become too complex to be immediately intuitive. Instead of just providing video demos, as users see today for multi-touch trackpads on some MacBooks, a gesture learning system could guide the user through mimicking pinches, rotations, and swipes. Aids could include color trails that follow the fingers, or the outline of a hand performing the correct version of the gesture at the same time.
As in a test, the system could actively provide audiovisual or text feedback until the user performs the action correctly.
An example of Apple's proposed method for learning gestures.
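The teaching loop could be as simple as checking each attempt against the target gesture and giving corrective feedback until it matches. A toy sketch, with string labels standing in for a real gesture recognizer:

```python
# Toy sketch of the gesture-learning feedback loop: check each attempt
# and give feedback until the learner performs the gesture correctly.
# Exact string matching stands in for real gesture recognition.

def practice(target: str, attempts: list[str]) -> list[str]:
    feedback = []
    for attempt in attempts:
        if attempt == target:
            feedback.append(f"{attempt}: correct!")
            break
        feedback.append(f"{attempt}: not quite -- try again")
    return feedback

for line in practice("pinch", ["swipe", "rotate", "pinch"]):
    print(line)
```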
The patent filings reveal a more complex vision of multi-touch in Apple products than first thought, and show that even the late stages of iPhone development produced inventions that never reached the finished product. However, it's unclear whether this is simply a snapshot of Apple's discovery process in January of last year, or whether any of the described technology is ultimately part of a roadmap for future computers, media players, or phones.
All of Apple's directly touch-related patent filings revealed today are available below.
Gesture learning
Multi-touch auto scanning
Projection scan multi-touch sensor array
Multi-Touch Input Discrimination
Error compensation for multi-touch surfaces
Double-sided touch-sensitive panel with shield and drive combined layer
Periodic sensor panel baseline adjustment
Double-sided touch sensitive panel and flex circuit bonding
Scan sequence generator
Advanced frequency calibration
Front-end signal compensation
Master/slave mode for sensor processing devices
Full scale calibration measurement for multi-touch surfaces
Minimizing mismatch during compensation
Multi-touch surface stackup arrangement
Proximity and multi-touch sensor detection and demodulation
Multi-event input system
Noise detection in multi-touch sensors
Far-field input identification
Simultaneous sensing arrangement
Peripheral pixel noise reduction
Irregular input identification
Noise reduction within an electronic device using automatic frequency modulation
Automatic frequency calibration
Analog boundary scanning based on stray capacitance
Comments
PLEAAASSEEE. I love this phone but pleaasssee give us copy and paste.
Cut, copy, and paste.
I bet the same mechanism will be used for the clipboard (and maybe Undo/Redo too). Hopefully sooner rather than later. We'll have a ton of apps soon.... so give us a quick way to share data between them on the fly!
As for phones that can sense "body parts" other than fingertips... I think I'll pass
Can we get 'cut' on Mac OS X first?
there is... command + x
(just like in Windows, if you're that kind of person)
The kind of person that uses keyboard shortcuts? I sure am, but cutting doesn't work in Finder. There is a Finder plist option to enable Cut but it doesn't actually work.
I don't seem to have that issue. I just selected some text using Finder in a file name on the desktop and was able to cut using any method I wished. Am I misunderstanding you?
I think what was meant that you could "cut" a file and "paste" it into a new folder.
Yes, cutting a file, not text of a file name. In other words, moving a file without using drag-and-drop or the 'mv' command within Terminal.
I think that Apple is purposely holding back on this since they do not want to make royalty payments to Ireland.
Hey Ireland, if you wrote to Apple legal and told them you'd forego all royalties for implementing your idea, perhaps they might come out with it?
Please?
"Yes, cutting a file, not text of a file name. In other words, moving a file without using drag-and-drop or the 'mv' command within Terminal."
Gotchya, I knew I had to be missing the point as it was you
"I think that Apple is purposely holding back on this since they do not want to make royalty payments to Ireland. Hey Ireland, if you wrote to Apple legal and told them you'd forego all royalties for implementing your idea, perhaps they might come out with it? Please?"
I did that 10 months ago!
http://forums.appleinsider.com/showthread.php?t=85751
love your mock up. needs to happen...
while we're at it. how about to-do and notes sync?
rob
"Apple chief Steve Jobs touted over a hundred patents for the handset."
Just for the record, Jobs touted over 200 iPhone patents, and then said emphatically, as best I remember, "and we intend to actively defend all of them!"