Mega Apple filing details next-gen 'multi-touch input surface'
The latest patent filing from Apple Inc. hints at a next-generation multi-touch 'surface' that would combine typing, pointing, scrolling, and handwriting capabilities into a single ergonomic design aimed at replacing traditional input devices such as the keyboard, mouse, and drawing tablet.
Like last week's filing for an advanced multi-touch trackpad, the concepts outlined in the massive 80-page document are partially credited to Wayne Westerman of Fingerworks, a company absorbed by Apple several years ago as part of its quest to deliver the iPhone and a new generation of input devices.
The filing, submitted four times on July 30, 2007 with varying title descriptions, calls for a generic device that tightly integrates yet clearly distinguishes the different types of input without imposing excess strain or wasted muscle movement.
"It should therefore appear modeless to the user in the sense that the user should not need to provide explicit mode switch signals such as buttonpresses, arm relocations, or stylus pickups before switching from one input activity to another," Westerman wrote. "Epidemiological studies suggest that repetition and force multiply in causing repetitive strain injuries. Awkward postures, device activation force, wasted motion, and repetition should be minimized to improve ergonomics."
Therefore, the multi-touch creator explained that an ideal implementation of his concept calls for a multi-touch surface apparatus that is both compliant and contoured to remain comfortable and ergonomic under extended use. It would "provide tactile key or hand position feedback without impeding [a] hand resting on the surface or smooth, accurate sliding across the surface."
The surface would include an electronic system that could provide images of flesh proximity to an array of sensors with enough resolution to distinguish a variety of hand configurations. This would allow the device to identify different hand parts as they contact the surface, so that those configurations could be used to distinguish different kinds of input activity, Westerman said.
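The filing doesn't include code, but the idea of turning a proximity image into identified hand parts can be sketched roughly as follows: segment adjacent above-threshold sensor cells into contact patches, then classify each patch by its size. All function names, thresholds, and the size-based classification rule here are illustrative assumptions, not anything from the patent.

```python
def segment_contacts(image, threshold=0.2):
    """Group adjacent above-threshold cells into contact patches (flood fill)."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    patches = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                stack, patch = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    patch.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                patches.append(patch)
    return patches

def classify_patch(patch):
    """Crude size-based guess at which hand part made the contact."""
    area = len(patch)
    if area <= 4:
        return "fingertip"
    if area <= 10:
        return "thumb"
    return "palm"
```

A real implementation would use patch shape, orientation, and position relative to the rest of the hand, not just area, but the pipeline — image, patches, labeled hand parts — is the same.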
Yet another objective of the multi-touch surface would be to reliably extract rotation, scaling, and translation degrees of freedom from the motion of two or more hand contacts to aid in navigation and manipulation of two-dimensional electronic documents. Furthermore, it would be capable of reliably extracting tilt and roll degrees of freedom from hand pressure differences to aid in navigation and manipulation of three-dimensional environments.
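For the two-dimensional case, extracting translation, rotation, and scaling from two moving contacts is standard geometry: compare the segment between the contacts before and after the motion. This is a minimal sketch of that math; the function name and conventions are my own, not the patent's.

```python
import math

def similarity_from_two_contacts(p1, p2, q1, q2):
    """Return (dx, dy, angle_radians, scale) mapping segment p1-p2 to q1-q2."""
    # Vector between the two contacts before and after the motion.
    vx0, vy0 = p2[0] - p1[0], p2[1] - p1[1]
    vx1, vy1 = q2[0] - q1[0], q2[1] - q1[1]
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    angle = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)
    # Translation is measured at the midpoint between the two contacts.
    mx0, my0 = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2
    mx1, my1 = (q1[0] + q2[0]) / 2, (q1[1] + q2[1]) / 2
    return (mx1 - mx0, my1 - my0, angle, scale)
```

With more than two contacts, one would fit the best similarity transform over all of them (a least-squares problem), but the two-contact case already yields the pinch, twist, and pan gestures familiar from the iPhone.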
In its preferred embodiment, Westerman said the surface would be large enough to comfortably accommodate both hands and would be arched to reduce forearm pronation. Text input, pointing, scrolling, and some media-manipulation functions would work much like Apple's existing implementations on the iPhone and iPod touch. In addition, a form of handwriting recognition would be added via a "pen grip detection module."
While in pen grip mode, the module would determine whether the hand's inner gripping fingers are actually touching the surface. If so, the module would begin to generate inking events from the path parameters of the inner fingers and append them to the outgoing event queue of the host communication interface.
"These inking events can either cause 'digital ink' to be laved on the display for drawing or signature capture purposes, or they can be intercepted by a handwriting recognition system and interpreted as gestures or language symbols," Westerman explained. "If the inner fingers are lifted, [the module] sends stylus raised events to the host communication interface to instruct the handwriting recognition system of a break between symbols."
"In some applications the user may need to indicate where the 'digital ink' or interpreted symbols are to be inserted on the display by positioning a cursor," Westerman continued. "Though on a multi-touch surface a user could move the cursor by leaving the pen grip configuration and sliding a finger chord, it is preferable to allow cursor positioning without leaving the pen grip configuration. This can be supported by generating cursor positioning events from slides of the palm heels and outer knuckles. Since normal writing motions will also include slides of the palm heels and outer knuckles, palm motions should be ignored until the inner fingers have been lifted for a few hundred milliseconds."
Should the user actually pick up a conductive stylus and attempt to write with it, the tip of the stylus would essentially take the place of the index fingertip for identification purposes, remaining at or above the vertical level of the knuckles, according to the filing. "Thus the pen grip detector can function in essentially the same way when the user writes with a stylus, except that the index fingertip path sent to the host communication interface will in actuality be caused by the stylus."
In addition to Westerman, the filing is credited to Apple employee John Elias. The concepts presented within the filing are eerily indicative of functions that would be present in a next-generation Apple Newton MessagePad or tablet slate.
Indeed, AppleInsider this past September exclusively reported on Apple's plans for such a device in a report titled "Up next for Apple: the return of the Newton." The project, upon last check, remains very much a work in progress, as it has been met with the usual assortment of bumps and bruises.
Comments
This is why the iPhone is so appealing and superior to the blizzard of "touch" phones coming out.
And this is why I am very interested in seeing them come out with a tablet. I have only recently converted to one on the Tablet Watch, and I still doubt that I would be an early adopter, but the possibilities get me excited.
I'm ready!
I am really excited to see what they have come up with. And that includes their mega-iPhone-like tablety device.
It's good to see Apple looking ahead, even if change comes as slowly as I expect. The "tactile feedback" aspect interests me.
I can't wait, this will be much better than the current Wacom Cintiq tablet/display.
Minority Report, here we come...
I just wanted to say: Go Apple!!!!
"Where is my Minority Report!?"
We're getting closer...god that'd be awesome.
*Imagines editing by cutting with my finger and dragging*
Or The Matrix Reloaded.
FYI, for those of you thinking this is the beginning of a Matrix or Minority Report interface, hate to tell you this, but those of us lucky enough to have Fingerworks boards were doing this years ago. I mean, I would be spinning windows and saving files and sending stuff off with Perl and AppleScripts with flicks of my fingers and twists of my hands; I could interact with a workstation far faster than with two hands on a keyboard and the occasional mouse movement (and this is coming from a command line junkie).
You don't know how sublime it can be to save a document just by closing your fingers, or bring up an open dialog box just by turning your fingers in one direction (like you were popping open a bottle), or send a file off somewhere by flicking your fingers in some direction... so while I'm still bitter about Apple absorbing Fingerworks and seemingly not going anywhere with them very rapidly (the stuff you can do on an iPhone with gestures was like 1% of what you could do on a Fingerworks board), I'm hoping this is the start of a return to form.
Yes, upon last check. You're not lying when you say that. But when was your last check? One of these days you'll be saying 'we told you so', maybe much sooner than later.
Steve had something to announce at Macworld that was taken out of the line up due to some problem, I'm sure. I don't think it'll last long 'til they've worked it out.
Here, thestreet.com is relaying a rumor from the site looprumors.com: they're saying a bigger, more powerful device than the iPhone, with Multi-Touch 2.0, will be available by mid-year.
If there's an event soon and such a device is announced, will you still state 'last we checked it was a work-in-progress'?! Sure, it's just wind and it's got you covered. Sort of.
We're probably a long way from replacing the tried-and-true mouse and keyboard... but someday it will happen. The same-old controls won't be the standard until the end of time.
When the first tablets were coming out waaaaay back in 1990 or so, I remember getting really excited about them. It seemed like such a perfect kind of device. I remember telling a classmate how pen input was going to be The Way. He said it wasn't gonna happen and pointed out how much faster we type than write. Turned out he was right and I was wrong. For certain things, writing is okay, and sometimes even more appropriate. But the human-computer bandwidth... it's just too low with a pen. And although I adore my iPhone, I don't think multitouch changes that enough. The only thing with higher human-computer bandwidth than a keyboard is Voice.
When voice works, the keyboard can disappear.
But I'm old enough to remember how long we've been looking forward to THAT ONE, too.
I think you're right, though, that the mouse-and-keyboard won't be the standard for eternity. Maybe half that.
At the risk of sounding like a complete geek, I'd like to point out that they had this in Star Trek: The Next Generation before The Matrix or Minority Report... Apple are a bit slow to catch on, to be honest...
The story was written in the mid-50s. Did Dick write about it back then, or was it added for the big screen?
I can't wait to see Macs with these features. Hopefully we get some insight at WWDC 2008.
Like Marshall said. I had a Fingerworks board (three, actually), and those things were beyond awesome, though the learning curve at the beginning could be pretty bad (not a lot of tactile feedback).
Still have mine. Still enjoying it. Hoping for an Apple version soon.
I always wanted one of those, and if Apple made it as a stand-alone input device (for mac and PC) I'm sure they would sell like hotcakes.
Nooooo! No soup for Windows! The Mighty Mouse doesn't have full functionality with Windows. Neither do Apple's keyboards. I'm selfish. Keep this another Mac exclusive.
Looking forward to seeing this sometime before WWDC '09. Group this kind of technology with the motion/light sensors and some low-power chips and 3G/WiMax wireless technology, and we have won.
Two of my senior projects out of school were focused on the archaic input devices we use and how our productivity and creativity have been limited by the devices, which are themselves limited.
The QWERTY layout dates back to the late 19th century, and the mouse is in its fifth decade of life... time to move on, yeah?
Your projects sound really interesting! You got them kicking about in digital form anywhere??