Apple developing configurable multi-touch gesture dictionary
In an apparent bid to extend its multi-touch technology to devices beyond the iPhone, Apple in a new patent filing describes a software-based multi-touch gesture dictionary that would let users assign custom multi-touch finger movements to computer actions.
"Many expect that the use of multi-finger, touch-sensitive user interfaces will become widely adopted for interacting with computers and other electronic devices, allowing computer input to become even more straightforward and intuitive," Apple wrote in the January 3, 2007 filing.
"Users of these multi-touch interfaces may make use of hand and finger gestures to interact with their computers in ways that a conventional mouse and keyboard cannot easily achieve. A multi-touch gesture can be as simple as using one or two fingers to trace out a particular trajectory or pattern, or as intricate as using all the fingers of both hands in a complex sequence of movements reminiscent of American Sign Language."
The Cupertino-based electronics maker further explains that each motion of the hands and fingers, whether complex or not, would convey a specific meaning or action that is acted upon by the computer or electronic device at the behest of the user:
"The number of multi-touch gestures can be quite large because of the wide range of possible motions by fingers and hands. It is conceivable that an entirely new gesture language might evolve that would allow users to convey complex meaning and commands to computers and electronic devices by moving their hands and fingers in particular patterns."
To manage the new language, Apple's patent proposal calls for a "dictionary of multi-touch gestures" that is interactively presented to a user of a computer system having a multi-touch user interface. In one embodiment, the company said the dictionary may take the form of a dedicated computer application that identifies a chord (e.g., a combination of fingers, thumbs, and/or other hand parts) presented to the multi-touch interface by the user and displays a dictionary entry for the identified chord.
A dictionary entry may include, for example, visual depictions of one or more motions that may be associated with the chord, along with the meanings of the gestures comprising the identified chord and the various motions.
"The visual depictions may take the form of motion icons having a graphical depiction of the motion and a textual description of the meaning of the gesture," Apple wrote. "The visual depictions may also take the form of animations of the one or more motions. The application could also identify one or more motions of the chord by the user and provide visual and/or audible feedback to the user indicating the gesture formed and its meaning."
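The chord-to-entry lookup the filing describes might be sketched, very roughly, as follows. All class names, fields, and gestures here are illustrative assumptions for the sake of the sketch, not structures taken from the patent:

```python
# Hypothetical sketch of a gesture dictionary keyed by chords; every name
# and data structure below is illustrative, not from Apple's filing.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Chord:
    """A combination of fingers, thumbs, and/or other hand parts."""
    parts: frozenset  # e.g. frozenset({"index", "middle"})

@dataclass
class DictionaryEntry:
    """The motions available for a chord and the meaning of each gesture."""
    motions: dict = field(default_factory=dict)  # motion name -> meaning

class GestureDictionary:
    def __init__(self):
        self._entries = {}

    def register(self, chord, motion, meaning):
        """Associate a motion (and its meaning) with a chord."""
        self._entries.setdefault(chord, DictionaryEntry()).motions[motion] = meaning

    def entry_for(self, chord):
        """Return the dictionary entry for an identified chord, if any."""
        return self._entries.get(chord)

# A two-finger chord with a couple of motions, as a motion-icon display might list.
d = GestureDictionary()
two_finger = Chord(frozenset({"index", "middle"}))
d.register(two_finger, "swipe_left", "go back")
d.register(two_finger, "swipe_right", "go forward")

entry = d.entry_for(two_finger)
print(entry.motions["swipe_left"])  # -> go back
```

In a real system the chord would of course be identified from sensor input rather than constructed by hand; this only illustrates the dictionary-entry indirection the filing revolves around.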
In another embodiment, Apple said a dictionary application could run in the background while other applications on the computer system are in use. In this scenario, if a user presents a chord associated with a gesture but does not perform the motion completing the gesture, the dictionary application can present a dictionary entry for the presented chord.
Apple explains: "As in other embodiments, the dictionary entry may include visual depictions of one or more motions and meanings of the gestures comprising the identified chord and the various motions. Also as in other embodiments, the visual depictions may take the form of motion icons or animations of the motions. A user guided by the dictionary entry may perform a motion completing a gesture, and the system may execute a meaning of the gesture and may also provide visual and/or audible feedback indicating the meaning of the gesture."
In yet another approach to its dictionary concept, Apple describes an interactive computer application that allows a user to assign meanings to multi-touch gestures. The computer application may display a dictionary entry (like those described above, for example) and accept inputs from the user to assign a meaning to one or more of the gestures in the dictionary entry.
Apple further explained that the application could be used to assign meanings to gestures that have no default meanings selected by a system designer, or to change the meanings of gestures that do. The application, the company added, may also include program logic to present the motions that are easier to perform in a different form from those that are more difficult. Alternatively, the more difficult motions may not be displayed at all.
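That customization behavior, assigning missing defaults, overriding designer-chosen ones, and hiding harder motions, could look something like the following sketch. The difficulty labels and the API are assumptions made for illustration, not details from the filing:

```python
# Illustrative sketch of user-assignable gesture meanings with
# difficulty-based filtering; difficulty labels and method names are assumed.
EASY, HARD = "easy", "hard"

class CustomizableDictionaryEntry:
    def __init__(self, motions):
        # motion name -> (default meaning or None, difficulty)
        self._motions = dict(motions)
        self._overrides = {}

    def assign(self, motion, meaning):
        """Let the user assign or change the meaning of a motion."""
        if motion not in self._motions:
            raise KeyError(motion)
        self._overrides[motion] = meaning

    def meaning(self, motion):
        """User-assigned meaning wins over the designer's default."""
        default, _ = self._motions[motion]
        return self._overrides.get(motion, default)

    def displayed_motions(self, hide_difficult=True):
        """Optionally omit the motions that are harder to perform."""
        return [m for m, (_, diff) in self._motions.items()
                if not (hide_difficult and diff == HARD)]

entry = CustomizableDictionaryEntry({
    "tap": ("select", EASY),
    "rotate": (None, HARD),      # no default meaning from the designer
})
entry.assign("rotate", "zoom")   # user fills in a missing default
entry.assign("tap", "open")      # user overrides a designer default
```

The override dictionary keeps the designer's defaults intact, so a "reset to defaults" action would only need to clear `_overrides`.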
Finally, Apple also explains that the gesture dictionary applications could be triggered by events other than presentation of a chord:
"These events may include hand parts hovering over a multi-touch surface, audible events (for example, voice commands), activation of one or more buttons on a device, or applying a force and/or touch to a force and/or touch sensitive portion of a device. These events may correspond to chords and invoke a dictionary entry corresponding to such a chord. Alternatively or additionally, these events may correspond to other groupings of gestures not based on chords, such as custom dictionary entries. In yet another variation, the event triggering a gesture dictionary application may not correspond to a gesture grouping at all. In these cases, a dictionary index may be invoked, allowing a user to select from a plurality of dictionary entries."
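The dispatch the filing outlines, where some triggering events map to a chord's entry, some to non-chord groupings, and the rest fall back to a browsable index, might be sketched as below. The event fields and data layout are hypothetical:

```python
# Rough sketch of event-triggered dictionary invocation: chord events,
# non-chord groupings, and a fallback index. All mappings are hypothetical.
def invoke_dictionary(event, chord_entries, custom_entries):
    """Return the entry (or index of entry names) a triggering event selects."""
    if event.get("chord") is not None:
        # e.g. a hover, button press, or force touch corresponding to a chord
        return chord_entries.get(event["chord"])
    if event.get("grouping") is not None:
        # a grouping not based on chords, such as a custom dictionary entry
        return custom_entries.get(event["grouping"])
    # No gesture grouping at all: present the full index to choose from.
    return sorted(chord_entries) + sorted(custom_entries)

chord_entries = {"two_finger": ["swipe: scroll"]}
custom_entries = {"voice_gestures": ["say 'copy': copy selection"]}

print(invoke_dictionary({"chord": "two_finger"}, chord_entries, custom_entries))
print(invoke_dictionary({}, chord_entries, custom_entries))
```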
According to the filing -- credited to Apple employees John Elias, Wayne Westerman, and Myra Haggerty -- the multi-touch gesture dictionary concept could be deployed on a notebook computer, tablet computer, handheld computer, personal digital assistant, media player, mobile telephone, or other consumer electronics devices of similar nature.
Comments
Self-programmable keyboards are on the way.
I would love to say... I told you so!
There were some doubters out there, but it is obvious now: multitouch is here to stay and grow.
Self-programmable keyboards are on the way.
I would love to say... I told you so!
I would love to say--"You told us so" (as did others like kolchak).
Bring on the multi-touch ultralights, "regular" and tablets.
Fujitsu, Sony, Toshiba, Panasonic, Asus, HP...., beware--the Mac MTULs with OS X are coming. As Japan resident and poster ElectricMonk keeps saying, if AAPL makes these 1-1.5 kg units, Japan goes from wasteland to boomtown. And, for those who don't venture outside the Apple mothership, ULs are not a niche market--just take a stroll around Wall Street or go to a large science or business conference-- ULs on parade.
http://forums.appleinsider.com/showt...ght=multitouch
According to the filing -- credited to Apple employees John Elias, Wayne Westerman, and Myra Haggerty -- the multi-touch gesture dictionary concept could be deployed on a notebook computer, tablet computer, handheld computer, personal digital assistant, media player, mobile telephone, or other consumer electronics devices of similar nature.
Multi-touch Ultraportable or iMac anyone?
the multi-touch gesture dictionary concept could be deployed on a notebook computer, tablet computer, handheld computer, personal digital assistant, media player, mobile telephone, or other consumer electronics devices of similar nature.
I know they're just covering their arses with this patent filing, but I'm betting on multi-touch on iPods this fall, and on Macs next year (or 2009 at the latest)...
Or are we looking at a revival of the Newton??
Apple would blow everyone away if next Tuesday's iMac update implemented multi-touch - but I think it's still too early for that.
Bring it on. LOL
Multi-touch Ultraportable or iMac anyone?
I'll take two of each!
There were some doubters out there, but it is obvious now: multitouch is here to stay and grow.
Self-programmable keyboards are on the way.
I would love to say... I told you so!
So would I.
I'll take two of each!
I'd prefer a minitower but I guess I could settle for a multi-touch iMac (possibly upgradeable? )
So much wishing so little time
Many years ago, Mentor Graphics started using "strokes" (or mouse gestures) to define functions instead of mouse clicks in their EDA schematic capture and layout tools. It's very effective and saves a ton of time. However, instead of a finger, you use your mouse.
If they add method to select a piece of text (or maybe this exists already on the iPhone), then they have a way now to implement copy/paste on it.
I take it you didn't look at the images?
I take it you didn't look at the images?
I did take a look, in fact these images were my inspiration, particularly the cut, copy, and paste gestures. I simply could not find a gesture for the selecting action, which has to precede any cut or copy action.
Many years ago, Mentor Graphics started using "strokes" (or mouse gestures) to define functions instead of mouse clicks in their EDA schematic capture and layout tools. It's very effective and saves a ton of time. However, instead of a finger, you use your mouse.
That would qualify as different, however, there could be overlapping patent issues.
That would qualify as different, however, there could be overlapping patent issues.
Apple seems to be going much further though. Mentor's concept was just for specific usage. Apple is using this as a general user input device interface. Not what Mentor envisioned. Also using an older patented idea for a newer purpose can be justified. It depends on the new restrictions about that use just handed down from the Supreme Court.