Apple's "Methods and apparatus for processing combinations of kinematical inputs" describes a system that translates a variety of force and velocity data detected by an input device into commands for a computer's user interface, such as control of a mouse cursor.
The invention calls for an input device with one or more motion sensors to feed a receiving system adapted to convert gesture data into GUI navigation commands.
To achieve precise calculations, the input device can employ any of a number of motion sensors, including gyroscopes, optical sensors and accelerometers. By taking force and vector data from the device, the receiving computer generates a "gesture profile" that is associated with a certain system command. For example, if a user slides the input device across a plane or object, the cursor moves in that direction.
Two gestures can be combined. In the example below, a sliding gesture is performed in conjunction with a tilt gesture, enabling a different UI command than a simple lateral movement of the device would.
Illustration of "sliding" gesture with tilt.
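One way to picture the combined-gesture idea is to treat a set of simultaneous component gestures as a single compound profile, so that slide-plus-tilt looks up a different command than slide alone. The sketch below is purely illustrative; the profile representation and command names are assumptions, not details from the filing.

```python
# Illustrative sketch: simultaneous component gestures form one compound
# profile, so slide+tilt maps to a different command than slide alone.
# The command names here are hypothetical.

def gesture_profile(components):
    """Normalize a set of simultaneous gestures into a hashable, order-independent key."""
    return frozenset(components)

COMMANDS = {
    gesture_profile(["slide"]): "move-cursor",
    gesture_profile(["slide", "tilt"]): "switch-desktop",  # assumed command
}

print(COMMANDS[gesture_profile(["slide"])])          # move-cursor
print(COMMANDS[gesture_profile(["tilt", "slide"])])  # same profile regardless of order
```

Using a `frozenset` makes the lookup order-independent, which matches the idea that the two component motions occur together rather than in sequence.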
The system is programmable, meaning different gestures can be assigned to various commands. A "nudge" gesture, for example, can wake the input device from sleep mode or move the cursor slightly in one direction. Tilting, tapping and other gravity-based gestures are also supported and can likewise trigger any number of UI commands on a computer's screen.
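The programmable mapping described above can be pictured as a simple registry that binds named gesture profiles to reassignable command callbacks. This is a minimal sketch of that idea, assuming hypothetical names throughout; it is not Apple's implementation.

```python
# Minimal sketch of a programmable gesture-to-command registry.
# All names (GestureMapper, "nudge", the command strings) are hypothetical,
# illustrating how gesture profiles could be reassigned to different commands.

class GestureMapper:
    def __init__(self):
        self._bindings = {}

    def assign(self, gesture, command):
        """Bind (or rebind) a gesture profile name to a UI command callback."""
        self._bindings[gesture] = command

    def dispatch(self, gesture):
        """Invoke the command for a recognized gesture; return None if unbound."""
        command = self._bindings.get(gesture)
        return command() if command is not None else None

mapper = GestureMapper()
mapper.assign("nudge", lambda: "wake-from-sleep")
mapper.assign("tilt", lambda: "secondary-click")

print(mapper.dispatch("nudge"))  # wake-from-sleep
```

Because bindings are plain dictionary entries, reassigning a gesture to a new command is a single `assign` call, which mirrors the filing's point that the mapping is user-programmable.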
Gesture profiles can be multi-step operations. In the example below, the input device starts at the left at a speed of 0 meters per second. Once the start motion is detected, a minimum velocity must be reached to enable the gesture. When the magnitude of the force vector returns to zero, that is, when the unit is lifted off the desk, the second gesture condition is satisfied and the command is triggered.
Illustration of "brush" gesture triggered by two-step enablement.
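The two-step enablement can be sketched as a small state machine: the gesture arms once velocity exceeds a minimum threshold, and fires when the measured force magnitude drops to zero (lift-off). The threshold value and function names below are assumptions for illustration, not figures from the patent.

```python
# Hypothetical sketch of a two-step "brush" gesture recognizer:
# step 1 arms the gesture when a minimum velocity is reached,
# step 2 triggers it when force magnitude falls to zero (lift-off).

MIN_VELOCITY = 0.2  # meters/second; illustrative threshold, not from the filing

def detect_brush(samples, min_velocity=MIN_VELOCITY):
    """samples: sequence of (velocity, force_magnitude) sensor readings.
    Returns True once both gesture conditions are satisfied in order."""
    armed = False
    for velocity, force in samples:
        if not armed and velocity >= min_velocity:
            armed = True   # step 1: minimum velocity reached
        if armed and force == 0:
            return True    # step 2: device lifted off the surface
    return False

# Device slides (velocity rises past threshold), then is lifted (force hits zero):
print(detect_brush([(0.0, 1.0), (0.3, 1.0), (0.4, 0.0)]))  # True
# Device lifted without first reaching minimum velocity:
print(detect_brush([(0.0, 1.0), (0.1, 0.0)]))              # False
```

The ordering matters: a lift-off before the velocity threshold is reached does not trigger the command, matching the article's description of the two-step condition.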
It appears from the patent drawings that Apple could incorporate the additional motion gesture controls into an upcoming mouse. While just speculation, the iPhone and iPod lineups could also serve as input devices, as they carry integrated accelerometers, gyroscopes and imaging sensors. Existing apps in the iOS ecosystem already "transform" iDevices into usable input peripherals, like R.P.A.Tech's Mobile Mouse, but Apple may one day use the invention's technology to offer a built-in first-party solution.
Apple's patent application was first filed in April and credits Jean L. Lee as its inventor.