Apple exploring Kinect-like iPhone motion sensor, force-detecting MacBook trackpad

Posted in Future Apple Hardware, edited March 2014
Two new Apple patent applications published Thursday shed light on the company's research into bringing three-dimensional imaging to mobile devices, as well as enabling more immersive ways to interact with laptop computers.

Patent Drawing


The first application, dubbed "Imaging Range Finding Device and Method," details a system similar to Microsoft's Kinect motion sensor. An array of light emitters and photodetectors is situated behind an optical lens, which directs light toward a target object.

Light from the device that bounces back through the lens -- as well as light emitted by the object itself -- can be analyzed to determine the object's size and location. The addition of one or more moveable lenses increases the system's accuracy, according to the filing, and saves power while enabling use of the system in adverse environments such as rooms filled with smoke or fog.
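The application does not publish its math, but the emitter-and-photodetector arrangement reads like a time-of-flight setup, where distance falls out of how long a light pulse takes to come back. Here is a minimal Swift sketch of that idea, purely illustrative -- the type and property names below are assumptions, not language from the filing:

    import Foundation

    // Illustrative time-of-flight sketch: distance is inferred from how long an
    // emitted pulse takes to reach the photodetector. All names are hypothetical.
    struct RangeSample {
        let emissionTime: TimeInterval    // when the emitter fired, in seconds
        let detectionTime: TimeInterval   // when the return pulse was detected
    }

    func estimateDistance(from sample: RangeSample) -> Double {
        let speedOfLight = 299_792_458.0                 // meters per second
        let roundTrip = sample.detectionTime - sample.emissionTime
        return speedOfLight * roundTrip / 2.0            // halve for the return leg
    }

    // A return seen roughly 6.67 nanoseconds after emission sits about one meter away.
    let sample = RangeSample(emissionTime: 0, detectionTime: 6.67e-9)
    print(estimateDistance(from: sample))                // ≈ 1.0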

Apple envisions a wide variety of uses for the range finder:
  • to scan and map interior space;
  • for 3D object scanning and pattern matching;
  • as a navigation aid for the visually impaired to detect landmarks, stairs, low tolerances, and the like;
  • as a communication aid for the deaf to recognize and interpret sign language for a hearing user;
  • for automatic foreground/background segmentation;
  • for real-time motion capture and avatar generation;
  • for photo editing;
  • for night vision;
  • to see through opaque or cloudy environments, such as fog, smoke, or haze;
  • for computational imaging, such as changing focus and illumination after acquiring images and video;
  • for autofocus and flash metering;
  • for same-space detection of another device;
  • for two-way communication;
  • for secure file transfers;
  • to locate people or objects in a room;
  • and to capture remote sounds.
The system is similar to technology developed by Israeli outfit PrimeSense, which Apple acquired last year for $360 million. Prior to the acquisition, PrimeSense was working on a 3D sensor called the Capri 1.25, which was small enough to be integrated into a mobile device.

Apple credits Matthew Emanuel Last of Davis, Calif., as the inventor of U.S. Patent Application No. 61698375.

Patent Drawing


The second application, "Optical Sensing Mechanisms for Input Devices," reveals a new type of trackpad whose position is determined by reflected light. The trackpad itself would be free to move along multiple planes, behaving much like a large, flat joystick.

An optical sensing mechanism would detect the trackpad's vertical, lateral, and angular displacement, as well as the velocity, force, acceleration, and pressure of its movements. Those parameters would be used to recognize gestures and trigger actions based on different forces, velocities, or pressures.

Users could, for instance, instruct the operating system to prioritize one task over others by clicking an icon "harder." They could also define different profiles that would allow the same gesture to perform multiple actions depending on the force with which the gesture was executed.
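Neither application includes code, but the force-profile idea maps naturally onto a simple lookup from measured force to action. A hedged Swift sketch of that mapping -- the threshold, type names, and actions here are invented for illustration, not drawn from the filing:

    // Hypothetical sketch of force-dependent actions for a single click gesture.
    // The threshold value, enum cases, and profile type are assumptions.
    enum ClickAction {
        case open               // an ordinary click launches the app
        case openAndPrioritize  // a "harder" click also bumps the task's priority
    }

    struct ForceProfile {
        let hardClickThreshold: Double  // normalized 0...1 force units (assumed)

        func action(forForce force: Double) -> ClickAction {
            force >= hardClickThreshold ? .openAndPrioritize : .open
        }
    }

    let profile = ForceProfile(hardClickThreshold: 0.7)
    print(profile.action(forForce: 0.9))  // openAndPrioritize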

Additionally, combining the various input parameters could eliminate the need for multiple independent control surfaces, such as buttons on a gamepad:
Consider, for example, a user playing an auto racing simulation. A track pad, as described herein, may be used to control and/or coordinate several aspects of the game, generating simultaneous inputs. Lateral motion of the track pad may steer the vehicle. Force exerted on the track pad (downward force, for example) may control acceleration. Finger motion on a capacitive-sensing (or other touch-sensing) surface may control the view of the user as rendered on an associated display. Some embodiments may even further refine coordinated inputs. Taking the example above, downward force at a first edge or area of the track pad may control acceleration while downward force at a second edge or area may control braking.
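One way to picture the quoted racing example is a single trackpad reading decomposed into several simultaneous game inputs. A speculative Swift sketch follows; the structures, thresholds, and the front/rear split are assumptions layered on top of the passage above:

    // Speculative mapping of one trackpad reading onto simultaneous racing inputs.
    struct TrackpadState {
        let lateralOffset: Double   // -1 (far left) ... 1 (far right)
        let downwardForce: Double   // normalized 0...1
        let forceLocation: Double   // 0 = front edge of the pad, 1 = rear edge
    }

    struct RacingInput {
        let steering: Double
        let throttle: Double
        let brake: Double
    }

    func racingInput(from state: TrackpadState) -> RacingInput {
        // Lateral motion steers; where the downward force lands decides whether
        // it reads as throttle (front half of the pad) or brake (rear half).
        let pressingFrontHalf = state.forceLocation < 0.5
        return RacingInput(
            steering: state.lateralOffset,
            throttle: pressingFrontHalf ? state.downwardForce : 0,
            brake: pressingFrontHalf ? 0 : state.downwardForce
        )
    }

    let input = racingInput(from: TrackpadState(lateralOffset: 0.2, downwardForce: 0.8, forceLocation: 0.9))
    print(input.brake)  // 0.8 -- force at the rear edge reads as braking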
Apple credits Joel S. Armstrong-Muntner of San Mateo, Calif., as the inventor of U.S. Patent Application No. 61700767.

Comments

  • Reply 1 of 12
    SpamSandwich Posts: 33,407 member
    The patent applications will continue!
  • Reply 2 of 12
    flaneur Posts: 4,526 member
    Seems like a new era in interfaces is on its way. Fingers will interact with light in 3D instead of capacitance in 2D. Precision in pointing and manipulating screen objects should increase considerably.
  • Reply 3 of 12
    tylerk36 Posts: 1,037 member

    A new concept for online dating.  Imagine the possibilities?

  • Reply 4 of 12
    crysisftw
    I hope Apple won't add any features that make us look like ecstatic kids who have discovered a new trick, flapping their arms about.

    Actually I never could understand the benefit of gestures (unless you want to look ridiculous). I couldn't understand why you couldn't just flick the screen to browse through images, rather than waving your hands about. I also couldn't understand why you have to perform a ritual dance in front of a TV to change the channel or volume, when all that's required is a lazy touch of a button on the remote controller.

    I really don't understand gestures (I ask this honestly). Can someone kindly ruminate on this?

  • Reply 5 of 12
    flaneur Posts: 4,526 member
    crysisftw wrote: »
    I hope Apple won't add any features that make us look like ecstatic kids who have discovered a new trick, flapping their arms about.

    Actually I never could understand the benefit of gestures (unless you want to look ridiculous). I couldn't understand why you couldn't just flick the screen to browse through images, rather than waving your hands about. I also couldn't understand why you have to perform a ritual dance in front of a TV to change the channel or volume, when all that's required is a lazy touch of a button on the remote controller.

    I really don't understand gestures (I ask this honestly). Can someone kindly ruminate on this?

    Here's one rumination: We're talking about fingers over a sensor here, not arm waving. Buttons have the same disadvantage as they did for phones—hard to learn, locked-in, fiddle-y, non-reprogrammable. Also, they're two-dimensional, a minefield of error opportunities because they're spread out in a field.

    This gesture-based sensor technology promises a programmable language based on multidimensional imaginary tools, instead of hardware buttons.
  • Reply 6 of 12
    jungmark Posts: 6,926 member
    crysisftw wrote: »

    Actually I never could understand the benefit of gestures (unless you want to look ridiculous).
    Obviously you've never been in traffic around Philly during rush hour. Oh, not those gestures.
  • Reply 7 of 12
    ochyming Posts: 474 member
    Quote:
    Originally Posted by tylerk36

    A new concept for online dating. Imagine the possibilities?

    naughty thought.

  • Reply 8 of 12
    tylerk36 wrote: »
    A new concept for online dating.  Imagine the possibilities?

    Ha!

    Patently Apple has an article today:

    Apple Reveals New Camera Lens Attachment System for iDevices

    http://www.patentlyapple.com/patently-apple/2014/03/apple-reveals-new-camera-lens-attachment-system-for-idevices.html#more


    In this article they discuss ways to securely attach external lenses to devices like iPhones -- while, at the same time, allowing the lenses to detach themselves to minimize damage [to both the lens and the iPhone] if the device is dropped.

    Here is how they describe this "drop event" in patent-speak:
    Additionally, the force required to separate the attachment mechanisms without rotation may be adjusted by changing a stiffness of the compliance member. In this regard, a softer material may decrease the force required to separate the attachment mechanisms without rotation. Conversely a harder material may increase the force required to separate the attachment mechanisms without rotation. Adjustment of the stiffness of the compliance member may also affect the torque required to secure and release the attachment mechanisms via rotation in the same manner. In this regard, a stiffer compliance member may require more torque for rotational attachment and release, and a softer compliance member may require less torque for rotational attachment and release.
  • Reply 9 of 12
    flaneur wrote: »
    Seems like a new era in interfaces is on its way. Fingers will interact with light in 3D instead of capacitance in 2D. Precision in pointing and manipulating screen objects should increase considerably.

    Something like the device interactions portrayed in the first few seconds of this video:


    http://player.vimeo.com/video/86946802?autoplay=1&wmode=opaque
  • Reply 10 of 12
    jungmark wrote: »
    crysisftw wrote: »

    Actually I never could understand the benefit of gestures (unless you want to look ridiculous).
    Obviously you've never been in traffic around Philly during rush hour. Oh, not those gestures.

    Ahh ... The Crossovers :D
  • Reply 11 of 12
    mrsneezy
    Can't wait till we have real hand gestures to control our computers (NCIS LA style).
    No more keyboards soon, I hope.
  • Reply 12 of 12
    SpamSandwich Posts: 33,407 member
    mrsneezy wrote: »
    Can't wait till we have real hand gestures to control our computers (NCIS LA style).
    No more keyboards soon, I hope.

    Sorry, but keyboards have many advantages over voice and gesture input for a whole host of tasks, which is not to say that new methods are not a good idea, but let the tool fit the task.