Apple exploring 3D gestures to control devices from a distance

Posted in Future Apple Hardware, edited January 2014
Apple is investigating new ways of interacting with devices, such as using hand gestures to navigate and control a video recording system without touching anything.



Apple's interest in hands-off control of a device like an iPhone, iPad or Mac was revealed this week in a new patent application made public by the U.S. Patent and Trademark Office. Entitled "Real Time Video Process Control Using Gestures," the filing, discovered by AppleInsider, is related to remotely controlling and editing video recordings on a mobile device.



Such editing could be done with gestures on a touchscreen, much as is already possible on the iPhone and iPad. But within the application, Apple also makes mention of hand gestures that can be performed without touching the device.



The filing notes that a device could be controlled with hand gestures accomplished in either two or three dimensions, and these could be interpreted through infrared sensors, optical sensors, or other methods. These gestures could be used as a replacement for, or even in concert with, traditional touchscreen-based gestures.
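
To make the idea concrete, here is a minimal, hypothetical sketch in Swift of how touch and in-air gestures could feed the same set of recording commands. None of these type names come from the filing; they are invented purely for illustration.

// Hypothetical sketch: in-air and touch gestures feeding one command stream.
// These types are not from the patent; they only illustrate the idea that a
// 3D gesture can map to the same instruction a touch gesture would.

enum GestureSource {
    case touchscreen          // traditional on-screen gesture
    case infraredSensor       // in-air gesture seen by an IR sensor
    case opticalSensor        // in-air gesture seen by a camera
}

enum GestureKind {
    case swipeLeft, swipeRight, pinchIn, pinchOut, hoverHold
}

enum RecordingCommand {
    case startRecording, stopRecording, zoomIn, zoomOut, showMenu
}

struct GestureEvent {
    let source: GestureSource
    let kind: GestureKind
}

// Maps a gesture to a recording command, regardless of whether it was
// performed on the screen or in the air.
func command(for event: GestureEvent) -> RecordingCommand {
    switch event.kind {
    case .pinchOut:   return .zoomIn
    case .pinchIn:    return .zoomOut
    case .swipeRight: return .startRecording
    case .swipeLeft:  return .stopRecording
    case .hoverHold:  return .showMenu
    }
}

let airPinch = GestureEvent(source: .infraredSensor, kind: .pinchOut)
print(command(for: airPinch))   // zoomIn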



"As with the touch based gestures applied on or near the touch sensitive input device, the hand gestures can be interpreted to provide instructions for real time processing of the video by the video capture device," the filing reads.



Apple's goal is to simplify and minimize the need for user input, in part because recording devices like the iPhone and iPad have become so small. The filing notes that placing a finger on a touch-sensitive display can cause a video capture device to move, and that movement is then translated to the video recording.



With Apple's method, a remote camera could be controlled wirelessly from a second, separate device. The iPhone and iPad are specifically mentioned in the filing as potential options for a "control device."







One image accompanying the application shows a video being recorded on an iPhone. That video is then transmitted wirelessly, via Bluetooth, to an iPad, where the user can view the video in real-time and make adjustments.
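
As an illustration of how such remote adjustments might be represented, the following hypothetical Swift sketch encodes a control message that the viewing device could send back to the capture device. The filing does not describe a wire format; the structure and field names here are assumptions.

import Foundation

// Hypothetical control message sent from the viewing device (e.g. an iPad)
// back to the capture device (e.g. an iPhone) over a wireless link.
// The field names are invented; the patent does not specify a wire format.
struct CaptureControlMessage: Codable {
    enum Action: String, Codable {
        case zoom, pan, startRecording, stopRecording
    }
    let action: Action
    let magnitude: Double      // e.g. zoom factor or pan distance
    let timestamp: Date        // when the adjustment was requested
}

// Encoding the message as JSON before handing it to the transport layer
// (Bluetooth, Wi-Fi, or anything else) keeps the sketch transport-agnostic.
let message = CaptureControlMessage(action: .zoom, magnitude: 1.5, timestamp: Date())
if let payload = try? JSONEncoder().encode(message) {
    print("Encoded \(payload.count) bytes for transmission")
}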



Given the volume of data that must be wirelessly transmitted, Apple's solution is to automate real-time video processing as much as possible, identifying objects and individual people's faces captured in a video. The filing even states that a system could help to determine how entities captured in the video relate to one another.



In one example provided, a video of two tennis players playing against each other could be analyzed to have a "negative correlation," as one player is hitting the ball while the other is not.



"Therefore, by determining the relative correlation between these two players, an implicit association can be assigned to each," the application reads.



Using this kind of data, the image could be framed according to user specifications. For example, after recognizing a specific face, a video capture device could zoom in and track that individual in real time, with minimal or no input from the user.
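
A rough sketch of that framing step might look like the following, assuming a detector has already produced a face bounding box. The padding factor and function name are arbitrary and not taken from the filing.

import CoreGraphics

// Given a detected face rectangle and the full frame, returns a crop
// rectangle that keeps the face centered with some padding around it.
// The 2.5x padding factor is arbitrary; the patent does not specify one.
func framingRect(forFace face: CGRect, in frame: CGRect, padding: CGFloat = 2.5) -> CGRect {
    let side = max(face.width, face.height) * padding
    var crop = CGRect(x: face.midX - side / 2,
                      y: face.midY - side / 2,
                      width: side,
                      height: side)
    // Clamp the crop so it never leaves the recorded frame.
    crop = crop.intersection(frame)
    return crop
}

// Example: a face detected near the right edge of a 1920x1080 frame.
let frame = CGRect(x: 0, y: 0, width: 1920, height: 1080)
let face  = CGRect(x: 1500, y: 300, width: 200, height: 200)
print(framingRect(forFace: face, in: frame))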



Apple's proposed invention, published this week by the USPTO, was originally filed in April 2010. It is credited to Benjamin A. Rottler and Michael Ingrassia Jr.

Comments

  • Reply 1 of 17
    ascii Posts: 5,936 member
    Of course computers recognising gestures with a camera is old news. But the idea of using an infrared sensor is a good one, the hands should stand out more than in the ordinary spectrum.
  • Reply 2 of 17
    MacPro Posts: 19,727 member
    I wonder if Apple regret turning down the Kinect opportunity they were offered before Microsoft.
  • Reply 3 of 17
    jensonb Posts: 532 member
    Oh god please no. That kind of interface is terrible.
  • Reply 4 of 17
    This is it for the Apple HDTV next year.



    Of course, there could be:



    "Oh, I'm tired, (stretching arms over head)."



    Whoops, the channel changed..
  • Reply 5 of 17
    Just like an average human conversation, the device could be developed to blend a combination of controls from touch, gesture and voice, which could provide much finer combinations of commands for the device to interpret and respond to. For example, muting could be done by touch, voice or a gesture (a slash across the throat). The database for this would have to be fairly robust and would require contextuality an order of magnitude higher than voice control currently offers. You would also have to decide if it will support traditional culturally derived gestures, or a subset of common/universal gestures. Intriguing concept!
  • Reply 6 of 17
    are they trying to patent Kinect?
  • Reply 7 of 17
    MacProMacPro Posts: 19,727member
    Quote:
    Originally Posted by [email protected] View Post


    This is it for the Apple HDTV next year.



    Of course, there could be:



    "Oh, I'm tired, (stretching arms over head)."



    Whoops, the channel changed..



    Or ...



    "George. why is it every time you are alone with my sister in there the TV changes to the Playboy channel?"
  • Reply 8 of 17
    pendergast Posts: 1,358 member
    Quote:
    Originally Posted by Ricardo Dawkins View Post


    are they trying to patent Kinect ?



    I don't think it works the same way.



    It's like saying all touchscreens are the same; there's pressure, capacitive, etc.
  • Reply 9 of 17
    eriamjh Posts: 1,642 member
    Quote:
    Originally Posted by digitalclips View Post


    Or ...



    "George. why is it every time you are alone with my sister in there the TV changes to the Playboy channel?"



    The infrared sensor is detecting more than just his hand!
  • Reply 10 of 17
    I recall reading the book "I, Robot," in which the owners of robots would interact with them by using gestures. The future just got a little closer.
  • Reply 11 of 17
    I've always thought that this was the next big input method after touch (or rather, complementing touch). Capacitive touch is intuitive, responsive, and fast. But it's not very precise, and there's limited opportunity for secondary input.



    For example, there's no varied input, like a pressure-sensitive touchscreen or the trigger on a game controller. There's no equivalent of a "mouse over" to bring up help, explanations, etc. Apple's been pretty clever about stretching Multitouch as far as it can go: swiping, multi-finger swiping, pinching, long-hold, double-tap, etc.



    But it opens up even more possibilities if you could, say, bring up a menu simply by floating your hand for a moment, then use touch to select. Or use depth (distance) to convey extra precision. This would be especially helpful if Apple makes the jump to Multitouch desktops, TVs, etc.
  • Reply 12 of 17
  • Reply 13 of 17
    nagromme Posts: 2,834 member
    Gestures take both time and energy, especially in the air with no resting surface. They're cool, but I think they'll remain a special-use interface, never becoming a primary control method. What special uses? Games, of course! Also optional shortcuts for certain things, and a TV/sofa setup would make sense for that.



    I hope nobody tries to patent my special gesture for when the SAME AD comes on for the 5th time today!
  • Reply 14 of 17
    dobby Posts: 797 member
    Could make things interesting for the .xxx sites and their channels. Certain body movements could search for the channels you're interested in.
  • Reply 15 of 17
    This is obviously for a future TV set, not for the iPad.



    ...and thank god... I had been hoping they would allow air gestures for scrolling through program guides etc. instead of relying on voice input, and this shows that gestures will indeed be part of the interface - woot!
  • Reply 16 of 17
    Quote:
    Originally Posted by Pendergast View Post


    I don't think it works the same way.



    It's like saying all touchscreens are the same; there's pressure, capacitive, etc.



    According to the little that's in the article, it would be using the same type of sensors: "and these could be interpreted through infrared sensors, optical sensors, or other methods."
  • Reply 17 of 17
    philboogie Posts: 7,675 member
    Quote:
    Originally Posted by Eriamjh View Post


    The infrared sensor is detecting more than just his hand!



    Pointing up one big finger won't happen, as the Apple TV won't display that kind of content. Same as in the App Store.