Apple confirms PrimeSense acquisition, deal rumored to be worth $360M

Comments

  • Reply 41 of 78
    ireland wrote: »
    nikilok wrote: »
    And does their in-house sensor perform as well as PrimeSense's sensor?

    And more importantly, would Microsoft's in-house sensor technology come out as a copy of PrimeSense's tech?


    Does PrimeSense hold patents to their tech?

    Patents are the worst thing ever to happen to this industry.

    What an appallingly uninformed assertion! Yikes....
  • Reply 42 of 78
    gatorguy Posts: 24,213, member

    It is also a bonus negative impact on their competitors, with Apple grabbing all of the technology and patents and taking this company's products off the market.

    As IP goes, Intel may have as much in the way of 3D gesture patents as anyone, and certainly much more than PrimeSense. Look for Intel-based PCs with gesture control hitting store shelves next year.
    http://gigaom.com/2013/09/05/intels-gesture-control-tech-will-be-built-into-pcs-from-2014-but-heres-how-to-try-it-sooner/

    All in all, Apple is a bit late to the party, but with deeper pockets than any of the others they could certainly catch up quickly if it's a priority. They'll probably need to license IP from a few of their tech neighbors tho. Companies like MS, Intel and Qualcomm (perhaps even Google) have a significant head start.
  • Reply 43 of 78
    Marvin Posts: 15,323, moderator
    ascii wrote: »
    I wonder what they will use that technology for? The Apple TV of course, but maybe the iMac too.

    There are loads of potential uses for 3D sensors as they give computers depth perception.

    They can use them for the camera so that when you take a photo, it takes a snapshot of the depth information in the scene, allowing you to adjust focal blur after taking the picture.
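    A toy sketch of that refocus idea in Swift (all names and numbers invented): with depth stored per pixel, refocusing is just recomputing each pixel's blur radius for a focal distance chosen after the shot.
    ```swift
    import Foundation

    // Toy sketch only: a real pipeline would run on the GPU with a proper
    // lens model. The point is that storing depth per pixel lets you pick
    // the focal plane after the shot and recompute blur from it.
    struct DepthPhoto {
        let depth: [Double] // metres, one entry per pixel, row-major

        /// Per-pixel blur radius (pixels) for a focus distance picked later.
        /// Pixels on the focal plane get ~0 blur; blur grows with depth offset.
        func blurMap(focusDistance: Double, aperture: Double) -> [Double] {
            depth.map { d in aperture * abs(d - focusDistance) / max(d, 0.001) }
        }
    }

    let photo = DepthPhoto(depth: [0.5, 1.0, 2.0, 4.0])
    print(photo.blurMap(focusDistance: 1.0, aperture: 10)) // sharp at 1 m
    print(photo.blurMap(focusDistance: 4.0, aperture: 10)) // sharp at 4 m
    ```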
    This also allows better augmented reality, and it could even be used as an assistive device for the blind: it can detect shapes and fast-moving vehicles and talk in the Siri voice to describe what's ahead via shape recognition.
    App developers could make apps that measure objects: if you have to ship a parcel, for example, you'd just point the iPhone at it and it would tell you length, width, and height without a tape measure. It could be used for measuring clothing and shoe sizes too: have someone point the sensor at you and it can tell your leg length; walk around and it can tell chest size, neck size, etc.
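    A rough sketch of the parcel measurement, assuming the sensor hands us a point cloud (types and data invented):
    ```swift
    import Foundation

    // Invented sketch: measure a parcel by taking the axis-aligned bounding
    // box of the scanned points. A real app would first segment the parcel
    // from the background and correct for its orientation.
    struct Point3D { let x, y, z: Double }

    func dimensions(of cloud: [Point3D]) -> (l: Double, w: Double, h: Double)? {
        guard let f = cloud.first else { return nil }
        var lo = f, hi = f
        for p in cloud {
            lo = Point3D(x: min(lo.x, p.x), y: min(lo.y, p.y), z: min(lo.z, p.z))
            hi = Point3D(x: max(hi.x, p.x), y: max(hi.y, p.y), z: max(hi.z, p.z))
        }
        return (hi.x - lo.x, hi.y - lo.y, hi.z - lo.z)
    }

    // Two opposite corners of a 30 x 20 x 10 cm box, in metres:
    let scan = [Point3D(x: 0, y: 0, z: 0), Point3D(x: 0.3, y: 0.2, z: 0.1)]
    if let d = dimensions(of: scan) { print("\(d.l) x \(d.w) x \(d.h) m") }
    ```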
    On the front of a device, it would allow them to simulate pressure sensitivity. You could use a passive stick as a stylus without requiring a special display, but any object would work, so losing it is no big deal, and a finger would work the same anyway. It can distinguish between a touch and a press, and it can turn an iPad into a Wacom device.
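    Telling a touch from a press might reduce to thresholding the fingertip's measured distance to the surface; a toy sketch with invented thresholds:
    ```swift
    import Foundation

    enum Contact { case hover, touch, press }

    // Invented thresholds: a slightly negative distance means the sensor sees
    // the fingertip pushed past the surface plane (skin compression), which
    // we read as pressure. Real depth data would be noisier and need filtering.
    func classify(distanceToSurfaceMM d: Double) -> Contact {
        switch d {
        case ..<(-1.5): return .press
        case ...0.5:    return .touch
        default:        return .hover
        }
    }

    print(classify(distanceToSurfaceMM: 4.0))  // hover
    print(classify(distanceToSurfaceMM: 0.2))  // touch
    print(classify(distanceToSurfaceMM: -2.0)) // press
    ```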
    A front-facing sensor can also do face detection, as it can measure the size and shape of the face. If they ever make glasses-free 3D displays, it could be used for augmented FaceTime or something as simple as Face unlock.
    On the Macs, it can be used to get rid of the keyboard and mouse, or at least make them passive objects. Some of the sensors have lower latency than peripherals, and it would mean Apple could ship keyboards and mice with no batteries and no wireless electronics inside. The sensor would detect the object and your hand's interaction with it. No replacing batteries, lower weight, lower-cost peripherals, faster response, no wake-up time.
    The entire surface of the laptops could be flattened, with no moving parts. They could laser-perforate the surface like the sleep light (as long as it didn't affect strength) and have a flat backlight shining through an e-ink sheet with a customisable keyboard pattern. This allows contextual interaction with software and multi-language keyboards in a single version, or custom layouts, e.g. numpads, Dvorak, Boot Camp, etc. They could make the sensation tangible with vibrational feedback like the Steam controller. If you don't need to type, you get the full surface as a mouse pad. At the very least, they could make the keys passive, detect which keys you are about to press, and do a type-ahead so you can tell if you are about to make a typo.
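    The type-ahead bit is just geometry: watch which key centre a descending fingertip is nearest. A sketch with everything invented:
    ```swift
    import Foundation

    // Invented sketch: predict the key about to be struck by finding the key
    // centre nearest the fingertip once it descends below a height threshold.
    struct Fingertip { let x, y, height: Double } // mm above the deck

    func predictedKey(_ tip: Fingertip, keys: [String: (x: Double, y: Double)]) -> String? {
        guard tip.height < 8 else { return nil } // only predict when the finger is close
        return keys.min { a, b in
            hypot(a.value.x - tip.x, a.value.y - tip.y) <
            hypot(b.value.x - tip.x, b.value.y - tip.y)
        }?.key
    }

    let homeRow = ["A": (x: 10.0, y: 10.0), "S": (x: 29.0, y: 10.0)]
    print(predictedKey(Fingertip(x: 27, y: 11, height: 5), keys: homeRow) ?? "none") // S
    ```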
    It can scan 3D objects in a very crude way, but this can be useful. If you wanted to do a mockup of a new iPhone, you could just hold the current one over the laptop surface and it would scan a 3D model into the computer, giving you a base to work from. If you sell something on eBay and a buyer isn't sure of its dimensions, you can send a scan and they can use augmented reality to check whether it will fit their use, whether it's shoes, jewellery or a cabinet.
    It could be used by the deaf to sign to someone who doesn't understand sign language, with the other end translating the gestures along with lip reading.
  • Reply 44 of 78
    ascii Posts: 5,936, member
    Quote:
    Originally Posted by AppleSauce007 View Post

    How about a Holodeck like the ones in Star Trek, or iBeacon enhancements, or new iRobots with computer vision like Data from Star Trek...

    This technology has a lot of applications.

    Time will tell.


    There are degrees of speculation. One can reason from their current products and past history without being in the realm of Star Trek.

  • Reply 45 of 78
    ascii Posts: 5,936, member
    Quote:
    Originally Posted by Marvin View Post

    There are loads of potential uses for 3D sensors as they give computers depth perception.



    They can use them for the camera so that when you take a photo, it takes a snapshot of the depth information in the scene, allowing you to adjust focal blur after taking the picture.

    This also allows better augmented reality, and it could even be used as an assistive device for the blind: it can detect shapes and fast-moving vehicles and talk in the Siri voice to describe what's ahead via shape recognition.

    Lots of clever and practical ideas there. I especially like the one about helping the blind and that the keyboard could just become a prop.

  • Reply 46 of 78

    The big question for the industry is: will Apple continue to support current customers of PrimeSense? Will they expand in these areas? Will these companies, many in the health care field, be abandoned?

    Will PrimeSense continue to exist as a wholly owned subsidiary?

  • Reply 47 of 78
    truffol wrote: »
    It would be great if Apple incorporated this into the Apple TV for a built-in gaming console. They need to move their existing iOS gaming platform to the living room!

    They already have, with AirPlay.
  • Reply 48 of 78
    All this talk about TV, but I'm thinking their Capri 3D sensor might be more in line with what Apple would be interested in.

    http://www.primesense.com/market/mobile/

    Here is an article on Engadget about the demos they had at Google I/O, where they stuck a sensor on the back of a Nexus 10 tablet.

    http://www.engadget.com/2013/05/15/primesense-demonstrates-capri-3d-sensor/



    It is also a bonus negative impact on their competitors, with Apple grabbing all of the technology and patents and taking this company's products off the market.

    I imagine this same sensor arrangement could be translated into a Google Glass-like device and it is smart to remove that card from Google's deck.
  • Reply 49 of 78
    andysol Posts: 2,506, member
    solipsismx wrote: »
    That was my second thought after the obvious Apple HDTV HW. The current solutions on mobile devices by other vendors aren't very good.

    Side note- good to have you back Soli, you were missed.
  • Reply 50 of 78
    solipsismx Posts: 19,566, member
    gqb wrote: »
    I tend to doubt that this is as much for iOS as it is for Apple TV and for OS X.
    ATV for obvious reasons; OS X because I don't think Apple (or many beyond MS, for that matter) think touch makes sense on a desktop, but gestures do.
    As for iOS, what do gestures really give you when you already have your hands on the device? Just look at Samsung's eyeball control to see this 'solution in search of a problem' approach.

    I agree the Apple (HD)TV is the obvious one, but one thing that does seem to lessen my user experience with iOS is the steps some actions require. This is of course the constant struggle to maximize usable space while having on-screen controls.

    For instance, Safari in iOS 7 is now full screen, but if I want to switch my Safari window I need to scroll toward the top of the page about 20%(?) of the window, then choose the switch-window button, and then choose the new window (or create a new one). A simple air gesture that would do this in one move would be much faster, more graceful, and more natural.
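    Something as crude as thresholding lateral hand speed in front of the sensor could drive that; a toy sketch in Swift (units and threshold invented):
    ```swift
    import Foundation

    // Invented sketch: treat a fast sideways hand movement as a
    // "switch window" gesture. x in metres, t in seconds.
    struct HandSample { let x: Double; let t: TimeInterval }

    func isSwipe(_ samples: [HandSample], minSpeed: Double = 0.8) -> Bool {
        guard let first = samples.first, let last = samples.last,
              last.t > first.t else { return false }
        return abs(last.x - first.x) / (last.t - first.t) >= minSpeed
    }

    let sweep = [HandSample(x: -0.15, t: 0.00), HandSample(x: 0.15, t: 0.25)]
    print(isSwipe(sweep)) // true: about 1.2 m/s sideways
    ```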

    I also wonder if this sensor, up close, could be used for sign language. I don't use the accessibility options much — except for inverting colors in a movie theater before the film starts — but I have to wonder if there are people that can't touch some of the smaller aspects of a touch display with ease that would be better suited with air gestures.

    I know there are already gestures on at least the Galaxy S4, but it seems very clunky. I would expect the difference between what Apple and Samsung offer to be as wide as the gap between Apple's Touch ID and HTC's fingerprint scanner on the One Max.
  • Reply 51 of 78
    The big question for the industry is: will Apple continue to support current customers of PrimeSense? Will they expand in these areas? Will these companies, many in the health care field, be abandoned?

    Will PrimeSense continue to exist as a wholly owned subsidiary? 

    Perhaps the real question should be: will Apple move into the health care field in a more direct way? I see a near-future Apple where AI assistants help people analyze, diagnose, and solve complex problems as one might hire a high-priced consultant, but these AIs will be built into Apple's products.
  • Reply 52 of 78
    gatorguy Posts: 24,213, member
    I imagine this same sensor arrangement could be translated into a Google Glass-like device and it is smart to remove that card from Google's deck.
    Google has been incorporating gesture controls in Google Glass for quite some time, with more on the way. I don't think Apple acquiring PrimeSense will have any effect on that.
  • Reply 53 of 78
    gatorguy wrote: »
    Google has been incorporating gesture controls in Google Glass for quite some time, with more on the way. I don't think Apple acquiring PrimeSense will have any effect on that.

    As far as I know, Glass does not incorporate 3-D sensors, only a single camera.
  • Reply 54 of 78
    solipsismx wrote: »
    I agree the Apple (HD)TV is the obvious one, but one thing that does seem to lessen my user experience with iOS is the steps some actions require. This is of course the constant struggle to maximize usable space while having on-screen controls.

    For instance, Safari in iOS 7 is now full screen, but if I want to switch my Safari window I need to scroll toward the top of the page about 20%(?) of the window, then choose the switch-window button, and then choose the new window (or create a new one). A simple air gesture that would do this in one move would be much faster, more graceful, and more natural.

    I also wonder if this sensor, up close, could be used for sign language. I don't use the accessibility options much — except for inverting colors in a movie theater before the film starts — but I have to wonder if there are people that can't touch some of the smaller aspects of a touch display with ease that would be better suited with air gestures.

    I know there are already gestures on at least the Galaxy S4, but it seems very clunky. I would expect the difference between what Apple and Samsung offer to be as wide as the gap between Apple's Touch ID and HTC's fingerprint scanner on the One Max.

    It would be a fascinating use of such a sensor array to get real-time sign language translation.
  • Reply 55 of 78
    By using the Samsung jury award, this acquisition is essentially free!
  • Reply 56 of 78

     


    Quote:
    Originally Posted by ascii View Post

    I wonder what they will use that technology for? The Apple TV of course, but maybe the iMac too.

    How about leveraging this acquisition with the power found in the new Mac Pro and an "under the radar" 3-D printer Apple has in its skunk works, along with their Liquid Metal technology?

    Could be a game changer in the 3-D printer market ... a very "HOT" tech sector with a multi-billion $$ opportunity.

    Remember it was Apple that ushered in the laser printer era years ago.

    Anyone know if Liquid Metal could be used as a 3-D material?

  • Reply 57 of 78
    gatorguy Posts: 24,213, member

    Anyone know if Liquid Metal could be used as a 3-D material?

    Based on an Apple patent app, they apparently think it can. I don't know that it would be used outside of industrial or in-house mockups tho. Kinda pricey.
  • Reply 58 of 78
    Quote:
    Originally Posted by Gatorguy View Post

    Based on an Apple patent app, they apparently think it can. I don't know that it would be used outside of industrial or in-house mockups tho. Kinda pricey.

    Thanks for the quick reply. I agree about industrial being the big market, but it's also one of the fastest-growing segments, with auto and aerospace paying big bucks for prototyping and actual production parts. Apple could potentially take a premium position in 3-D metal printers that couldn't be matched.

    I need to do more research into the specifics of Liquid Metal, but just the name "Liquid Metal" sounds as tho it's perfect for "ink-jet"-like printing. 8-)

  • Reply 59 of 78
    solipsismx wrote: »
    I agree the Apple (HD)TV is the obvious one, but one thing that does seem to lessen my user experience with iOS is the steps some actions require. This is of course the constant struggle to maximize usable space while having on-screen controls.

    For instance, Safari in iOS 7 is now full screen, but if I want to switch my Safari window I need to scroll toward the top of the page about 20%(?) of the window, then choose the switch-window button, and then choose the new window (or create a new one). A simple air gesture that would do this in one move would be much faster, more graceful, and more natural.

    I also wonder if this sensor, up close, could be used for sign language. I don't use the accessibility options much — except for inverting colors in a movie theater before the film starts — but I have to wonder if there are people that can't touch some of the smaller aspects of a touch display with ease that would be better suited with air gestures.

    I know there are already gestures on at least the Galaxy S4, but it seems very clunky. I would expect the difference between what Apple and Samsung offer to be as wide as the gap between Apple's Touch ID and HTC's fingerprint scanner on the One Max.

    It would be a fascinating use of such a sensor array to get real-time sign language translation.

    I think Apple is more interested in what you will be able to do when you're behind the camera rather than in front of it.
  • Reply 60 of 78
    I think Apple is more interested in what you will be able to do when you're behind the camera rather than in front of it.

    I'd not make any large bets one way or another yet... other than to consider buying more AAPL.