Hypereality

About

Username
Hypereality
Joined
Visits
32
Last Active
Roles
unconfirmed, member
Points
191
Badges
1
Posts
44
  • Apple's $4,999 all-in-one iMac Pro launches Thursday, Dec. 14

    Now the AR and VR developers at Apple have something fast enough to develop on. 

    For me it's a tad expensive; I can get 22k in Geekbench on my iMac 5K, and 32GB of RAM is enough for me.

    I would like this machine, as it's nearly twice as fast with the 10-core, but I'm guessing it's going to be nearer £6,000 in the UK, so it's hard to justify.

    For Swift developers that speed is worth it, as the compiler isn't fast enough yet.
  • Himax reportedly joining 3D sensor supply chain for 'iPhone 8'

    Would the phone need 2 front-facing cameras to capture 3D?
    There are quite a few different ways to capture 3D with a monocular sensor. On Google Tango, a laser projects a 'structured light' pattern over the scene (think graph paper that the eye cannot see). An infrared sensor then picks up the laser light, and a depth map (point cloud) is constructed by working backwards from the distorted grid the camera sees to the transformation needed to make the pattern regular again. As the phone moves around, the depth map is corrected and built up over time, becoming progressively more accurate.

    So while there is only one camera, structured light is emitted from a separate location on the device. This article mentions both a sensor and a VCSEL laser: the laser illuminates the scene and the sensor records it. That's enough to read a 3D point cloud from the scene.
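
    To make the triangulation concrete, here is a minimal Swift sketch of how depth falls out of the displaced pattern, assuming a calibrated emitter/IR-camera pair. Every name and number in it (StructuredLightSensor, focalLengthPx, baselineMeters, the example values) is a hypothetical illustration, not Apple's or Tango's actual API.

    // Minimal sketch of the structured-light triangulation step, assuming a
    // calibrated laser-emitter / IR-camera pair. All names and numbers are
    // hypothetical illustrations, not a real Apple or Tango API.
    struct StructuredLightSensor {
        let focalLengthPx: Double   // IR camera focal length, in pixels
        let baselineMeters: Double  // emitter-to-sensor separation on the device

        // Depth from the shift (disparity) between where a dot of the projected
        // pattern would land on a flat reference and where the IR camera sees it.
        func depth(forDisparityPx disparity: Double) -> Double? {
            guard disparity > 0 else { return nil }   // an undisplaced dot carries no depth
            return focalLengthPx * baselineMeters / disparity
        }

        // Turn matched pattern dots into a sparse point cloud in camera space.
        func pointCloud(from matches: [(expectedX: Double, observedX: Double, u: Double, v: Double)],
                        principalPoint: (cx: Double, cy: Double)) -> [(x: Double, y: Double, z: Double)] {
            matches.compactMap { m -> (x: Double, y: Double, z: Double)? in
                guard let z = depth(forDisparityPx: abs(m.expectedX - m.observedX)) else { return nil }
                // Back-project the pixel through a pinhole model to recover X and Y.
                let x = (m.u - principalPoint.cx) * z / focalLengthPx
                let y = (m.v - principalPoint.cy) * z / focalLengthPx
                return (x, y, z)
            }
        }
    }

    // Made-up calibration: 580 px focal length, 7.5 cm emitter/sensor baseline.
    let sensor = StructuredLightSensor(focalLengthPx: 580, baselineMeters: 0.075)
    if let z = sensor.depth(forDisparityPx: 12) {
        print("Estimated depth: \(z) m")   // 580 * 0.075 / 12 ≈ 3.6 m
    }

    The baseline term plays the same role as in stereo: the further the emitter sits from the sensor, the larger the disparity for a given depth, and the finer the depth resolution.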



