ARKit augmented reality tools in iOS 11 will make iPhone, iPad world's largest AR platform
Making good on previous promises that Apple would be taking an interest in augmented reality, Apple Senior Vice President of Software Engineering Craig Federighi announced ARKit, a developer toolset that will nearly instantaneously make the iPhone and iPad the largest AR platform in the world.
In a demonstration of software built with the new ARKit, the software identified a table surface and placed a virtual coffee cup, properly scaled, on it. After a lamp was added to the surface, the lighting model adjusted dynamically as Federighi moved the lamp around the cup.
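For developers, the demo described above maps onto a fairly small amount of ARKit code. The following is only a rough sketch against the ARKit API in the iOS 11 SDK; it detects a horizontal plane and drops a placeholder cylinder on it, and is not a reconstruction of Apple's actual demo app.

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch: detect a horizontal surface and place a stand-in
// "coffee cup" (a cylinder) on it. Illustrative only.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera imagery with motion-sensor data,
        // the hardware/software integration Federighi described.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it recognizes a new surface, e.g. a tabletop.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Dimensions are in real-world meters, so the object comes out properly scaled.
        let cup = SCNNode(geometry: SCNCylinder(radius: 0.04, height: 0.1))
        cup.position = SCNVector3(0, 0.05, 0)
        node.addChildNode(cup)
    }
}
```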
Federighi noted that Apple's tight integration of iOS and iPhone hardware is what makes the technology possible. Companies said to have been given demonstrations of the technology include Lego, Ikea, and others.
Wingnut AR, the augmented reality venture of "Lord of the Rings" director Peter Jackson, demonstrated live gameplay rendered in Unreal Engine 4, with the audience clearly visible in the shot. A Wingnut AR demo is expected in late 2017.
Apple will be providing ARKit code to developers at WWDC. The software will accompany Apple's iOS 11 beta releases.
Comments
Basically the same goes for the VR demo with John Knoll. Was that demo supposed to show us the potential for a filmmaker to be "in" the scene and build it from the inside? If so, I can see how that might be a big jump in how movies (or even games) are produced. Maybe the director could be "in" the scene and try things from different angles and perspectives until the shot feels right. Or, for game development, see how everything fits together. Awesome!

As a consumer, though (I know, it's a developers conference), how would this tech be good for me? Keep in mind, the girl with the VR headset on walked into the green screen at one point early on. Granted, the green screen was there for our benefit, so they could key her into the live VR stream. But that just helps to demonstrate that "walking around" a VR scene has some very real, physical constraints.
They had real-time performance capture, and the computers showed a lower-quality render of the CGI scenes so the director could see how the CGI would look while directing the movie.
The Agents of S.H.I.E.L.D. TV show uses an iPad with a depth sensor for some of its visual effects. The 3D scan lets them do surface reconstruction automatically instead of doing that step manually, which gives faster turnaround times for TV schedules:
https://www.geekwire.com/2016/occipitals-structure-sensor-helps-transform-visual-effects-of-marvels-agents-of-s-h-i-e-l-d/
Around 1:06 in the following video, you can see a similar 3D reconstruction.
Apple's AR demos show placing virtual objects on a table and a Star Wars hologram chess game.
There are a number of applications of this tech for average users. FaceTime could, for example, show someone sitting in front of you rather than in a 2D view. Apple showed Pokémon Go with more realistic integration with the real world.
It can work for Snapchat filters or even just selfies, e.g. "Hey Siri, make my body look fitter in this selfie" or "adjust my makeup", with real-time tracking and adjustment. It can overlay virtual clothing onto your body to let you see how an outfit would look on you before you buy it online; Amazon could provide the 3D clothing data. It can let you see yourself with different hairstyles. It can do object removal, like removing other tourists from a photo. It can show what a product would look like in your home: a new sofa, curtains, carpet, wall paint.
These applications work better with a full depth sensor, but Apple's demos show pretty good tracking: there's a motion sensor in the device, and the cameras can figure out depth as you move it around. There is also now a depth API for the dual-camera iPhone that can help here.
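For anyone curious, that depth API is AVFoundation's AVCaptureDepthDataOutput in the iOS 11 SDK. Here's a minimal sketch; format selection, permission prompts, and error handling are omitted, and only dual-camera devices will return a capture device here:

```swift
import AVFoundation

// Streams per-frame depth maps from the dual camera (iOS 11+).
final class DepthStreamer: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // Stereo depth estimation needs the dual camera;
        // single-camera devices return nil here.
        guard let device = AVCaptureDevice.default(.builtInDualCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.startRunning()
    }

    // Each callback delivers a depth/disparity map aligned to the video frame.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let map = depthData.depthDataMap
        print("depth map:", CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
    }
}
```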
Being able to be inside a VR environment is for VR developers. If you are building a VR game or experience, it's better to be able to see what people using it will see while you're building it.
VR doesn't have very broad appeal. Sony recently announced 1 million PSVR sales versus roughly 60 million consoles sold, an attach rate of under 2 percent:
https://www.theverge.com/2017/6/5/15719382/playstation-vr-sony-sales-one-million
That data also shows sales slowing down: they sold 915k in the first four months on the market and fewer than 100k in the four months since. Facebook shut down its VR animation studio recently, and the Xbox team is supporting VR but sees it as very early days:
http://www.gamasutra.com/view/news/295830/QA_With_Scorpio_rising_Phil_Spencer_looks_to_the_future_of_Xbox.php
http://stevivor.com/news/xboxs-phil-spencer-vr-will-come-project-scorpio-doesnt-feel-like-demos-experiments/
"I think we're on like a decade-long journey with VR, and we're still right at the beginning."
It has to lose the tether to have mainstream appeal, which means either wirelessly sending very high-bandwidth, high-FPS video or getting the computing power into the headset itself. They could maybe have a holster for an iPad with a cable attached, but strapping a smartphone to your face isn't a good approach for long-term use.
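To put "very high-bandwidth" in numbers, here's a back-of-the-envelope calculation. The specs below are assumptions roughly matching today's tethered PC headsets, not anything Apple or Sony has announced:

```swift
// Rough uncompressed-video bandwidth for a tethered-class headset.
// All specs are assumptions (roughly Vive/Rift-class), not announced figures.
let pixelsPerEye = 1_080.0 * 1_200.0  // per-eye resolution
let eyes = 2.0
let bitsPerPixel = 24.0               // 8-bit RGB, no compression
let refreshHz = 90.0

let gigabitsPerSecond = pixelsPerEye * eyes * bitsPerPixel * refreshHz / 1e9
print(gigabitsPerSecond)  // ~5.6 Gbit/s uncompressed
```

Even with heavy compression, that's a demanding link at the low latency VR needs, which is why losing the tether means either a serious wireless video solution or compute in the headset.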
AR/VR are at an early development stage, and it's worth showing them to developers to see if they can make something out of it. It might get better traction in the education and experience industries. A classroom can take a virtual tour of a foreign country and explore pyramids, jungles, etc. - things that keep people engaged but don't need long-term use.
It can work for certain professions too, like mechanics seeing overlays on vehicles, some medical applications, or cooking, where it shows a table of ingredients and you put each ingredient where it's supposed to go so you don't make a mistake.
There are a number of possibilities; most might never become practical or financially worthwhile, but Apple has it covered regardless. It's better that they support it and do the tracking right than not support it at all.
The point was not whether the gameplay was interactive, but that the AR kept the game scene rock solid "on the table" regardless of the motions of the second person. That was pretty impressive.
The next step is that developers will exploit this capability in their games and apps. That's where the benefit to you will come from.