ARKit augmented reality tools in iOS 11 will make iPhone, iPad world's largest AR platform...

Posted in iPhone, edited December 2019
Making good on previous promises that Apple would take an interest in augmented reality, Apple senior vice president Craig Federighi announced ARKit, a developer toolset that, once available, will nearly instantaneously make the iPhone and iPad the largest AR platform in the world.

In a demonstration of software produced by the new ARKit, the software identified a table surface, and applied a virtual coffee cup properly scaled to the surface. Following a lamp's addition to the surface, the lighting model adjusted dynamically as Federighi moved the lamp around the cup.
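
In code terms, the flow shown on stage corresponds to only a few calls. Below is a minimal, hypothetical sketch against the ARKit API in the iOS 11 SDK; the cylinder standing in for the coffee cup, and the bare-bones view setup, are assumptions for illustration:

    import UIKit
    import ARKit
    import SceneKit

    class DemoViewController: UIViewController, ARSCNViewDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            sceneView.delegate = self
            view.addSubview(sceneView)
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            // Track the world and look for horizontal surfaces such as a table.
            let config = ARWorldTrackingConfiguration()
            config.planeDetection = .horizontal
            config.isLightEstimationEnabled = true  // feeds the dynamic lighting
            sceneView.session.run(config)
        }

        // ARKit calls this once it has identified a surface; attach content there.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            guard let plane = anchor as? ARPlaneAnchor else { return }
            let cup = SCNNode(geometry: SCNCylinder(radius: 0.04, height: 0.1))  // stand-in "cup"
            cup.position = SCNVector3(plane.center.x, 0.05, plane.center.z)
            node.addChildNode(cup)
        }
    }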

Federighi noted that Apple's tight integration of iOS with the iPhone hardware is what makes the technology possible. Companies said to have been given demonstrations of the technology include Lego and Ikea, among others.

Wingnut AR, "Lord of the Rings" director Peter Jackson's augmented reality company, demonstrated live gameplay built on Unreal Engine 4, with the audience clearly in the shot. A demo from Wingnut AR is expected in late 2017.

Apple will be providing ARKit code to developers at WWDC. The software will accompany Apple's iOS 11 beta releases.

Comments

  • Reply 1 of 9
    slprescott Posts: 765 member
    Very exciting. Tim's comments about AR & VR over the past 18 months are finally manifesting themselves in technology (for developers), and the impact on consumers will be huge as future hardware becomes available to exploit the software.
  • Reply 2 of 9
    tallest skil Posts: 43,388 member
    I hope that the demo in the image above wasn’t 100% pre-rendered, because if you could get a game (or whatever) with assets that dynamically respond to the “terrain” that YOU have set up (that guy falling off the table, for example), that would be absolutely amazing.
  • Reply 3 of 9
    foggyhill Posts: 4,767 member
    The iPhone coming out at the end of the year will be a powerhouse; I would not be surprised to see the first fully custom Apple GPU arrive then.
  • Reply 4 of 9
    hexclock Posts: 1,252 member
    What a sick demo. I can't wait to see more of that. 
  • Reply 5 of 9
    Very exciting. Tim's comments about AR & VR over the past 18 months are finally manifesting themselves in technology (for developers), and the impact on consumers will be huge as future hardware becomes available to exploit the software.
    Don't need new hardware... existing hardware is powerful enough to git 'er done.

  • Reply 6 of 9
    While the AR demo was cool, I don't really understand what I was supposed to take away from it or what advantage AR provided in that scenario. I also wasn't sure if they were showing us a movie or if it was supposed to be a game. I don't think it was a game, as it didn't appear that anyone was playing. All the 2nd, non-speaking guy did was move the iPad around so we could get a different perspective of the scene; I don't think he was actually controlling what was going on in the scene at all. If it was a movie, I'm still stumped as to why I would want it. Again, the tech is cool, but if the idea is to watch a movie play out on my dining room table with my kitchen in the background, I can't really say I'd be interested.

    Basically the same goes for the VR demo with John Knoll. Was that demo supposed to show us the potential for a filmmaker to be "in" the scene and build it from the inside? If so, I can see how that might be a big jump in how movies (or even games) are produced. Maybe the director could be "in" the scene and try things from different angles and perspectives until they feel the shot is right. Or, for game development, seeing how everything fits together. Awesome! As a consumer I don't follow (I know, it's a developers conference): how would this tech be good for me? Keep in mind, the girl with the VR headset on walked into the green screen at one point early on. Granted, the green screen was there for our benefit so they could key her into the live VR stream. But that just helps to demonstrate how "walking around" a VR scene has some very real, physical constraints.
  • Reply 7 of 9
    Marvin Posts: 15,322 moderator
    While the AR demo was cool, I don't really understand what I was supposed to take away from it or what advantage AR provided in that scenario. I also wasn't sure if they were showing us a movie or if it was supposed to be a game. I don't think it was a game, as it didn't appear that anyone was playing. All the 2nd, non-speaking guy did was move the iPad around so we could get a different perspective of the scene; I don't think he was actually controlling what was going on in the scene at all. If it was a movie, I'm still stumped as to why I would want it. Again, the tech is cool, but if the idea is to watch a movie play out on my dining room table with my kitchen in the background, I can't really say I'd be interested.

    Basically the same goes for the VR demo with John Knoll. Was that demo supposed to show us the potential for a filmmaker to be "in" the scene and build it from the inside? If so, I can see how that might be a big jump in how movies (or even games) are produced. Maybe the director could be "in" the scene and try things from different angles and perspectives until they feel the shot is right. Or, for game development, seeing how everything fits together. Awesome! As a consumer I don't follow (I know, it's a developers conference): how would this tech be good for me? Keep in mind, the girl with the VR headset on walked into the green screen at one point early on. Granted, the green screen was there for our benefit so they could key her into the live VR stream. But that just helps to demonstrate how "walking around" a VR scene has some very real, physical constraints.
    The AR/VR tech would be better applied to smart glasses than having to hold up an iPad all the time, but they have to use what's available. This has been used in movies for years already, e.g. Avatar in 2009 (skip to 5:00):

    [embedded video]

    They had real-time performance capture, and the computers showed a lower-quality render of the CGI scenes so the director could see how the CGI would look while directing the movie.

    The Agents of S.H.I.E.L.D. TV show uses an iPad with a depth sensor for some of its visual effects; the 3D scan enables automatic surface reconstruction, sparing a manual step and giving the faster turnaround times TV schedules need:

    https://www.geekwire.com/2016/occipitals-structure-sensor-helps-transform-visual-effects-of-marvels-agents-of-s-h-i-e-l-d/

    The series of illustrations demonstrates the scanning process: what was scanned using the Structure Sensor on the iPad and given to the visual effects house FuseFX, the effects being applied by the VFX house, and the image as seen during the broadcast.

    Around 1:06 in the following video, you can see a similar 3D reconstruction:

    [embedded video]

    Apple's AR demos show placing virtual objects on the table and a Star Wars hologram chess game:

    [images]

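    One plausible way that kind of tabletop placement works is an ARKit hit test from a screen tap against a detected plane. A minimal sketch, assuming an ARSCNView already configured for plane detection (the box geometry and function name are stand-ins):

        import ARKit
        import SceneKit

        // Hit-test a screen tap against a detected plane and drop a virtual
        // object at the corresponding real-world point.
        func placeObject(at tapPoint: CGPoint, in sceneView: ARSCNView) {
            guard let hit = sceneView.hitTest(tapPoint, types: .existingPlaneUsingExtent).first else { return }
            let node = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                                length: 0.1, chamferRadius: 0))
            node.simdTransform = hit.worldTransform  // pose on the table surface
            sceneView.scene.rootNode.addChildNode(node)
        }
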
    There are a number of applications of this tech for average users. FaceTime could, for example, show someone sitting in front of you rather than in a flat 2D view. Apple showed Pokemon Go with more realistic integration with the real world.

    It can work for Snapchat filters or even just selfies, e.g. 'hey Siri, make my body look fitter in this selfie' or 'adjust my makeup', with real-time tracking and adjustment. It can overlay virtual clothing onto your body to let you see how an outfit would look before you buy it online; Amazon can provide 3D clothing data. It can let you see yourself with different hairstyles. It can do object removal, like removing other tourists from a photo. It can show what a product would look like in your home: a new sofa, curtains, carpet, wall paint.

    These applications are better with a full depth sensor, but Apple's demos show pretty good tracking: the device has motion sensors, and the cameras can figure out depth as it moves around. There is now a depth API for the dual-camera iPhone that can help here.
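
    A rough sketch of reading that depth API with iOS 11's AVFoundation on a dual-camera iPhone (a hypothetical minimal setup; a real app would also configure capture formats and camera permissions):

        import AVFoundation

        final class DepthReader: NSObject, AVCaptureDepthDataOutputDelegate {
            let session = AVCaptureSession()
            let depthOutput = AVCaptureDepthDataOutput()

            func start() throws {
                guard let camera = AVCaptureDevice.default(.builtInDualCamera,
                                                           for: .video,
                                                           position: .back) else { return }
                session.beginConfiguration()
                session.sessionPreset = .photo  // a preset that supports depth delivery
                let input = try AVCaptureDeviceInput(device: camera)
                if session.canAddInput(input) { session.addInput(input) }
                if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
                depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
                session.commitConfiguration()
                session.startRunning()
            }

            // Each AVDepthData frame carries a per-pixel depth/disparity map
            // computed from the two camera views.
            func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                                 didOutput depthData: AVDepthData,
                                 timestamp: CMTime,
                                 connection: AVCaptureConnection) {
                let map = depthData.depthDataMap  // CVPixelBuffer of depth values
                _ = map
            }
        }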

    Being able to be inside a VR environment is for VR developers. If you are building a VR game or experience, it's better to be able to see what people using it will see while building it.

    VR doesn't have a very broad appeal. Sony announced 1 million PSVR sales recently vs ~60 million consoles:

    https://www.theverge.com/2017/6/5/15719382/playstation-vr-sony-sales-one-million

    That data also shows sales slowing down: they sold 915k in the first 4 months on the market and <100k in the 4 months since. Facebook shut down its VR animation studio recently, and the Xbox team is supporting VR but sees it as very early days:

    http://www.gamasutra.com/view/news/295830/QA_With_Scorpio_rising_Phil_Spencer_looks_to_the_future_of_Xbox.php
    http://stevivor.com/news/xboxs-phil-spencer-vr-will-come-project-scorpio-doesnt-feel-like-demos-experiments/

    "I think we're on like a decade-long journey with VR, and we're still right at the beginning."

    It has to lose the tether to have mainstream appeal, which means either wirelessly sending very high-bandwidth, high-FPS video or getting the computing power into the headset. They could maybe have a holster for an iPad with a cable attached; strapping a smartphone to your face isn't a good way to go for long-term use.

    AR/VR are at an early development stage, and it's worth showing them to developers to see if they can make something out of them. They might have better traction in the education and experience industries. A classroom can go on a virtual tour of a foreign country to explore pyramids, jungles, etc. - things that keep people engaged but don't need long-term use.

    It can work for certain professions too: mechanics seeing overlays on vehicles, some medical applications, or cooking, where it overlays the ingredients' positions on a table so you put each one where it's supposed to go and don't make a mistake.

    There are a number of possibilities; most might never become practical or financially worthwhile, but Apple has them covered regardless. It's better that they support it and get the tracking right than not support it at all.
  • Reply 8 of 9
    SpamSandwich Posts: 33,407 member
    While the AR demo was cool, I don't really understand what I was supposed to take away from it or what advantage AR provided in that scenario. I also wasn't sure if they were showing us a movie or if it was supposed to be a game. I don't think it was a game, as it didn't appear that anyone was playing. All the 2nd, non-speaking guy did was move the iPad around so we could get a different perspective of the scene; I don't think he was actually controlling what was going on in the scene at all. If it was a movie, I'm still stumped as to why I would want it. Again, the tech is cool, but if the idea is to watch a movie play out on my dining room table with my kitchen in the background, I can't really say I'd be interested.

    Basically the same goes for the VR demo with John Knoll. Was that demo supposed to show us the potential for a filmmaker to be "in" the scene and build it from the inside? If so, I can see how that might be a big jump in how movies (or even games) are produced. Maybe the director could be "in" the scene and try things from different angles and perspectives until they feel the shot is right. Or, for game development, seeing how everything fits together. Awesome! As a consumer I don't follow (I know, it's a developers conference): how would this tech be good for me? Keep in mind, the girl with the VR headset on walked into the green screen at one point early on. Granted, the green screen was there for our benefit so they could key her into the live VR stream. But that just helps to demonstrate how "walking around" a VR scene has some very real, physical constraints.
    They were just demos of the potential for developers and both were very impressive in displaying the power of ARKit... and the Dark Side of the Force.
    edited June 2017
  • Reply 9 of 9
    All the 2nd, non-speaking guy did was move the iPad around so we could get a different perspective of the scene; I don't think he was actually controlling what was going on in the scene at all. If it was a movie, I'm still stumped as to why I would want it.

    The point was not whether the gameplay was interactive, but that the AR kept the game scene rock solid 'on the table' regardless of the motions of this second person... that was pretty impressive.
    The next step is for developers to exploit this capability in their games and apps... that's where the benefit to you will come from.
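
    That rock-solid registration comes from ARKit's world tracking: anchors stay pinned to real-world positions as the device moves. A minimal, hypothetical sketch of pinning content with the iOS 11 ARKit API (function name and the one-metre offset are stand-ins):

        import ARKit
        import simd

        // Pin an anchor one metre in front of the current camera pose. World
        // tracking keeps the anchor fixed in the room as the device moves, so
        // content attached to it appears "locked" to the table.
        func pinScene(in session: ARSession) {
            guard let frame = session.currentFrame else { return }
            var offset = matrix_identity_float4x4
            offset.columns.3.z = -1.0  // 1 m in front of the camera
            let anchor = ARAnchor(transform: frame.camera.transform * offset)
            session.add(anchor: anchor)
        }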
    edited June 2017