Apple's ARKit 4 anchors 3D reality into real-world Maps locations

With ARKit 4, Apple is building the foundation for a virtual world of animated, interactive 3D "reality" explorable by anyone with a newer iPhone in their pocket.

In ARKit 4, virtual objects can be anchored to real-world locations


At WWDC20, Apple outlined futuristic new features coming to ARKit 4. Previous releases of the company's augmented reality platform first enabled basic 3D graphics to appear fixed in place in the video stream from an iOS 11 camera using Visual Inertial Odometry. This let a user explore a virtual object from all sides or play an interactive game fixed via AR onto a table surface.

ARKit 2 in 2018's iOS 12 introduced shared AR worlds, where two users could see different views of the same virtual scene, enabling multiuser gameplay in augmented reality apps. Last year's ARKit 3 introduced motion capture and people occlusion, which allowed virtual objects to move in front of and behind people while understanding how a person was positioned in front of the camera.

Apple's investments in a collection of AR and related technologies use computer vision, machine learning, and motion detection to build increasingly sophisticated, interactive worlds of computer graphics fixed in place within a camera view. The effort has proven far more successful and impactful than competing attempts to sell smartphone VR over the last several years.

Location Anchors

ARKit 4 in the freshly announced iOS 14 adds Location Anchors, which can fix an AR model at a specific location in the real world, defined by latitude, longitude, and elevation. This could be used to present virtual artwork, as Apple demonstrated with a KAWS installation positioned at the Ferry Building in San Francisco. It could also be used to position labels that are fixed in space at a specific location, or to build entire AR experiences at a given spot.

To anchor a location more accurately than GPS can on its own, Apple demonstrated ARKit 4 using visual localization, which applies machine learning to match landmarks seen by the camera against a localization map downloaded from Apple Maps for the current area. This appears to be data collected by Apple Maps vehicles to build out the explorable Look Around feature in major cities.
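In code terms, the feature Apple showed maps onto two new ARKit classes, ARGeoTrackingConfiguration and ARGeoAnchor. Here is a minimal sketch of the flow, assuming those WWDC20 APIs; the coordinates (near San Francisco's Ferry Building) and the altitude value are illustrative, not taken from Apple's demo.

```swift
import ARKit
import CoreLocation
import RealityKit

// Sketch: anchor AR content to a real-world coordinate with ARKit 4.
func startGeoTracking(in arView: ARView) {
    // Geotracking only works on supported devices in areas with
    // localization map coverage, so check availability first.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geotracking unavailable: \(error?.localizedDescription ?? "unsupported")")
            return
        }
        DispatchQueue.main.async {
            arView.session.run(ARGeoTrackingConfiguration())

            // Illustrative coordinates; altitude is meters above sea level.
            let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3937)
            let anchor = ARGeoAnchor(coordinate: coordinate, altitude: 10)
            arView.session.add(anchor: anchor)
        }
    }
}
```

Content attached to such an anchor stays fixed at that latitude, longitude, and elevation no matter where the user's session began.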

Apple's #WWDC20 presentation of #ARKit4 casually mentions that Apple Maps has a localization map that can accurately position your iPhone using the camera and machine learning. https://t.co/dM7flEOBR6

-- Daniel Eran Dilger (@DanielEran)


The location matching process is done locally on your phone, with no private information sent back to Apple's servers. This impressive intersection of technologies could be paired with other features Apple has been detailing at WWDC20, such as App Clips, which provide instant functionality including payment using Apple Pay or a user login using "Sign in with Apple."

Future applications of these technologies offer obvious use cases for wearable AR, including the rumored "Apple Glass" that could navigate points of interest and allow a user to interact with anchored AR experiences to find more information or handle a variety of sophisticated app transactions.

Depth API and LiDAR


In addition to anchoring an AR experience to a fixed point in the real world, ARKit 4 also provides advanced scene-understanding capabilities with a new Depth API. This enables it to use the LiDAR scanner on the newest iPad Pro -- rumored to also appear on an upcoming iPhone 12 model -- to rapidly capture a detailed mesh of depth information about the surrounding environment.

Rather than scanning the scene with the camera first, LiDAR enables immediate placement of virtual objects in an AR experience, such as a game that interacts with real-world objects in the room.
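The Depth API exposes that LiDAR data as a per-pixel depth map attached to every ARFrame. A minimal sketch, assuming a LiDAR-equipped device running iOS 14:

```swift
import ARKit

// Sketch: opt a world-tracking session into ARKit 4's scene depth.
func runDepthSession(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)  // LiDAR devices only
    }
    session.run(config)
}

// In an ARSessionDelegate, each frame then carries depth data aligned
// with the camera image; buffer values are distances in meters.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let depth = frame.sceneDepth else { return }
    let depthMap: CVPixelBuffer = depth.depthMap
    let confidenceMap: CVPixelBuffer? = depth.confidenceMap  // per-pixel confidence
    _ = (depthMap, confidenceMap)
}
```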

Apple demonstrated its new Depth and Scene Geometry APIs that use #LiDAR to rapidly map out and identify objects in a scene for rich AR experiences #ARKit4 #WWDC20 pic.twitter.com/rD2KQys4vk

-- Daniel Eran Dilger (@DanielEran)


The new Scene Geometry API can create a topological map of the environment, which can be combined with semantic classification to identify physical objects, distinguish between the floor, walls, and other surfaces, and understand the depth and arrangement of objects in a scene.
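In practice, scene geometry is switched on through the world-tracking configuration, and the reconstructed mesh arrives as anchor updates. A minimal sketch, again assuming a LiDAR device:

```swift
import ARKit

// Sketch: enable scene reconstruction with semantic classification.
func runSceneReconstruction(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    session.run(config)
}

// ARKit delivers and refines the environment mesh via ARMeshAnchor;
// each mesh face can be classified as wall, floor, table, and so on.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        print("Mesh chunk with \(meshAnchor.geometry.faces.count) faces")
    }
}
```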

ARKit 4 can then place virtual objects in front of or behind people and identified objects in the scene; use game physics to simulate realistic interactions between virtual and physical objects; and use raycasting to realistically place and light content, blurring the line between what's real and the digital content augmenting reality.

Face and hand tracking

A third major advancement in ARKit 4 expands face tracking beyond devices equipped with a TrueDepth camera to the new iPhone SE and other products with at least an A12 Bionic processor.

Face tracking captures face anchors and geometry so graphics can be applied to the user's face, either to animate a Memoji-like virtual character that mirrors the user's expressions, or to apply virtual makeup or lighting effects similar to Portrait Lighting in the iOS Camera app.
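Checking for support and reading expression data takes only a few lines of code. A minimal sketch:

```swift
import ARKit

// Sketch: face tracking is gated on isSupported, which is now true on
// A12-or-later devices even without a TrueDepth camera.
func startFaceTracking(_ session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    session.run(ARFaceTrackingConfiguration())
}

// Each ARFaceAnchor carries the face mesh and expression blend shapes,
// which can drive an avatar or position makeup and lighting effects.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let face as ARFaceAnchor in anchors {
        let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        _ = smile  // a 0-to-1 weight for that expression
    }
}
```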

Apple has also added hand tracking to its Vision framework, enabling an iOS device to recognize not just full-body movements but individual poses of the fingers of a hand. One demonstration showed how a user could spell out words in the air, just by the camera watching and identifying precise hand movements.

Vision framework hand capture
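In the Vision framework, this surfaces as a hand-pose request whose individual joints, such as the index fingertip below, are addressable by name. A minimal sketch:

```swift
import Vision

// Sketch: locate an index fingertip with Vision's new hand-pose request.
func findIndexFingertip(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    try? handler.perform([request])

    guard let hand = request.results?.first else { return }
    // Joints come back as normalized points with confidence scores.
    if let tip = try? hand.recognizedPoint(.indexTip), tip.confidence > 0.3 {
        print("Index fingertip at \(tip.location)")
    }
}
```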

Reality tools for building AR experiences

Apple is also providing a new Reality Converter app for bringing 3D models developed in outside digital content creation tools into the usdz file format used by ARKit, as well as an update to Reality Composer for building and testing AR experiences and exporting them in the portable usdz format.

RealityKit also adds support for applying a video texture within an ARKit scene, such as a virtual television placed on a wall, complete with realism attributes such as light emission, alpha, and texture roughness.
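RealityKit exposes this as a video-backed material driven by an AVPlayer. A minimal sketch of the virtual-television idea; the video URL is a placeholder:

```swift
import AVFoundation
import RealityKit

// Sketch: a one-meter-wide 16:9 "virtual TV" textured with live video.
func makeVirtualTV(playing url: URL) -> ModelEntity {
    let player = AVPlayer(url: url)  // placeholder source
    let material = VideoMaterial(avPlayer: player)

    let screen = ModelEntity(
        mesh: .generatePlane(width: 1.0, height: 0.5625),
        materials: [material]
    )
    player.play()
    return screen
}
```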

The work Apple is doing in AR overlaps existing work others have started, notably Google. But Apple's efforts are dramatically supported by the company's vast installed base of high-end, sophisticated devices with depth sensors: the TrueDepth camera on iPhone X and later, and the new LiDAR scanner on the latest iPad Pro.

Apple launched ARKit three years ago and immediately became the world's largest AR platform, meaning that developers have a variety of opportunities for building experiences that large numbers of real-world users can experience. The company is still just getting started, and we can expect it to increasingly deploy new technologies that extend ARKit features in new directions, potentially including wearable glasses and vehicle windshields in addition to its current installed base of iOS and iPadOS handheld mobile devices.

Comments

  • Reply 1 of 13
    EsquireCats Posts: 1,268 member
    Curious if the various tracking technologies could advance to the point where they form a new input method on Apple Watch (e.g., gesture-based handwriting recognition from the built-in accelerometer).
  • Reply 2 of 13

    At WWDC20, Apple outlined futuristic new features coming to ARKit 4. Previous releases of the company's augmented reality platform first enabled basic 3D graphics to appear fixed in place in the video stream from an iOS 11 camera using Visual Inertial Odometry. This let a user explore a virtual object from all sides or play an interactive game fixed via AR onto a table surface. 
    Aside from some cool demos and the one that allows people to explore Apple Park, I have yet to see anything worthwhile come out of this. Are there a bunch of interactive ARKit games on the App Store that I’m missing? Also, what’s the point? How is playing a game that looks like it’s on my table better than playing that game regularly on my phone? I’d have to try it to find out but haven’t actually seen any games that do it.

    Apple had LEGO on stage last year. They demoed some things that actually looked like fun. Has any of that caught on? I have no idea; I don’t live in a LEGO household, but it seemed like a good way for a kid to have more interactive play with something that’s already interactive.

    I tried that Lamborghini ARKit demo from their website a month or so ago. It was really lame. The car basically looked like poor video game graphics. When my SO walked around, it appeared to attempt to use people occlusion so she would be blocking the appropriate parts of the view, but it didn’t work well (lots of flashing of the overlay in her area; sometimes the car would block her so it appeared she came out of the hood instead of standing in front of the hood, that sort of thing). It was really bad and NOTHING like the demos. This was on a 2018 iPad Pro, so I expected more.

    Apple launched ARKit three years ago and immediately became the world's largest AR platform, meaning that developers have a variety of opportunities for building experiences that large numbers of real-world users can experience. The company is still just getting started, and we can expect it to increasingly deploy new technologies that extend ARKit features in new directions, potentially including wearable glasses and vehicle windshields in addition to its current installed base of iOS and iPadOS handheld mobile devices.
    So, we’re 3 years into ARKit. Developers have had all that time to give us “experiences that large numbers of real-world users can experience”, but where are they?

    I fully believe Apple is working on glasses and most of these advancements are aimed at that. While it’s true Apple immediately had the largest AR platform, I don’t think, currently, that really means anything. Has anyone ever been excited enough about ARKit that it compelled them to upgrade their phone? I doubt it. However, I personally know several people who have upgraded due to camera enhancements.
  • Reply 3 of 13
    Dan_Dilger Posts: 1,584 member

    So, we’re 3 years into ARKit. Developers have had all that time to give us “experiences that large numbers of real-world users can experience”, but where are they?

    I fully believe Apple is working on glasses and most of these advancements are aimed at that. While it’s true Apple immediately had the largest AR platform, I don’t think, currently, that really means anything. Has anyone ever been excited enough about ARKit that it compelled them to upgrade their phone? I doubt it. However, I personally know several people who have upgraded due to camera enhancements.
    I think you should first read:  https://appleinsider.com/articles/20/05/23/how-tim-cooks-augmented-reality-vision-paid-off-for-apple
    because that seems to answer your questions. 
  • Reply 4 of 13
    Does this recently-announced Android ability to measure AR depth with a single camera and no other tech signal that superior software is allowing them to catch up to Apple in the field without resorting to more sophisticated and expensive tools? I have no idea how fast the Android method works, or what the relative level of its precision is. What say DED, or anyone else who knows more than me. 

    https://www.androidheadlines.com/2020/06/google-arcore-depth-api-single-camera-ar-public-launch.html
  • Reply 5 of 13
    Dan_Dilger Posts: 1,584 member
    Does this recently-announced Android ability to measure AR depth with a single camera and no other tech signal that superior software is allowing them to catch up to Apple in the field without resorting to more sophisticated and expensive tools? I have no idea how fast the Android method works, or what the relative level of its precision is. What say DED, or anyone else who knows more than me. 

    https://www.androidheadlines.com/2020/06/google-arcore-depth-api-single-camera-ar-public-launch.html
    Google has delivered its own version of a variety of Apple's APIs, from CarPlay to Apple Pay to wearables support. It also has debuted a variety of its own clever ideas, which Apple has copied (like App Clips, Look Around, and Visual location). But think about the commercial relevance of Google Play and apps development. Sure there may be some simple AR experiences that use non-LiDAR/TrueDepth depth to deliver free promotional ad/demo/branding stuff like Snapchat filters, but there's no real commercial force driving apps on cheap commodity phones. The installed base of Samsung phones with ToF is negligible. So it's hard to imagine how Google will retain its interest in AR long enough to matter. It already threw in the towel on VR, after all that Cardboard went nowhere. 
  • Reply 6 of 13
    macplusplus Posts: 2,115 member

    So, we’re 3 years into ARKit. Developers have had all that time to give us “experiences that large numbers of real-world users can experience”, but where are they?

    I fully believe Apple is working on glasses and most of these advancements are aimed at that. While it’s true Apple immediately had the largest AR platform, I don’t think, currently, that really means anything. Has anyone ever been excited enough about ARKit that it compelled them to upgrade their phone? I doubt it. However, I personally know several people who have upgraded due to camera enhancements.
    AR is not new. It has existed since the last century; Google Earth and Mars flyby animations are a couple of examples. What Apple is trying to do is introduce AR into the daily life of mere mortals. Apple Maps already does that with 3D views of cities. Maybe ARKit and MapKit will merge in the future, who knows?

    I remember the very first QuickTime clips from around 1990, featured on the Apple promo CDs. Those were simple animations. QuickTime was named "ReelTime" then. And look where we are now: QuickTime has reshaped the whole video and entertainment industry. In the AR field we've made more progress than in those early days of QuickTime. And I'm sure Apple will succeed in AR without waiting for the arrival of Steve Jobs, because they already have the knowledge and production infrastructure in place.
  • Reply 7 of 13
    mdriftmeyer Posts: 7,503 member
    It’s 2.5D hence Augmented not Virtual Reality.
  • Reply 8 of 13
    KITA Posts: 409 member
    Dan_Dilger said:
    Google has delivered its own version of a variety of Apple's APIs, from CarPlay to Apple Pay to wearables support. It also has debuted a variety of its own clever ideas, which Apple has copied (like App Clips, Look Around, and Visual location). But think about the commercial relevance of Google Play and apps development. Sure there may be some simple AR experiences that use non-LiDAR/TrueDepth depth to deliver free promotional ad/demo/branding stuff like Snapchat filters, but there's no real commercial force driving apps on cheap commodity phones. The installed base of Samsung phones with ToF is negligible. So it's hard to imagine how Google will retain its interest in AR long enough to matter. It already threw in the towel on VR, after all that Cardboard went nowhere. 

    While it's an enterprise-focused product, Microsoft's HoloLens and its services are an excellent demonstration of AR done right.

    Apple and Google are still struggling to show an equally meaningful use of AR that can be applied in the consumer market.
  • Reply 9 of 13
    fastasleep Posts: 6,452 member

    macplusplus said:
    AR is not new. It has existed since the last century; Google Earth and Mars flyby animations are a couple of examples. What Apple is trying to do is introduce AR into the daily life of mere mortals. Apple Maps already does that with 3D views of cities. Maybe ARKit and MapKit will merge in the future, who knows?

    I remember the very first QuickTime clips from around 1990, featured on the Apple promo CDs. Those were simple animations. QuickTime was named "ReelTime" then. And look where we are now: QuickTime has reshaped the whole video and entertainment industry. In the AR field we've made more progress than in those early days of QuickTime. And I'm sure Apple will succeed in AR without waiting for the arrival of Steve Jobs, because they already have the knowledge and production infrastructure in place.
    None of those things you described are augmented reality.
  • Reply 10 of 13
    fastasleep Posts: 6,452 member

    It’s 2.5D hence Augmented not Virtual Reality.
    That's not what that means. These are 3D graphics, not 2D graphics simulating 3D. 

    Augmented Reality = superimposition of *any* sort of graphics/etc over the real-world environment as it appears in front of you
    Virtual Reality = fully simulated 3D environment 
  • Reply 11 of 13
    The AR achievement I'm most looking forward to is an AR monitor. I.e., instead of having the limited-view-cone screen of a laptop, wouldn't it be nice to have a much larger virtual monitor projected in my eyes by my glasses (with prescription ;-)? I could even envision a future where the compute unit would be my iPhone, and it could unfold into a reasonable-size keyboard (maybe even a touch+haptic based keyboard).
  • Reply 12 of 13
    Rayz2016 Posts: 6,957 member
    Curious if the various tracking technologies could advance to the point where they form a new input method on Apple Watch (e.g., gesture-based handwriting recognition from the built-in accelerometer).
    That’s what I was thinking with the “draw in the air” thing: a future iPhone could recognise handwriting by gesturing above the screen. 
  • Reply 13 of 13
    fastasleep Posts: 6,452 member
    daveedvdv said:
    The AR achievement I'm most looking forward to is an AR monitor. I.e., instead of having the limited-view-cone screen of a laptop, wouldn't it be nice to have a much larger virtual monitor projected in my eyes by my glasses (with prescription ;-)? I could even envision a future where the compute unit would be my iPhone, and it could unfold into a reasonable-size keyboard (maybe even a touch+haptic based keyboard).
    Have you not seen the myriad articles on Apple's AR glasses project? Might want to go hit that search on the home page.