Video: A sneak peek at iOS 11 Augmented Reality at the Apple Park Visitor Center

Posted in Genius Bar, edited September 2017
At WWDC17, Apple introduced ARKit, iOS 11's new developer framework for Augmented Reality. On stage, it demonstrated interactive 3D games spilling across a table, and it offered a hands-on look at a simple app that placed virtual objects on a table its iPads could recognize as a 3D plane. This week, it unveiled a practical new real-world app for exploring the Apple Park project.
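For developers, the basic recipe behind that demo is short. Below is a minimal sketch of an iOS 11 ARKit session with the horizontal plane detection the WWDC app used to recognize a table; this is not Apple's demo code, and the class and variable names are illustrative.

    import UIKit
    import SceneKit
    import ARKit

    // Minimal iOS 11 ARKit setup: world tracking plus horizontal plane
    // detection, rendered through SceneKit.
    class ARViewController: UIViewController, ARSCNViewDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            sceneView.delegate = self
            view.addSubview(sceneView)
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal  // recognize tabletops as 3D planes
            sceneView.session.run(configuration)
        }

        // ARKit calls this when it detects a plane; virtual objects can be
        // anchored to the new node here.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
            print("Detected a plane of extent \(planeAnchor.extent)")
        }
    }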


Something old, something new, something that's $108M just for you

The day before Apple's big event at the Steve Jobs Theater, AppleInsider took some photos of the soon-to-open Visitor Center created for Apple Park, which sits just across from the main pedestrian security checkpoint leading to the walking paths into the sprawling campus.

The new Augmented Reality experience created for Apple Park shows off the potential of ARKit in iOS 11

What wasn't visible until Apple opened up its Visitor Center for a sneak peek reception after the event was the new Augmented Reality experience created for Apple Park, which shows off the potential of ARKit in iOS 11.

When it opens to the public, the Visitor Center will primarily serve as an Apple Store paired with a cafe. It also houses a vast, 11,000 pound model of the Apple Park site (below), which serves as an aluminum foil for the real star of the show: a virtual tour presented in AR on provided iPads -- a feature that distinguishes the site from the company's other retail locations.





Flagship af

The new Apple Park Visitor Center store also has a lot in common with Apple's other big-budget flagship locations. Despite being located in the sleepy suburbs of Cupertino, California (it straddles the city limits of Santa Clara; just beyond its rear wall are quiet, tree-lined culs-de-sac of single-family homes), the new location shares the same kind of spectacular architecture as some of the company's most glamorous pinnacles of merchandising.


Grand Central Terminal



Paris Opera



London Covent Garden


The new store features wonderfully crafted stone stairways (below) that convey the same sort of timeless craftsmanship felt in the stores integrated into historic gems such as New York City's Grand Central Terminal (above top), the former bank at Paris Opera (above middle) or London's spectacular Covent Garden site (above bottom).


Timeless stone stairways...



... and modern walls of glass


The new Apple Park store also feels incredibly modern, with double-high walls of optically-pure glass that--like the breathtaking entrance to the Steve Jobs Theater--act as structural components, not just windows.


Hong Kong IFC



San Francisco's Union Square



The Oculus at World Trade Center


Unlike the cool urban sophistication of glass and steel seen in Hong Kong's Hysan Place and IFC tower (above top), San Francisco's Union Square (above middle) or The Oculus at NYC's World Trade Center (above bottom), the new Apple Park store blends into its orchard-like surroundings by making use of earthy warm wood slats in its ceilings and across its rooftop deck, which offers a view of the vast Spaceship hiding behind a forest of mature trees just across the street.


The earthy, modern AR-enhanced model room.



A view of the Spaceship from the roof

Al;AR: augmented aluminum with an iPad app

The aluminum site model is impressive in its heft but only offers a simplistic rendering of the buildings and landscape of Apple Park. However, when you hold up one of the AR iPads that workers hand you, a new AR exploration app lets you see detailed, photo-realistically rendered images of the actual buildings, complete with streets that feature animated vehicles driving down the road as if in SimCity.

You can flick the rooftops off of buildings to see inside them, revealing the arrangements of internal walls, desks and even animated people walking around. Because the app is depicting the scene using AR, you can look up, across, inside and around the vast model to explore its mini-world of site details.
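Apple hasn't published the tour app's code, but a roof-lifting interaction like that maps neatly onto an ordinary SceneKit hit test. Here's a hedged sketch extending the controller above; the "roof" node-naming convention is my own assumption, not Apple's.

    import UIKit
    import SceneKit

    // Hypothetical "flick the roof off" gesture handler; attach it to the
    // view with a UITapGestureRecognizer in viewDidLoad.
    extension ARViewController {
        @objc func handleTap(_ gesture: UITapGestureRecognizer) {
            let point = gesture.location(in: sceneView)
            guard let tapped = sceneView.hitTest(point, options: nil).first?.node,
                  tapped.name == "roof" else { return }
            SCNTransaction.begin()
            SCNTransaction.animationDuration = 0.4
            tapped.opacity = 0.0  // fade the roof away to reveal the interior
            SCNTransaction.commit()
        }
    }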

There's also a time-of-day slider that renders the AR graphics with lighting and shadows from early morning to high noon and into the evening, when the lights are switched on to give a feel of what it would look like to fly over the site at night.
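A slider like that could plausibly be implemented by sweeping a single shadow-casting directional light across the scene. The sketch below is my guess at the mechanism, with made-up constants, mapping the slider's 0-to-1 value onto a sun angle and intensity:

    import UIKit
    import SceneKit

    // Hypothetical time-of-day control: slider value 0 is early morning,
    // 1 is night; the "sun" sweeps from east to west as it moves.
    class TimeOfDayController {
        private let sunNode = SCNNode()

        init(scene: SCNScene) {
            let sun = SCNLight()
            sun.type = .directional
            sun.castsShadow = true  // shadows track the changing sun angle
            sunNode.light = sun
            scene.rootNode.addChildNode(sunNode)
        }

        @objc func sliderChanged(_ slider: UISlider) {
            let t = slider.value
            let elevation = sinf(t * .pi) * (Float.pi / 3)  // up to 60 degrees high
            let azimuth = t * .pi                           // east to west
            sunNode.eulerAngles = SCNVector3(x: -elevation, y: azimuth, z: 0)
            // Dim toward dusk; a full scene would also switch building lights on.
            sunNode.light?.intensity = CGFloat(max(50, 1000 * sinf(t * .pi)))
        }
    }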

At dusk there's a red sky, and if you look at the grand cafeteria in the early morning, you can see an animated depiction of its massive window-wall-doors opening up to expose the vast indoor space to the outside, taking advantage of the Bay Area's mild climate (similar to the huge doors of the company's recently opened new San Francisco Union Square store).

In addition to looking under the roof to see the inner workings of the Spaceship, you can also lift the lid on the Fitness Center (which has contrasting stone walls, rather than glass like the rest of the buildings on the site).

You can even lift up the historic barn that has sat on the site for decades--long before Hewlett-Packard paved the original orchards to put up a sea of parking lots for its executive briefing center. When you lift the barn, all that's left behind are some bales of hay and what looks like it could be Clarus the dogcow.

You can also lift the lid of the Steve Jobs Theater, which exposes the grand round display area that leads into the theater pit below. Phase 2 buildings are included in the model, but are not yet interactive to look inside (or perhaps that's just to keep the magical mysteries of the new R&D buildings under wraps).



Because the AR experience is dynamically created in software, Apple will be able to continuously update its AR exploration iPads to add new details and quirky Easter eggs to find -- perhaps they'll add a timeline depicting the site's seven years of construction, or a seasonal depiction of Apple Park as its trees lose their fall leaves or sprout new ones in the spring.

Beyond looking at the structures, you can also use the AR iPad app to see how the site's solar panels and energy distribution work. In energy mode, the vast solar arrays on the Spaceship, the main parking structures and the Phase 2 parking and data center building glow in a sunny gold, and energy distribution is depicted by animated, sparkling tubes that connect the various buildings.
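A glow effect like that is the sort of thing emissive materials handle well. As a sketch of one possible technique -- a guess, not Apple's implementation, and the "solarArray" node name is my assumption -- panel nodes could be lit up independently of the scene's sun:

    import UIKit
    import SceneKit

    // Hypothetical "energy mode": give every solar-panel node a gold
    // emissive glow that reads clearly regardless of scene lighting.
    func enterEnergyMode(root: SCNNode) {
        root.enumerateChildNodes { node, _ in
            guard node.name == "solarArray",
                  let material = node.geometry?.firstMaterial else { return }
            material.emission.contents = UIColor(red: 1.0, green: 0.82, blue: 0.25, alpha: 1.0)
            material.emission.intensity = 0.8
        }
    }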

A third airflow mode lets you look at how ventilation systems help to passively cool the buildings. As you walk around the model, you can sense how cool blue air currents blowing in from the Pacific Ocean to the west flow over the site, picking up heat from human activity and the server farms inside, turning into red currents as they pass through.
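That blue-to-red transition maps naturally onto a particle system. Here's a rough sketch of how it might be approximated in SceneKit -- my assumption about the technique, not Apple's implementation -- shifting particle color from cool to warm over each particle's lifetime:

    import UIKit
    import SceneKit

    // Hypothetical airflow visualization: particles emit as cool blue and
    // shift toward red over their lifetime, suggesting air picking up heat.
    func makeAirflowSystem() -> SCNParticleSystem {
        let flow = SCNParticleSystem()
        flow.birthRate = 200
        flow.particleLifeSpan = 4
        flow.particleVelocity = 2
        flow.particleSize = 0.01

        let colorShift = CAKeyframeAnimation()
        colorShift.values = [UIColor.cyan, UIColor.red]
        colorShift.keyTimes = [0, 1]
        flow.propertyControllers = [
            .color: SCNParticlePropertyController(animation: colorShift)
        ]
        return flow
    }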

Apple's un-goggled vision for AR is already here

The AR experience isn't just an interesting way to explore details of the campus that won't open to the public; it also demonstrates how third-party developers can use ARKit to create explorable, augmented worlds that are not just fun to fly over but can also serve as interactive, configurable presentations of information that isn't otherwise obvious or visible.

Beyond playing games that turn your iOS device into a window looking into a fantasy world, ARKit can also enable users to peel back layers of structure and overlay depictions of invisible forces such as electrical energy, magnetic fields and heat onto real world objects.


Directive Games "The Machines"


ARKit can be used to create apps that interactively visualize the activity inside a volcano on the horizon, show elevators rising inside an office tower, trace the migration patterns of birds around a globe, track the real-world movement of subway trains below your feet in a city, or preview what you'd look like wearing a new sweater or a specific cut of jeans.

Apple's use of iPads to debut its visitor center AR tour also shows how useful it is to have an immediately intuitive multitouch interface for configuring your augmented view: you can simply touch and swipe to select what layers you want to explore, what invisible ideas you want to visualize and what time of day you want to see depicted in your AR environment.
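Switching among such layers needs nothing more exotic than a standard control toggling branches of the scene graph. A sketch, with assumed node names of my own invention:

    import UIKit
    import SceneKit

    // Hypothetical layer picker: show exactly one visualization layer at a
    // time. The node names are assumptions for illustration.
    func layerChanged(_ control: UISegmentedControl, root: SCNNode) {
        let layers = ["buildings", "energy", "airflow"]
        for (index, name) in layers.enumerated() {
            root.childNode(withName: name, recursively: true)?
                .isHidden = (index != control.selectedSegmentIndex)
        }
    }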

This comes in contrast to the popular narrative that claims "real" AR or "mixed reality" inherently (or ultimately) demands the kind of goggles that Google and Microsoft have shown off, paired with gloves or handheld controllers to interact with elements in the virtual world.

The hardware expense of Glass, HoloLens or Oculus, paired with a way to interact with what you see, creates a cost barrier that has blunted their mainstream adoption.

Additionally, goggle-based VR experiences are notorious for triggering nausea in many users because they divorce your body position and head movements from your visual perception of the world -- something that Apple's screen-based, handheld AR doesn't do.

While immersive VR goggles have entertaining and practical applications, the new AR worlds Apple has made available in iOS 11 allow the user to contrast regular and augmented views in the same way you might hold up a magnifying glass, seeing a before-and-after view of the difference between reality and the digital augmentation.

ARKit will also serve a much broader market with an installed base nearing 1 billion devices, enabling today's iOS users to experience AR without spending lots of money on an entirely new platform and new hardware up front. As soon as iOS 11 ships on September 19, the ARKit platform will represent by far the largest installed base of AR devices.

Apple hasn't yet detailed when it will open its new Apple Park visitor center to the public, but it's sure to become a popular destination for millions of people to visit, as its other spectacularly crafted retail locations have already become.

Comments

  • Reply 1 of 14
    Apple's ARKit would seem to be a great educational tool if used to its fullest. I'd like to have instruction/assembly manuals with that type of breakdown where you can see exploded views and layers. Whatever changes are made in a product can be instantly updated in an AR manual. However, in some respects it seems like overkill but I'm sure this is the way things are headed for the future. I honestly believe Apple could have quite an advantage with so many devices already able to support AR. Apple should be able to build up a nice AR app center for Apple Services. Apple app developers should be able to quickly get on board and start making money.
    napoleon_phoneapart
  • Reply 2 of 14
    This is the type of stuff one would see in a movie, and it's happening right now! Can't wait to see where this goes and how advanced it can get.
    napoleon_phoneapart radarthekat watto_cobra
  • Reply 3 of 14

    The AR app for star gazing was really impressive. I think this opens up the potential for some real interactive learning.

    I am stoked about the possibilities of this making education really interesting for my kids.

    edited September 2017 Rayz2016 radarthekat watto_cobra
  • Reply 4 of 14
    slurpy Posts: 5,382 member
    Too fucking cool. Loved how you could modify the time of day and see visualizations for air flow, energy, etc. This makes me want to visit the campus even more.
    watto_cobra
  • Reply 5 of 14
    I can really see Apple Park becoming a huge tourist attraction! It seems that Apple is about so much more than just selling computers and phones. It's an experience! Definitely the "Disney" of electronics manufacturers! I can't wait to plan a trip to see this amazing campus!
    radarthekat SpamSandwich watto_cobra
  • Reply 6 of 14
    Imagine AR take-apart views of complex electrical or mechanical devices, small or large. This would be a huge value to on-site technicians in so many fields.
    watto_cobra
  • Reply 7 of 14
    flaneur Posts: 4,526 member
    C'mon, you guys, it's "sneak peek," not "peak."
  • Reply 8 of 14

    Soli owes me an apology... On an earlier AI thread I suggested that Apple could do an AR tour of Apple Park -- Soli said it would be a waste of time and money and too cartoonish... Apparently Apple had different ideas -- a mockup is too cartoonish and an AR rendering is more realistic.

    You can even lift up the historic barn that has sat on the site for decades--long before Hewlett-Packard paved the original orchards to put up a sea of parking lots for its executive briefing center. When you lift the barn, all that's left behind are some bales of hay and what looks like it could be Clarus the dogcow.

    Apple's ARKit would seem to be a great educational tool if used to its fullest. I'd like to have instruction/assembly manuals with that type of breakdown where you can see exploded views and layers. Whatever changes are made in a product can be instantly updated in an AR manual. However, in some respects it seems like overkill but I'm sure this is the way things are headed for the future. I honestly believe Apple could have quite an advantage with so many devices already able to support AR. Apple should be able to build up a nice AR app center for Apple Services. Apple app developers should be able to quickly get on board and start making money.

    Yes!

    Peel back or add layers to show history. Think of the possibilities -- from Thermopylae to the Battle of the Bulge. The movie industry could go crazy... FWIW, it appears that the scanning used in creating Apple Maps identifies the topography, including river/lake/ocean bottoms, land beneath buildings, and natural as well as man-made features.

    Then there's the travel and real estate industries.

    I suspect that it won't be too long before any major construction design will require a realistic AR model instead of a man-made mockup. Think of it: Apple Park II rendered in AR on an iPad.

    Then, there's the actual construction project itself... Project Management tasks intertwined with schedules and resources -- assisted with AR to be completed on budget and on time... I'm an old PERT guy (or was it POP) -- I can see lots of uses for AR in Project Management!

    Then there's [dis]assembly/repair/parts manuals... Not only can Ikea use AR to let you place furniture in your home -- it can show you how to put it together when it arrives (let's see, that's part BX3 goes into part C34...).

    User Guides: Welcome to my world! To illustrate, I just bought a Bostitch pancake compressor for my shop. I kid you not, the User Guide is a 2' x 3' two-sided sheet of paper that unfolds and folds up (if you're lucky) like a map -- a friggin' map... Back in the day, maps were one of the major causes of divorces (right after over/under toilet paper).

    Think of how it's going to be in AR on your iPhone or iPad:
    • you bring up the UG for the tool
    • it shows a virtual image of the tool
    • you pinch-zoom and rotate the image as you wish (back, front, side, top, bottom, etc.)
    • heads-up indications appear to show what you are looking at
    • you zoom into the area of interest and remove anything that blocks your view
    • you tap and get instructions/descriptions that you can drill down to whatever level of detail you need
    Now, consider this: a similar process can be used to provide a Guided Tour of the Tool -- when you first get it -- if you haven't used it for a while or are tuning it up -- or when you are considering whether to buy it... Are you listening, Amazon, Home Depot, Lowe's...
    watto_cobra
  • Reply 9 of 14

    Think of how it's going to be in AR on your iPhone or iPad:
    • you bring up the UG for the tool
    • it shows a virtual image of the tool
    • you pinch-zoom and rotate the image as you wish (back, front, side, top, bottom, etc.)
    • heads-up indications appear to show what you are looking at
    • you zoom into the area of interest and remove anything that blocks your view
    • you tap and get instructions/descriptions that you can drill down to whatever level of detail you need
    Now, consider this: a similar process can be used to provide a Guided Tour of the Tool -- when you first get it -- if you haven't used it for a while or are tuning it up -- or when you are considering whether to buy it... Are you listening, Amazon, Home Depot, Lowe's...
    This is what I don't understand (keeping in mind that I can appreciate all this AR stuff): how is the guided tour of the Apple Park campus, or what you have described here, different using AR than it would be as just an app on the iPad? Aside from having to travel across the country to see the model in front of me in Cupertino. Why can't there just be an app that I could download and do all the same things with? How does an 11,000 lb model on a table enhance the experience? As is mentioned in the article, the app shows people and vehicles moving around "like SimCity". Yeah, like SimCity, so what is the benefit there in using AR to enhance the model? It seems to me it's simply doing something for the sake of doing it.

    There is nothing wrong with your idea for an interactive user guide, but how is this helped by AR? It seems to me like all of that is possible without viewing whatever tool or item live through a camera lens.
  • Reply 11 of 14

    Think of how it's going to be in AR on your iPhone or iPad:
    • you bring up the UG for the tool
    • it shows a virtual image of the tool
    • you pinch-zoom and rotate the image as you wish (back, front, side, top, bottom, etc.)
    • heads-up indications appear to show what you are looking at
    • you zoom into the area of interest and remove anything that blocks your view
    • you tap and get instructions/descriptions that you can drill down to whatever level of detail you need
    Now, consider this: a similar process can be used to provide a Guided Tour of the Tool -- when you first get it -- if you haven't used it for a while or are tuning it up -- or when you are considering whether to buy it... Are you listening, Amazon, Home Depot, Lowe's...
    This is what I don't understand (keeping in mind that I can appreciate all this AR stuff): how is the guided tour of the Apple Park campus, or what you have described here, different using AR than it would be as just an app on the iPad? Aside from having to travel across the country to see the model in front of me in Cupertino. Why can't there just be an app that I could download and do all the same things with? How does an 11,000 lb model on a table enhance the experience? As is mentioned in the article, the app shows people and vehicles moving around "like SimCity". Yeah, like SimCity, so what is the benefit there in using AR to enhance the model? It seems to me it's simply doing something for the sake of doing it.


    I think they could have an app that you download that does most of what was shown without the model. Most of the AR tour could be projected on any flat horizontal surface: floor, table, etc.

    AFAIK, the current developer release of ARKit doesn't yet support recognition of vertical surfaces. Until then, I think they need something like the aluminum model to introduce walls for certain parts of the demo, e.g. the part, at about 33 seconds in, showing how airflow interacts with the building. Or, I could be wrong and there's something else going on.


    There is nothing wrong with your idea for an interactive user guide, but how is this helped by AR? It seems to me like all of that is possible without viewing whatever tool or item live through a camera lens.

    Most [tool] user guides:
    • start with several pages of safety/hazard instructions
    • overall pictures of the tool
    • some drawings -- detailing and highlighting components and features
    • setup procedures (referencing the drawings)
    • adjustment procedures (referencing the drawings)
    • how to use the tool (referencing the drawings)

    What happens is that:
    • drawings and instructions are in different parts of the manual and you have to keep flipping back and forth
    • the nomenclature for a component may be unfamiliar, so you have to do a Genesis search through the pictures, drawings and text to determine what is being discussed
    • you end up flipping back and forth and maybe run out of thumbs...

    For example: [image: a DeWalt tool manual]

    Several advantages to the AR approach:
    • you can manipulate a virtual 3D image
    • as you get closer to a particular area, heads-up indications can appear telling you what you are seeing
    • you can tap and get options to get instructions on: setup; adjustment; use; maintenance; repair; etc.

    You are manipulating the info according to your ad hoc needs -- rather than you compensating for info in a predetermined structure and format.


    To your last point: "It seems to me like all of that is possible without viewing whatever tool or item live through a camera lens."


    Apple Maps on iOS 11 has a Flyover Mode -- in addition to the predetermined Flyover Tour. In Flyover mode you manipulate the view by moving the iDevice and multitouching its display. Oddly, it doesn't use the camera lens -- rather the compass, accelerometer and multitouch gestures.


    Call me crazy, but I think that within a short time ARKit will be available on macOS and Safari -- substituting the Mac display for the camera lens and a Magic Mouse (or some such) for iDevice multitouch gestures.

    When you think about it, Apple Maps on the Mac already lets you manipulate virtual 3D objects in a limited ad hoc fashion.

    edited September 2017
  • Reply 13 of 14

    Several advantages to the AR approach:
    • you can manipulate a virtual 3D image
    • as you get closer to a particular area, heads-up indications can appear telling you what you are seeing
    • you can tap and get options to get instructions on: setup; adjustment; use; maintenance; repair

    You are manipulating the info according to your ad hoc needs -- rather than you compensating for info in a predetermined structure and format.


    To your last point: "It seems to me like all of that is possible without viewing whatever tool or item live through a camera lens."


    Apple Maps on iOS 11 has a Flyover Mode -- in addition to the predetermined Flyover Tour. In Flyover mode you manipulate the view by moving the iDevice and multitouching its display. Oddly, it doesn't use the camera lens -- rather the compass, accelerometer and multitouch gestures.


    Call me crazy, but I think that within a short time ARKit will be available on macOS and Safari -- substituting the Mac display for the camera lens and a Magic Mouse (or some such) for iDevice multitouch gestures.

    When you think about it, Apple Maps on the Mac already lets you manipulate virtual 3D objects in a limited ad hoc fashion.

    I'm still not following why this needs to be 'AR'. I always assumed that the "reality" in "AR" means what I'm seeing through the camera lens, and that that is what is being augmented. If not, where does the reality fit into it? To me, Maps isn't augmenting reality, it's just showing a map. Sure, it's a high quality, sometimes 3D map that I can manipulate and zoom into or out of and rotate around -- it's great. But it isn't 'reality'.

    Using your DeWalt manual example: sure, this can be confusing and time consuming. But that's DeWalt's fault for making the manual that way. Isn't it possible for DeWalt to make a digital manual that I could manipulate in a similar fashion to the way we use Maps, so that I can zoom in or out, remove a piece of the tool to see what's underneath, etc., without using ARKit? There are already apps in the App Store that let me look at all the different systems in the human body, removing some to see what's beneath and how blood flows and muscles work and on and on; they've been available since long before ARKit was announced, and I don't need to point my camera at an actual human for them to work.

    I feel like maybe I'm missing the point.

    EDIT: I guess this is what I'm trying to say: the "AR" in the article is augmenting a model of Apple Park, which could be done almost identically with a 3D model that you could manipulate and 'explode' without viewing anything through a camera lens. To me a "real" AR experience would be walking around the actual Apple Park, using AR to "see through" the walls at how air currents flow and "into" offices. The camera view would show me the actual building and the augmentation would be the extra stuff that gets overlaid.
    edited September 2017
  • Reply 14 of 14

    Several advantages to the AR approach:
    • you can manipulate a virtual 3D image
    • as you get closer to a particular area, heads-up indications can appear telling you what you are seeing
    • you can tap and get options to get instructions on: setup; adjustment; use; maintenance; repair

    You are manipulating the info according to your ad hoc needs -- rather than you compensating for info in a predetermined structure and format.


    To your last point: "It seems to me like all of that is possible without viewing whatever tool or item live through a camera lens."


    Apple Maps on iOS 11 has a Flyover Mode -- in addition to the predetermined Flyover Tour. In Flyover mode you manipulate the view by moving the iDevice and multitouching its display. Oddly, it doesn't use the camera lens -- rather the compass, accelerometer and multitouch gestures.


    Call me crazy, but I think that within a short time ARKit will be available on macOS and Safari -- substituting the Mac display for the camera lens and a Magic Mouse (or some such) for iDevice multitouch gestures.

    When you think about it, Apple Maps on the Mac already lets you manipulate virtual 3D objects in a limited ad hoc fashion.

    I'm still not following why this needs to be 'AR'. I always assumed that the "reality" in "AR" means what I'm seeing through the camera lens, and that that is what is being augmented. If not, where does the reality fit into it? To me, Maps isn't augmenting reality, it's just showing a map. Sure, it's a high quality, sometimes 3D map that I can manipulate and zoom into or out of and rotate around -- it's great. But it isn't 'reality'.

    Using your DeWalt manual example: sure, this can be confusing and time consuming. But that's DeWalt's fault for making the manual that way. Isn't it possible for DeWalt to make a digital manual that I could manipulate in a similar fashion to the way we use Maps, so that I can zoom in or out, remove a piece of the tool to see what's underneath, etc., without using ARKit? There are already apps in the App Store that let me look at all the different systems in the human body, removing some to see what's beneath and how blood flows and muscles work and on and on; they've been available since long before ARKit was announced, and I don't need to point my camera at an actual human for them to work.

    I feel like maybe I'm missing the point.

    EDIT: I guess this is what I'm trying to say: the "AR" in the article is augmenting a model of Apple Park, which could be done almost identically with a 3D model that you could manipulate and 'explode' without viewing anything through a camera lens. To me a "real" AR experience would be walking around the actual Apple Park, using AR to "see through" the walls at how air currents flow and "into" offices. The camera view would show me the actual building and the augmentation would be the extra stuff that gets overlaid.
    A technology is worthless if you can't use it, and that includes whoever produces the actual programs and content. The revolution here is putting the means to make AR in the hands of, well, everyone. You COULD do a lot of stuff before, but if the effort was too high compared to how much you gained -- on both the dev side and the user side -- it would not get done and it would not be used.

    The www came along and there were already ways to move around data on the internet. Many didn't see the point of it in 1993-1994, yet it led to an explosion of content and usage.

    Simplifying leads to expanded markets; that's where Apple has always been, and this is where they are now.

