Apple Maps vans add Spain, three U.S. states to list of survey locations

Posted in General Discussion, edited July 2017
In an update to the Apple Maps webpage on Friday, Apple announced its camera- and sensor-laden mapping vans will hit the streets of Spain and a trio of U.S. states this month.




According to the Apple Maps vehicles webpage, Apple began surveying operations in Spain on Monday, focusing on the Biscay province.

Mapping vans equipped with multiple high-resolution cameras, LiDAR arrays, GPS receivers, distance-measuring equipment and other advanced hardware are scheduled to gather information in Arratia-Nerbioi, Busturialdea, Durangaldea, Enkarterri, Greater Bilbao, Lea Artibai and Uribe-Kosta through Aug. 13.

In the U.S., July marks the start of Maps vehicle operations in New Hampshire, Rhode Island and Vermont. Apple is also moving into new areas in Arizona, Illinois, Maryland, Massachusetts, Michigan, New Jersey, New York, Pennsylvania, Texas and Washington.

In the newly added states, Apple mapping vans will be cruising the streets of the following counties:
  • New Hampshire - Carroll County, Coos County, Grafton County
  • Rhode Island - Bristol County, Kent County, Newport County, Providence County, Washington County
  • Vermont - Addison County, Caledonia County, Chittenden County, Essex County, Franklin County, Grand Isle County, Lamoille County, Orange County, Orleans County, Washington County
Apple notes information gathered during mapping van excursions will be used to enhance Apple Maps, with some of the data finding its way into future updates of the mapping app. As per the company's privacy guidelines, license plates, faces and other identifying information will be blurred in published images.

Apple's mapping vans were first spotted in the San Francisco Bay Area in early 2015. At the time, some theorized the vehicles were part of Apple's autonomous vehicle program, Project Titan. The company attempted to dispel those rumors by publishing the Apple Maps vehicle webpage in June of that year, later labeling the vans with "Apple Maps" indicia, but speculation that data will be applied to self-driving car systems persists.

How Apple intends to use the imaging data in Apple Maps has yet to be revealed, though it is assumed the company is preparing its own version of Google's Street View technology. Alternatively, the information might be used to improve Apple's 3D Flyover feature, perhaps employing augmented reality tools powered by the forthcoming ARKit platform.

Comments

  • Reply 1 of 24
    buzdots Posts: 452 member
    Seeing how I like to drive, I would love to be one of those drivers.  This has got to be one of the coolest jobs around.
  • Reply 2 of 24
    dick applebaum Posts: 12,527 member

     The ARKit Idea is an Interesting One… 

    If you search for ARKit on YouTube, you will see some amazing examples!

    Placing  and manipulating virtual objects in a real scene is a given...

    ...but it appears that the tech exists to:
    • detect real objects in the scene
    • walk around them scanning with the camera
    • convert the real objects to 3D virtual objects
    • replace selected real objects in the scene with their corresponding virtual representations

    Now you have a virtual scene where the real objects can be manipulated and can interact with the virtual objects.
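
    (A minimal sketch of that placement step, assuming a standard ARKit/SceneKit setup -- the class and variable names below are illustrative, not Apple sample code: detect a horizontal plane and drop a virtual cube where the user taps.)

```swift
import UIKit
import ARKit
import SceneKit

// Illustrative only: start world tracking with plane detection,
// then place a 10 cm virtual cube wherever the user taps a detected plane.
class PlacementViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // iOS 11-era ARKit: track the world and look for horizontal surfaces.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)

        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test the tap against planes ARKit has already found.
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }

        // Convert the hit transform into a position and add a cube there.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        let t = hit.worldTransform
        cube.position = SCNVector3(t.columns.3.x, t.columns.3.y + 0.05, t.columns.3.z)
        sceneView.scene.rootNode.addChildNode(cube)
    }
}
```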

    Say, you want to add some virtual furniture to a room but need to rearrange the real furniture to make space...


    There are examples of animating, say, a virtual dancer...  What if you could create a virtual representation of a person, include it in the scene, and manipulate and animate it?


  • Reply 3 of 24
    Soli Posts: 10,035 member
    I've seen one, but it was an all white Ford Transit cargo van. I really hope they've expanded their fleet because it seems a little odd that after so long they've only done 30 states and are now only getting to Spain.
    edited July 2017
  • Reply 4 of 24
    cornchip Posts: 1,950 member
    so excited for this, but when will it all be integrated into products?!?! I can think of so many ways!!!
  • Reply 5 of 24
    Soli said:
    I've seen one, but it was an all white Ford Transit cargo van.
    Yeah, I saw one of those white Ford Transit vans, as well. It was May 12, in the parking lot at work (in Connecticut), driver just parked and sitting there. The side of the van said, "Apple Maps" and below that "maps.apple.com". Other than the relatively small text the van had no other markings. 

    A coworker saw it outside, took a picture, sent a few of us a text and a couple of us went out to take a look. 

    I was hoping to talk to the guy but he was already gone when my shift ended shortly after. I doubt I would have found out much but it may have been interesting. 
    edited July 2017
  • Reply 6 of 24
    Found that photo
  • Reply 7 of 24
    Soli Posts: 10,035 member
    Soli said:
    I've seen one, but it was an all white Ford Transit cargo van.
    Yeah, I saw one of those white Ford Transit vans, as well. It was May 12, in the parking lot at work (in Connecticut), driver just parked and sitting there. The side of the van said, "Apple Maps" and below that "maps.apple.com". Other than the relatively small text the van had no other markings. 

    A coworker saw it outside, took a picture, sent a few of us a text and a couple of us went out to take a look. 

    I was hoping to talk to the guy but he was already gone when my shift ended shortly after. I doubt I would have found out much but it may have been interesting. 
    Yep, that's what I saw, except I don't think the side door had windows, so the text was further forward, but I can't be certain.
  • Reply 8 of 24
    dick applebaum Posts: 12,527 member
    Thinking about the cameras on those mapping vans...

    They have much more capability than just 3D [drive-by] image capture for Maps -- with ARKit* Apple developers can do much, much more.

    For example, from less than a year ago (before ARKit):

    Here's a 3rd-party 3D Sensor attached to an iPad, using the iPad camera to capture and manipulate AR images





    My first thought was that Apple should buy Occipital, the maker of this sensor **


    And another example:





    * When iOS 11 is released this fall, there will be more than 1 billion iDevices capable of manipulating 3D AR images

    ** As it turns out, Apple bought PrimeSense, back in 2013 -- the company that developed the 3D Sensor hardware 

  • Reply 9 of 24
    Soli Posts: 10,035 member
    Thinking about the cameras on those mapping vans...

    They have much more capability than just 3D [drive-by] image capture for Maps -- with ARKit* Apple developers can do much, much more.

    For example, from less than a year ago (before ARKit):

    Here's a 3rd-party 3D Sensor attached to an iPad, using the iPad camera to capture and manipulate AR images





    My first thought was that Apple should buy Occipital, the maker of this sensor **


    And another example:





    * When iOS 11 is released this fall, there will be more than 1 billion iDevices capable of manipulating 3D AR images

    ** As it turns out, Apple bought PrimeSense, back in 2013 -- the company that developed the 3D Sensor hardware 

    To what end? How does this benefit the user? Will this not be a StreetView-like feature in any way?
  • Reply 10 of 24
    dick applebaum Posts: 12,527 member
    Soli said:

    To what end? How does this benefit the user? Will this not be a StreetView-like feature in any way?

    Have you seen [user-controlled] 3D Flyover with AR in iOS 11?





    So, this is using the iDevice camera and motion sensors to navigate and display virtual waypoints on a 3D aerial map.

    Consider that Apple's aerial maps data already has that capability.  I suspect that when Apple adds StreetView to Maps, they will add similar capability to iDevices, including CarPlay and Apple TV.
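
    (Speculation aside, the "turn yourself to look around the map" part is already sketchable with public APIs. Below is a hypothetical example -- the class name and coordinates are mine, not Apple's -- that steers a 3D Flyover camera from the device's motion sensors using MapKit and CoreMotion.)

```swift
import UIKit
import MapKit
import CoreMotion

// Hypothetical sketch: a satellite Flyover map whose camera heading follows
// the direction the device is physically pointing (CMDeviceMotion.heading, iOS 11+).
class FlyoverViewController: UIViewController {
    let mapView = MKMapView()
    let motion = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        view.addSubview(mapView)
        mapView.mapType = .satelliteFlyover   // Apple's 3D Flyover imagery

        // Start looking at midtown Manhattan from 1,200 m up, tilted 60 degrees.
        let midtown = CLLocationCoordinate2D(latitude: 40.7484, longitude: -73.9857)
        mapView.camera = MKMapCamera(lookingAtCenter: midtown,
                                     fromDistance: 1200, pitch: 60, heading: 0)

        // Rotate the virtual camera as the user physically turns the device.
        motion.deviceMotionUpdateInterval = 1.0 / 30.0
        motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { [weak self] data, _ in
            guard let self = self, let data = data else { return }
            let camera = self.mapView.camera.copy() as! MKMapCamera
            camera.heading = data.heading   // degrees from magnetic north
            self.mapView.camera = camera
        }
    }
}
```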

    OK, so far?  

    That's exterior navigation enhanced by AR.

    What about interior navigation -- within a mall... within a campus... within a museum... within a burning building...

    Apple acquired WiFiSLAM in 2013 for its expertise in precise interior mapping.  You can do it with a simple walk-through with an iPhone -- mapping through walls, floors, etc.

    Now, what about internal mapping of the body -- AR combined with body scan data -- navigating the body organs looking for anomalies?

    It's all about navigation!

    With AR -- it's about being there without going there!

    ...and for 1 billion Apple iDevice owners -- the best AR Navigator is the one you have with you.

  • Reply 11 of 24
    Soli Posts: 10,035 member
    Soli said:

    To what end? How does this benefit the user? Will this not be a StreetView-like feature in any way?

    Have you seen [user-controlled] 3D Flyover with AR in iOS 11?

    [video]

    So, this is using the iDevice camera and motion sensors to navigate and display virtual waypoints on a 3D aerial map.

    Consider that Apple's aerial maps data already has that capability.  I suspect that when Apple adds StreetView to Maps, they will add similar capability to iDevices, including CarPlay and Apple TV.

    OK, so far?  

    That's exterior navigation enhanced by AR.

    What about interior navigation -- within a mall... within a campus... within a museum... within a burning building...

    Apple acquired WiFiSLAM in 2013 for its expertise in precise interior mapping.  You can do it with a simple walk-through with an iPhone -- mapping through walls, floors, etc.

    Now, what about internal mapping of the body -- AR combined with body scan data -- navigating the body organs looking for anomalies?

    It's all about navigation!

    With AR -- it's about being there without going there!

    ...and for 1 billion Apple iDevice owners -- the best AR Navigator is the one you have with you.

    I'm even more unsure of what you mean after you posted that video of FlyOver. How exactly am I going to hold my iPhone at that elevation and then turn myself to see an overview of New York? How does this work in CarPlay? I have yet to see how FlyOver is in any way similar to StreetView, despite the number of people who said a bird's-eye view from airplane height above a city was better when trying to look for a location at eye level. 

    Then you jump to "internal mapping of the body"? Let's try to get a reasonable macro-sized example working for consumers before we jump into something on a microscopic level in the medical field.
    edited July 2017
  • Reply 12 of 24
    dick applebaum Posts: 12,527 member

    I predict that Apple will provide an AR tour of the new campus -- inside and out to showcase what can be done with iDevices and ARKit -- and it will blow us away!


  • Reply 13 of 24
    Soli Posts: 10,035 member
    I predict that Apple will provide an AR tour of the new campus -- inside and out to showcase what can be done with iDevices and ARKit -- and it will blow us away!
    Really? One, I'd rather see an edited video of the campus, not some partially finished campus that Apple then had to augment to make it seem like there are desks, people, or whatever augmentation of everyday work life you're imagining. Two, if they have any images, video, or virtual presence of the inside, it will be limited because so much of the campus is off limits to even employees. I'm not even sure executives get free rein of the building and every room available. Three, are you expecting this video with AR added in to be at the next Apple event, or something they post to their website which the user can tour virtually? I understand even less what an augmented video of the campus would provide to the user. Are you talking about virtual reality or augmented reality, like how Pokemon Go works?
  • Reply 14 of 24
    dick applebaum Posts: 12,527 member
    Soli said:
    I predict that Apple will provide an AR tour of the new campus -- inside and out to showcase what can be done with iDevices and ARKit -- and it will blow us away!
    Really? One, I'd rather see an edited video of the campus, not some partially finished campus that Apple then had to augment to make it seem like there are desks, people, or whatever augmentation of everyday work life you're imagining. Two, if they have any images, video, or virtual presence of the inside, it will be limited because so much of the campus is off limits to even employees. I'm not even sure executives get free rein of the building and every room available. Three, are you expecting this video with AR added in to be at the next Apple event, or something they post to their website which the user can tour virtually? I understand even less what an augmented video of the campus would provide to the user. Are you talking about virtual reality or augmented reality, like how Pokemon Go works?

    You can do both -- capture a video with real elements and AR elements then edit it as you please.  Here's one I made 4 years ago:
    • capture using Apple Maps Flyover
    • add Tilt Shift effect in FCPX
    • add audio from iTunes




    Obviously, there was no ARKit available 4 years ago -- it has been available only 32 days.

    I think what you are missing:

    Like a video or a game, VR can only display what an editor or programmer has predetermined you will see.

    AR is Ad Hoc -- given what is available, you determine what you see.


  • Reply 15 of 24
    Soli Posts: 10,035 member
    Soli said:
    I predict that Apple will provide an AR tour of the new campus -- inside and out to showcase what can be done with iDevices and ARKit -- and it will blow us away!
    Really? One, I'd rather see an edited video of the campus, not some partially finished campus where Apple then had to augment to make it seem like there are desks, people, or whatever augmentation for everyday workalike you're imagining. Two, if they have any images, video, or virtual presence of the inside it will be limited because so much of the campus is off limits to even employees. I'm not even sure executives get free reign of the building and every room available. Three, are you expecting this video with AR added in to be at the next Apple event or something they post to their website which the user can tour virtually? I understand even less what an augmented video of the campus would provide to the user. Are you talking about virtual reality or augmented reality, like with how Pokemon Go works?

    You can do both -- capture a video with real elements and AR elements then edit it as you please.  Here's one I made 4 years ago:
    • capture using Apple Maps Flyover
    • add Tilt Shift effect in FCPX
    • add audio from iTunes
    [video]

    Obviously, there was no ARKit available 4 years ago -- it has been available only 32 days.

    I think what you are missing:

    Like a video or a game, VR can only display what an editor or programmer has predetermined you will see.

    AR is Ad Hoc -- given what is available, you determine what you see.
    1) AR also only lets you see what the programmer has predetermined you will see. Just check out any example of ARKit on YouTube where there are additional elements added to an otherwise natural surrounding.

    2) Why would I want ARKit to add unnatural elements to a video over Apple allowing me to move through a user-controlled, 3D video or image montage of, say, Apple Campus walkways or any other open area, not unlike StreetView?
    edited July 2017
  • Reply 16 of 24
    dick applebaum Posts: 12,527 member
    Soli said:
    Soli said:

    To what end? How does this benefit the user? Will this not be a StreetView-like feature in any way?

    Have you seen [user-controlled] 3D Flyover with AR in iOS 11?

    [video]

    So, this is using the iDevice camera and motion sensors to navigate and display virtual waypoints on a 3D aerial map.

    Consider that Apple's aerial maps data already has that capability.  I suspect that when Apple adds StreetView to Maps, they will add similar capability to iDevices, including CarPlay and Apple TV.

    OK, so far?  

    That's exterior navigation enhanced by AR.

    What about interior navigation -- within a mall... within a campus... within a museum... within a burning building...

    Apple acquired WiFiSLAM in 2013 for its expertise in precise interior mapping.  You can do it with a simple walk-through with an iPhone -- mapping through walls, floors, etc.

    Now, what about internal mapping of the body -- AR combined with body scan data -- navigating the body organs looking for anomalies?

    It's all about navigation!

    With AR -- it's about being there without going there!

    ...and for 1 billion Apple iDevice owners -- the best AR Navigator is the one you have with you.

    I'm even more unsure of what you mean after you posted that video of FlyOver.  How exactly am I going to hold my iPhone at that elevation and then turn myself to see an overview of New York? 

    Well, you could:

    1) spin around until the camera is pointed in the direction of interest -- then move your iPhone away and back to navigate to the desired location

    2) or with your fingers rotate the map -- then drag/zoom to the desired location

    How does this work in CarPlay? 

    I suspect that in CarPlay, Apple will optionally display a 3D map or Hybrid view with AR POI flags displayed as they come into range.  The flags could show traffic, distance, ETA, etc. based on your current driving progress.  In addition, you could apply a filter to display only POIs that are currently of interest to you.
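
    (How Apple would actually surface this in CarPlay is unknown, but the distance-and-category filter itself is simple to sketch with plain MapKit types -- everything below is hypothetical.)

```swift
import MapKit

// Hypothetical POI type; Apple's real data model is not public.
struct POI {
    let name: String
    let category: String          // e.g. "fuel", "food", "parking"
    let coordinate: CLLocationCoordinate2D
}

// Keep only POIs of the chosen categories within `radius` meters of the user,
// and turn them into map annotations ("flags") shown as they come into range.
func annotations(for pois: [POI],
                 near user: CLLocation,
                 within radius: CLLocationDistance,
                 categories: Set<String>) -> [MKPointAnnotation] {
    return pois
        .filter { categories.contains($0.category) }
        .filter { poi in
            let location = CLLocation(latitude: poi.coordinate.latitude,
                                      longitude: poi.coordinate.longitude)
            return location.distance(from: user) <= radius
        }
        .map { poi in
            let flag = MKPointAnnotation()
            flag.title = poi.name
            flag.coordinate = poi.coordinate
            return flag
        }
}
```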
     
    I have yet to see how FlyOver is in any way similar to StreetView, despite the number of people who said a bird's-eye view from airplane height above a city was better when trying to look for a location at eye level. 

    Currently FlyOver and StreetView are dissimilar -- but they need not be.  They could be variations on a theme -- each augmenting the other to provide you with a more complete perspective of what you are seeing.


    Here's a video made 4 years ago showing the Westin hotel across from the Tuileries Garden.  This uses both Flyover and StreetView.  Note that StreetView is more realistic (and likely, more dated) -- but Flyover, while less realistic, can take you to places where StreetView can't go -- into the Tuileries Garden and the inner courtyards of the hotel.  It isn't shown in this video, but with Flyover you can position yourself to look up the Rue de Rivoli to see the Eiffel Tower, and such.





    Then you jump to "internal mapping of the body"? Let's try to get a reasonable macro-sized example working for consumers before we jump into something on a microscopic level in the medical field.

    I've seen concept videos that show just that!

  • Reply 17 of 24
    dick applebaum Posts: 12,527 member
    Soli said:

    1) AR also only lets you see what the programmer has predetermined you will see. Just check out any example of ARKit on YouTube where there are additional elements added to an otherwise natural surrounding.

    2) Why would I want ARKit to add unnatural elements to a video over Apple allowing me to move through a user-controlled, 3D video or image montage of, say, Apple Campus walkways or any other open area, not unlike StreetView?
    We're quibbling about details.

    Watch the video below.  In the first section, you are selecting furniture from a catalog.  Sure, some programmer has digitized 3D images of various furniture pieces -- that's the predetermined part.

    But you decide what to place, its orientation, and where to place it -- that's the ad hoc part.  You can resize it, place it in your office, on the front lawn, on your kitchen table, in the bed of your truck...

    You are in charge of manipulating whatever objects are available in ways and places unknown to the programmer.
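
    (To make the predetermined-vs-ad hoc split concrete: the model itself is the programmer's piece, but with a couple of gesture recognizers its size and orientation are entirely up to the user. A hypothetical sketch, assuming a furniture node has already been placed in an ARSCNView -- the names are illustrative.)

```swift
import UIKit
import ARKit
import SceneKit

// Hypothetical helper: attach pinch and rotation gestures to an ARSCNView
// so the user, not the programmer, decides the size and orientation of a placed model.
final class FurnitureManipulator: NSObject {
    private let node: SCNNode   // an already-placed furniture model

    init(sceneView: ARSCNView, node: SCNNode) {
        self.node = node
        super.init()
        sceneView.addGestureRecognizer(
            UIPinchGestureRecognizer(target: self, action: #selector(pinch(_:))))
        sceneView.addGestureRecognizer(
            UIRotationGestureRecognizer(target: self, action: #selector(rotate(_:))))
    }

    // Pinch to resize the piece, applied incrementally.
    @objc private func pinch(_ gesture: UIPinchGestureRecognizer) {
        let s = Float(gesture.scale)
        node.scale = SCNVector3(node.scale.x * s, node.scale.y * s, node.scale.z * s)
        gesture.scale = 1
    }

    // Two-finger rotation to spin it in place.
    @objc private func rotate(_ gesture: UIRotationGestureRecognizer) {
        node.eulerAngles.y -= Float(gesture.rotation)
        gesture.rotation = 0
    }
}
```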



  • Reply 18 of 24
    foggyhill Posts: 4,767 member
    Street View gives you one POV along one path. If you can reconstruct the environment, you could go into it instead of just looking at it. That would make Street View look as quaint as the nickelodeon did once it was supplanted by actual films.
  • Reply 19 of 24
    Soli Posts: 10,035 member
    Soli said:

    1) AR also only lets you see what the programmer has predetermined you will see. Just check out any example of ARKit on YouTube where there are additional elements added to an otherwise natural surrounding.

    2) Why would I want ARKit to add unnatural elements to a video over Apple allowing me to move through a user-controlled, 3D video or image montage of, say, Apple Campus walkways or any other open area, not unlike StreetView?
    We're quibbling about details.

    Watch the video below.  In the first section, you are selecting furniture from a catalog.  Sure, some programmer has digitized 3D images of various furniture pieces -- that's the predetermined part.

    But you decide what to place, its orientation, and where to place it -- that's the ad hoc part.  You can resize it, place it in your office, on the front lawn, on your kitchen table, in the bed of your truck...

    You are in charge of manipulating whatever objects are available in ways and places unknown to the programmer.

    [video]
    You keep posting videos of people augmenting the reality of what the camera is actually recording, yet then you go on to describe something more in line with a 360° view, which already exists. You even mentioned Apple Park, and I've repeatedly asked you what needs to be augmented in the camera's view of Apple Park—inside or out—that has you so excited. I even gave you examples of augmentation, like chairs, and you then just posted a video that includes a chair added to a video. How the fuck would it benefit Apple to program—I have no idea how you think those added elements don't require programming—fake chairs, fake desks, fake people, fake trees, fake bikes, whatever into Apple Park over showing actual video and images of Apple Park? I can think of no way that makes viewing a completed Apple Park a better experience from behind a computer display.

    You are not in charge of the augmentation. The elements in every single video were added by a developer. That's akin to saying that a video game has no programmer because you can move the character on the screen as you see fit. 
  • Reply 20 of 24
    dick applebaum Posts: 12,527 member
    Soli said:
    Soli said:

    1) AR also only lets you see what the programmer has predetermined you will see. Just check out any example of ARKit on YouTube where there are additional elements added to an otherwise natural surrounding.

    2) Why would I want ARKit to add unnatural elements to a video over Apple allowing me to move through a user-controlled, 3D video or image montage of, say, Apple Campus walkways or any other open area, not unlike StreetView?
    We're quibbling about details.

    Watch the video below.  In the first section, you are selecting furniture from a catalog.  Sure, some programmer has digitized 3D images of various furniture pieces -- that's the predetermined part.

    But you decide what to place, its orientation, and where to place it -- that's the ad hoc part.  You can resize it, place it in your office, on the front lawn, on your kitchen table, in the bed of your truck...

    You are in charge of manipulating whatever objects are available in ways and places unknown to the programmer.

    [video]
    You keep posting videos of people augmenting the reality of what the camera is actually recording, yet then you go on to describe something more in line with a 360° view, which already exists. You even mentioned Apple Park, and I've repeatedly asked you what needs to be augmented in the camera's view of Apple Park—inside or out—that has you so excited. I even gave you examples of augmentation, like chairs, and you then just posted a video that includes a chair added to a video. How the fuck would it benefit Apple to program—I have no idea how you think those added elements don't require programming—fake chairs, fake desks, fake people, fake trees, fake bikes, whatever into Apple Park over showing actual video and images of Apple Park? I can think of no way that makes viewing a completed Apple Park a better experience from behind a computer display.

    You are not in charge of the augmentation. The elements in every single video were added by a developer. That's akin to saying that a video game has no programmer because you can move the character on the screen as you see fit. 
    As I stated: "Sure, some programmer has digitized 3D images of various furniture pieces."

    I guess I can't convince you that there is value in being able to view and manipulate things that don't exist.  I have to wonder if Apple Park would ever have been built if people couldn't visualize something that didn't exist (through artist renderings and models).

    Actually, I (the user) am in charge of the augmentation -- I can take a picture of, say, the family sitting around a table I built, and put it in the garage, back yard, neighbor's yard, the street... or not... my choice... I can even film the process if I choose!


    Mmm, correct me if I'm wrong, but it occurs to me that you've never experienced AR.  The actual experience is quite different from watching videos of AR.
    edited July 2017