Apple taking Maps 'to the next level' in iOS 12
Not content with the uneven coverage in Apple Maps since its 2012 launch, Apple on Friday revealed that it will debut limited first-party map data in the next iOS 12 beta, expanding that coverage over time.
Next week's first-stage rollout will cover just San Francisco and the rest of the Bay Area, expanding to Northern California by the fall and to the rest of the U.S. over the following year, the company told TechCrunch. To get the necessary data, the company has been doing first-party collection using iPhones and the Apple Maps vehicles that have been roaming cities around the world.
Every version of iOS will eventually be able to see the new maps. Apple also hopes to react more quickly to road and construction changes, and to make the app's graphics more visually detailed depending on context, including richer foliage, pools, pedestrian paths, and ground cover.
The effort has reportedly been in progress for four years, with the ultimate goal of completely excising third-party data. The hodgepodge of third-party sources Apple has used so far has sometimes been blamed for Maps' shortcomings.
"We wanted to take this to the next level," said Apple's senior VP of internet software and services, Eddy Cue. "We have been working on trying to create what we hope is going to be the best map app in the world, taking it to the next step. That is building all of our own map data from the ground up."
At the moment, corrections and updates have to pass through submission and validation, but Cue indicated that the company will soon be able to change anything in Maps in real time, and far more frequently than today.
Friday's report also exposed more information about Apple Maps vans, which have been on streets since 2015. Each one is equipped with GPS, eight cameras, and four LiDAR sensors, as well as a device attached to a rear wheel that ensures proper recording of distance and images. Inside is a Mac Pro bolted to the floor, in turn connected to an assortment of SSDs for storage and a dashboard-mounted iPad, where the actual map capture software runs.
Each driver is accompanied by an operator who makes sure the necessary roads are covered and images are collected properly. In addition to images, though, the vans are creating 3D point clouds.
After a completed run, data is saved to the SSDs, which are pulled out, packed into a case and delivered to an Apple data center where software is used to strip out private information such as faces and license plates. Both the vans and the data center have their own encryption keys.
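The report doesn't describe the scrubbing software itself, but the step it outlines, detecting faces and license plates and removing them before anyone works with the imagery, is conceptually similar to the sketch below. This is an illustration only: it uses OpenCV's stock Haar cascades as a stand-in, not Apple's actual tooling.

```python
# Illustrative sketch of the privacy-scrubbing step described above.
# Not Apple's pipeline: OpenCV's bundled Haar cascades are a stand-in.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
plate_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_russian_plate_number.xml")

def scrub_image(in_path: str, out_path: str) -> None:
    """Blur anything that looks like a face or a license plate."""
    image = cv2.imread(in_path)
    if image is None:
        return
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    regions = list(face_detector.detectMultiScale(gray, 1.1, 5)) + \
              list(plate_detector.detectMultiScale(gray, 1.1, 5))
    for (x, y, w, h) in regions:
        # Replace each detected region with a heavily blurred copy of itself.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(
            image[y:y + h, x:x + w], (51, 51), 0)
    cv2.imwrite(out_path, image)
```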
Apple is also relying on its millions of iPhone customers to passively and actively improve its data, while anonymizing and segmenting what it collects in a way that preserves privacy.
"We specifically don't collect data, even from point A to point B," Cue claimed. "We collect data -- when we do it -- in an anonymous fashion, in subsections of the whole, so we couldn't even say that there is a person that went from point A to point B. We're collecting the segments of it. As you can imagine, that's always been a key part of doing this. Honestly, we don't think it buys us anything [to collect more]. We're not losing any features or capabilities by doing this."
To further improve content, iPhone and van data are being combined with high-resolution satellite imagery and computer vision analysis to detect addresses, street signs, and points of interest. This is cross-referenced with public data, including construction projects from city planning departments. Point clouds and images are used to identify signs, lanes, and other objects, which can be assigned to different categories.
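The article stays at a high level, but the cross-referencing it describes, sorting detected objects into categories and checking the temporary ones against city records, might look roughly like the sketch below. The category keywords, field names, and permit feed are invented for illustration.

```python
# Hypothetical sketch of categorizing detected street objects and
# cross-referencing them with public construction permits.
from typing import Dict, List

SIGN_CATEGORIES = {
    "stop": "regulatory",
    "yield": "regulatory",
    "speed limit": "regulatory",
    "one way": "guide",
    "road work": "temporary",
}

def categorize(detections: List[Dict]) -> List[Dict]:
    for det in detections:
        label = det["label"].lower()
        det["category"] = next(
            (cat for key, cat in SIGN_CATEGORIES.items() if key in label),
            "other")
    return detections

def match_permits(detections: List[Dict], permits: List[Dict]) -> List[Dict]:
    """Pair temporary signage with an active permit at roughly the same spot."""
    matches = []
    for det in detections:
        if det["category"] != "temporary":
            continue
        for permit in permits:
            close_enough = (abs(det["lat"] - permit["lat"]) < 0.001 and
                            abs(det["lon"] - permit["lon"]) < 0.001)
            if close_enough and permit["status"] == "active":
                matches.append({**det, "permit_id": permit["id"]})
    return matches
```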
A special team at Apple is developing a toolkit that will be used by hundreds of human editors to further scrutinize street data. This includes correctly assigning 3D geometry to objects for Flyover, and adjusting the precise location of addresses so that they correspond with entrances.
"When we take you to a business and that business exists, we think the precision of where we're taking you to, from being in the right building," Cue noted. "When you look at places like San Francisco or big cities from that standpoint, you have addresses where the address name is a certain street, but really, the entrance in the building is on another street. They've done that because they want the better street name. Those are the kinds of things that our new Maps really is going to shine on. We're going to make sure that we're taking you to exactly the right place, not a place that might be really close by."
The executive added that people shouldn't expect to see a massive visual overhaul, at least in the near future.
"You're not going to see huge design changes on the maps. We don't want to combine those two things at the same time because it would cause a lot of confusion," he said.
Comments
I saw one two days ago! It was cruising along the Queens Road in Reading. I thought it was a fake because it had a sign on the side that said Apple Maps, and I thought all the mapping vans were plain white. Maybe it was one of the new ones they’re running themselves.
Anyone else in the UK seen one?
Yes, they should close down and give the money back to the shareholders.
But I still use it over Google Maps if I don’t need recommendations for cycling, so I hope they are serious about an upgrade, and I hope they’ll add cycling data soon.
As ambitious as building a first-party electric car and a worldwide charging network? (Which they are doing.)
I just finished using Apple Maps to navigate through 6 European cities: Munich, Berlin, Prague, Vienna, Venice, Innsbruck. It worked perfectly, including Lane Guidance which is really helpful on unfamiliar roads. I'm looking forward to the improved accuracy this article describes.
I bit the bullet and installed Google Maps, because I needed to find a place to eat breakfast with my kids. It blew my mind to see how far ahead of Apple Google is. In contrast to the absolute poverty of Apple's data in this city, Google had Street View of every street in town. AND it had 3D models of many major landmarks, such as the main cathedral.
I love Apple, and I am super glad to hear that they're still putting their shoulder to the wheel and are committed to improving Maps, because compared to Google, they have a very long way to go. Maybe that's not true in big cities like San Fran or New York, but in smaller places it is very much the case.
Not exactly a deal breaker, but I was mildly annoyed for about eight seconds.
I do, however, remain sceptical that Apple will be able to match Google Maps' search content and relevance without its own search engine. The coupling of the two is really powerful when it comes to returning relevant mapping suggestions. I can still sit here in the UK, search for a partial business name, and get an address in the US as my second or third result. Much as I like the US, I don't expect to pop over the pond for such trivial needs.
The lane guidance is really handy.
And now it seems to know when I’m heading home; I get a nice proactive warning about heavy traffic.