Internet discovers Apple's Photos object detection feature, controversy inevitably ensues
Twitter on Monday caught wind of how specific a now year-old object recognition feature in Apple Photos can get, with users expressing concern over the system's ability to detect "brassieres."
"Discovered" by Twitter user ellieeewbu, the Photos feature in question introduced at WWDC in 2016 integrates machine learning and AI technology to detect and tag faces, objects and places in user photos. The key word here is tag; no data is offloaded, no folders are created and no images are "collected."
Highlighted by Quartz, ellieeewbu's concern is that Apple is saving photos of women's undergarments and creating a folder containing said pictures. That assumption is wrong about Apple saving photos (the feature analyzes pictures already in a user's album), but correct in noting that iOS creates categories -- not folders -- for viewing previous Photos searches.
On iOS and macOS, Photos features an AI trained to enable image searches for anything from dogs and cats to aquariums, haversacks and various sports. As noted by developer Kenny Yin in a Medium post detailing the 4,432 different scenes and objects Photos is capable of recognizing, the app can also detect facial expressions and popular locations. Apple has likely grown the list substantially since Yin first published his rundown over a year ago.
Importantly, and unlike other services like Google Photos, all photo analysis in Apple's app is performed locally, meaning no data is sent to the cloud. Thanks to machine learning technology, pictures are analyzed and -- if an object, person or place is recognized -- assigned metadata, enabling quick searches for "brassiere" or "brassieres" without sending picture information to an offsite processing farm.
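Apple does not document Photos' internal pipeline, but its public Vision framework (macOS 10.15 and iOS 13 or later) exposes the same kind of on-device classification to third-party developers. The sketch below is not Photos' actual code, only an illustration of the general idea: analysis runs locally and yields labels that could serve as searchable metadata. The function name and confidence threshold are illustrative choices.

```swift
import Foundation
import Vision

// Minimal sketch of on-device image classification with Apple's public
// Vision framework. This is NOT Photos' internal pipeline; it only
// illustrates local analysis producing searchable labels without any
// image data leaving the device.
func classifyLocally(imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()      // built-in scene/object taxonomy
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])              // runs entirely on-device

    guard let observations = request.results as? [VNClassificationObservation] else {
        return []
    }
    // Keep reasonably confident labels; in a Photos-like app these would be
    // stored as metadata and matched against a user's search query.
    return observations
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```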
The perceived problem could stem from a user interface quirk that automatically creates categories from past searches. For example, searching for "baseball" creates an easily accessible list of photos that were deemed to include baseball-themed imagery and therefore bear that metadata. These lists are not quite folders in the traditional sense, as Photos does not move images into distinct folders. Instead, Photos categories are pulled from the larger whole, in this case a user's photo album; a toy sketch of the distinction follows.
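Conceptually, a search-derived "category" is just a saved query over a single photo library, not a second copy of the data. In this toy model (all type and property names hypothetical), searching filters the existing collection by its tags; nothing is moved or duplicated.

```swift
// Hypothetical toy model: a category is a filtered view over the one
// photo store, not a folder that images get moved or copied into.
struct Photo {
    let filename: String
    let tags: Set<String>   // labels assigned by on-device analysis
}

struct Library {
    let photos: [Photo]     // the single store of images

    // A "category" is simply the result of re-running a past search.
    func category(matching term: String) -> [Photo] {
        photos.filter { $0.tags.contains(term) }
    }
}

let library = Library(photos: [
    Photo(filename: "IMG_0001.jpg", tags: ["baseball", "outdoor"]),
    Photo(filename: "IMG_0002.jpg", tags: ["dog", "park"]),
])
// References IMG_0001.jpg; the file itself never moves.
let baseballCategory = library.category(matching: "baseball")
```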
In its own commentary, The Verge points out that Photos lacks similar search options for male undergarments like "boxers" or "briefs." Whether the exclusion is intentional or simply an oversight is unknown.
"Discovered" by Twitter user ellieeewbu, the Photos feature in question introduced at WWDC in 2016 integrates machine learning and AI technology to detect and tag faces, objects and places in user photos. The key word here is tag; no data is offloaded, no folders are created and no images are "collected."
Highlighted by Quartz, ellieeewbu's concern is that Apple is saving photos of women's undergarments and creating a folder containing said pictures. The assumption is incorrect in assuming Apple saves photos (the feature analyzes pictures already in a user's album), but it is correct in noting iOS creates categories -- not folders -- for viewing previous Photos searches.
On iOS and macOS, Photos features an AI trained to enable image searches for anything from dogs and cats to aquariums, haversacks and various sports. As noted by developer Kenny Yin in a Medium post detailing the 4,432 different scenes and objects Photos is capable of recognizing, the app can also detect facial expressions and popular locations. Apple has likely grown the list substantially since Yin first published his rundown over a year ago.
Importantly, and unlike other services like Google Photos, all photo analysis in Apple's app is performed locally, meaning no data is sent to the cloud. Thanks to machine learning technology, pictures are analyzed and -- if an object, person or place is recognized -- assigned metadata, enabling quick searches for "brassiere" or "brassieres" without sending picture information to an offsite processing farm.
The perceived problem could stem from a user interface quirk that automatically creates categories from past searches. For example, searching for "baseball" creates an easily accessible list of photos that were deemed to include baseball themed imagery and therefore bear that metadata. These lists are not quite folders in the traditional sense, as Photos does not move images into distinct folders. Instead, Photos categories are pulled from the larger whole, in this case a user's photo album.
In its own commentary, The Verge points out that Photos lacks similar search options for male undergarments like "boxers" or "briefs." Whether the exclusion is intentional or simply an oversight is unknown.
Comments
Oh, and..
Honestly, fuck these people. YOU took photos of yourself having sex with your phone. That's Apple's fault, why? Why did you take them if you didn't want to see them? Damn these narcissistic attention whores.
- your real name and where you live, your phone number and email addresses
- all about your family, kids, parents, mistresses
- can identify you in any photo that anyone takes or in media anywhere in the world, if ever shared on the platform or publicly online
- where you work and what you think of your boss and coworkers, and have a good idea of how much you make
- who and what you like, dislike, interests, hobbies
- where you like to go for dinner, activities, vacation
- political views, social views, sexual orientation
- and through correlation of credit card purchase info with mobile identifier can pretty much determine what you buy / when
So other than your social insurance number and banking information, they know everything about their subscribers. Today that info is not shared, as it is more valuable to them for targeted advertising. Will that always remain the case...
But hey, Apple having on-device capability to identify scenes / objects that isn't shared is the bigger issue as usual...(face palm)
But that's fucking fine, right? Just to put an exclamation mark on the absurdity of this outrage.
It is kind of hit and miss for me. Not as amazing as I thought it would be. But I'm impressed it's there.
The Patriot Act may afford some privacy protection to some (in principle; that's widely questioned), yet is the potential reach only a bug, hack or iOS 'upgrade' away...?
It has been suggested by one (for the macOS Photos) "...try pasting the following command into Terminal, then press enter: launchctl unload -w /System/Library/LaunchAgents/com.apple.photoanalysisd.plist"...
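Presumably the inverse would re-enable analysis, assuming the usual launchd behavior: launchctl load -w /System/Library/LaunchAgents/com.apple.photoanalysisd.plist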