Nude image search gets popular photo sharing app 500px pulled from App Store

124 Comments

  • Reply 61 of 85
    Marvin Posts: 15,322, moderator
    cnocbui wrote:
    I wouldn't think it should get pulled at all,  no more than I would think a Web browser should be.

    It's different from a web browser because the company owns the servers where all the images are hosted.
    cnocbui wrote:
    From hearing users characterisations of the site, I very much doubt the images are either illegal or porn.  If you start bandying around silly words like 'indecent', much of art would be targetable.

    If the service only removes actual indecent images after they are flagged by users, it still allows the possibility of people viewing offensive or illegal images. I had a browse through the 500px site and found this:

    http://500px.com/Feast?page=1

    The girls in some images look a little young, and there are images of girls in leather and on all fours on a bed (I'll wait until you get through them; it took me a while because you have to do the whole click, click, right-click > inspect element > resources > drag image to desktop > encrypt). You can see why at least a handful of those pictures would be objectionable.

    The store belongs to Apple, they can put whatever they want in it. If you had a shop, I wouldn't expect to wander in naked without being told to get out. It's perfectly natural to be naked but I can accept that people might not want my crusty scrotum in plain sight. If they allow naked images, then you get images that push things a bit further with suggestive poses like someone naked sucking a lolly or covered in cream. It can be entirely tasteful but it's just easier for Apple to control with a non-nude policy, if that's what they are enforcing here.

    It seems like they just found some images to be a bit too suggestive. There was one of a nude girl covered in melted candle wax. I'm personally a bit desensitised to this stuff and I have a pretty open view of what constitutes art but I'd say some of that stuff is pushing it to the limit of tasteful nudity. I think I'll need to investigate further though.
    cnocbui wrote:
    Should Apple remove all e-book reading apps from the App Store as well, because they could be used with content such as Fifty Shades of Grey, The Story of O, etc, etc?

    I don't know if they should but they seem to have done this:

    http://www.telegraph.co.uk/technology/apple/9821311/Apple-bans-photography-app-500px-over-porn-images.html
    http://www.telegraph.co.uk/technology/apple/7911821/Apple-accused-of-censorship-after-porn-disappears-from-iPad-book-chart.html

    "Apple has a strict ban on pornographic content in the App Store. It has even purged erotic novels from the iPad book chart.

    Blonde and Wet, the Complete Story was ranked first on the iPad in a top 10 that included three other erotic novellas yesterday morning.
    But all four titles disappeared simultaneously and had been replaced with less risqué books, such as Peter Mandelson’s autobiography, by the afternoon."

    Like I say, it's up to them. There has to be a line drawn somewhere. If you just let anything in, you'll end up with stories about all sorts of perverse things, and quality control comes into it as well as censorship.
  • Reply 62 of 85
    evilution Posts: 1,399, member


    If I want porn, I'm just going to use Safari, not some sharing app that might serve me something that wasn't porn.

  • Reply 63 of 85
    nht Posts: 4,522, member


    Well... this was dumb, but then again several App Store decisions have been dumb, so nothing new.

    Apple seems to get more right than wrong, but this was one of the wrong ones...

  • Reply 64 of 85
    sensi Posts: 346, member
    Why can't they keep the previous app version available in the store while waiting for the publisher to "fix" the issue, if any, with an amended update? Jeez.
  • Reply 65 of 85
    hentaiboy Posts: 1,252, member



    "Blue Steel"

  • Reply 66 of 85
    rcfa Posts: 1,124, member

    Quote:

    Originally Posted by hezetation View Post


    I have a huge issue with the way Apple does the 17+ rating; it is too blunt a rating to really be helpful. It would be really nice if Apple had a separate warning & rating for apps that can access the internet, distinct from the 17+ rating, so you could better rate these apps by the type of content they might give access to. I say this because I've run across several apps that have the 17+ rating purely because they can access the internet, but there is nothing about the app that allows access to inappropriate material. Take, for example, some of the kid-safe web browsers that are rated 17+ purely because they access the internet, thus defeating the purpose of restricting content based on age rating. It's about time they revisited this and fixed it; it's been broken for far too long.

    Indeed. It's funny that if you download Opera or some other browser for the iPhone and run the corresponding software updates, you end up with pop-up alerts that make it look like you're about to download porn or other adult content, while the factory-installed Safari offers exactly the same access.


    Maybe the entire iPhone should be X-rated...


    Also, what about PhotoStreams? They can just as easily carry "objectionable" content as the 500px site or any other photography site, such as 1x.com.

  • Reply 67 of 85
    As a longtime devotee of Apple products and as someone who is very familiar with the specific circumstances that led Apple to pull the 500px app, it has been disconcerting to observe the way the issue has been framed by most of the media. Most of the media accounts have portrayed Apple's decision as a response to easy access to nudity in the app. As a consequence, many of the comments here and elsewhere have been expressions of outrage and/or disdain at Apple's seemingly engaging in censorship. Many have rightly pointed out that nudity is easily accessed from a multitude of sources. 
    The fact of the matter is that Apple responded appropriately, swiftly and reasonably when alerted to the presence of child pornography in the 500px app. This is a very different situation from easy access to "artistic" photos of nude adults. An identifiable photographer very recently posted on 500px sexualized, full-frontal nude photos of females who were clearly children. For the most part, the app displayed the art of very skilled photographers, most of whom presented photos of exotic animals, breathtaking landscapes and similar subjects. In addition, and with no special steps required by the viewer, there were photos from a number of photographers presenting adult women in various states of undress. But again, easy access to adult nudity did not appear to be the compelling reason for Apple's action.
    In the United States there are very clear laws prohibiting the transmission of child pornography over the Internet. Most reasonable people would conclude that Apple would not want to be associated with the dissemination of child pornography via an Apple-approved app. Without even speaking to the moral aspect of this situation, about which Apple was no doubt mindful, at the very least a sound business decision was made. Most reasonable people are also likely to conclude that the staff of the 500px app bear the responsibility to effectively monitor their product for the presence of child pornography.
    I believe that Apple deserves a lot of credit for taking this action and I applaud them.
  • Reply 71 of 85
    I'm sure you'd have no problem finding pictures of horrendous violence. God forbid someone see a nipple.

    Wankers.
  • Reply 77 of 85
    Still waiting to hear a logical reason on here that answers this question:

    Why was this app pulled and the Flickr app wasn't? They do almost exactly the same things, and from what I've seen, Flickr is far, far, far, far worse.

    Why hasn't Apple pulled the Flickr app?

    What's the difference?
    Seriously.
  • Reply 78 of 85


    What about Instagram and Hipster? Both have far more graphic content (yes, including penetration), and both are aimed at a younger crowd (the exact group that should not have access to this content). Far more harmful, yet Apple does nothing. This is hypocrisy and discrimination in their purest forms.

    500px provides a toggle to bypass nudity (fine art); Instagram and Hipster do not have a way to exclude pornographic material.

    There is nothing correct about Apple's decision.

  • Reply 79 of 85
    Marvin Posts: 15,322, moderator
    What about Instagram and Hipster?  Both have far more graphic content (yes, including penetration)

    http://instagramers.com/destacados/mama-instagram-removed-my-account/

    "On removed photos and accounts, we (Instagram Team) rely heavily on the Instagram community to help keep the photos on Instagram within our App Store guidelines and Terms of Use.
    When a photo is flagged multiple times for nudity, copyright or another violation of our Terms of Use, it’s automatically removed from the site.
    An excessive number of flags on multiple photos can result in permanent termination of an account.
    Regarding nudity on Instagram, while we respect the artistic integrity of all sorts of photos, we’re doing our best to keep our product and all the photos within it in line with our App Store’s rating for nudity, which means we must remove such content when it’s brought to our attention"
    There is nothing correct about Apple's decision.

    They do apply their rules inconsistently on occasion, but they obviously don't have enough evidence to take action on the other apps. I'm sure if you have a valid complaint about other apps, they'll review them too. There's no way they can police dynamic content easily, so they will rely on user reports. Blame the users for reporting them.
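    To picture what that report-driven moderation amounts to in practice, here's a rough sketch. The type names and thresholds are my own invention for illustration, not Instagram's or Apple's actual code:

    // Hypothetical flag-threshold moderation, along the lines of the quoted Instagram statement.
    // Thresholds and names are invented for illustration only.
    struct Photo {
        let id: Int
        var flags: Int = 0
    }

    final class Account {
        private(set) var photos: [Photo] = []
        private(set) var removedPhotoCount = 0
        private(set) var isTerminated = false

        let removalThreshold: Int      // flags before a photo is automatically removed
        let terminationThreshold: Int  // removed photos before the account is terminated

        init(removalThreshold: Int = 5, terminationThreshold: Int = 3) {
            self.removalThreshold = removalThreshold
            self.terminationThreshold = terminationThreshold
        }

        func upload(_ photo: Photo) {
            photos.append(photo)
        }

        // Each user report bumps the flag count; removal and termination follow automatically.
        func flag(photoID: Int) {
            guard !isTerminated,
                  let index = photos.firstIndex(where: { $0.id == photoID }) else { return }
            photos[index].flags += 1
            if photos[index].flags >= removalThreshold {
                photos.remove(at: index)
                removedPhotoCount += 1
                if removedPhotoCount >= terminationThreshold {
                    isTerminated = true
                }
            }
        }
    }

    The point is that nothing gets looked at until users flag it, which is why dynamic content always lags behind whatever rules the store sets.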

    These kinds of actions tend to get into the porn/art debate, just like torrenting gets into the open internet/closed internet debate. I side with openness wherever possible, but there has to be a limit or porn and digital theft just end up everywhere. Apple doesn't want to be seen to permit anything like that. Google has store rules against pornographic images too, but they want to be seen as just a little more open than Apple, as it works in their favour to portray Apple as Big Brother and themselves as the freedom fighters.

    The addition of bokeh, a vignette and a sepia tone doesn't turn a picture of a naked girl spread-eagled on a bed from porn into art, because pornographers put out the same imagery. It doesn't stop it from being art, but it puts it in the class of also being pornographic. If it was my App Store, I'd create an API that let users sign up for an adults-only username and password (this could even be a use case for a fingerprint sensor), and it could only be set up using a valid credit card. App Store developers who pushed adult content would have to hide their content behind the API. Adult apps and erotic novels that are currently in the open, like the sex-position apps, would be hidden from normal viewing and put behind an adults-only login.
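    As a sketch of what that adults-only tier could look like (the types, the credit-card field and the session unlock are my assumptions, not an existing App Store API):

    // Hypothetical adults-only gate for a store catalog; nothing here is a real Apple API.
    struct StoreAccount {
        let username: String
        let verifiedAdultByCreditCard: Bool  // age verified with a valid credit card at signup
        let unlockedThisSession: Bool        // re-authenticated, e.g. password or fingerprint
    }

    enum ContentRating {
        case general
        case adultsOnly
    }

    struct StoreItem {
        let title: String
        let rating: ContentRating
    }

    // Developers pushing adult content would list it behind this filter rather than
    // in the openly browsable catalog.
    func visibleItems(in catalog: [StoreItem], for account: StoreAccount?) -> [StoreItem] {
        catalog.filter { item in
            switch item.rating {
            case .general:
                return true
            case .adultsOnly:
                guard let account = account else { return false }
                return account.verifiedAdultByCreditCard && account.unlockedThisSession
            }
        }
    }

    Anonymous or unverified browsing would simply never see the adult catalog, which keeps the storefront clean without banning the content outright.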

    But, it's not my store and Apple has decided to go the route of keeping things relatively clean just like Google albeit to a lesser extent. The accusation here is really that Apple isn't dirty enough.
  • Reply 80 of 85
    rcfa Posts: 1,124, member


    Basically, what boggles the mind is this:

    Apple DOES have ratings on apps; you CAN restrict apps according to age ranges.

    Exactly WHAT is rated 17+ if not "explicit" content? I mean, what's the point of being able to download a variety of Kamasutra apps when you can't download an app that might *possibly* be used to access *some* pictures that *might* be construed as being pornographic?

    If 500px had put up its app as a 4+, 9+ or 12+ app, I'd totally understand Apple's reaction, but an app that has a 17+ restriction should not face further scrutiny unless it violates the law of the land. As long as porn is legal, and as long as the app is rated 17+, Apple should stand back. The people who don't want to be exposed to stuff like that can easily restrict app downloads to the 12+, 9+ or 4+ range, depending on how infantile they are. After all, this is the USA and not Saudi Arabia.
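    To put that in concrete terms, the restriction mechanism is nothing more than a ceiling check; here's a rough sketch with invented types (not Apple's actual restrictions API):

    // Hypothetical rating-based restriction: a parental ceiling filters the catalog.
    enum AgeRating: Int, Comparable {
        case fourPlus = 4, ninePlus = 9, twelvePlus = 12, seventeenPlus = 17

        static func < (lhs: AgeRating, rhs: AgeRating) -> Bool {
            lhs.rawValue < rhs.rawValue
        }
    }

    struct App {
        let name: String
        let rating: AgeRating
    }

    // Apps rated above the device's ceiling simply never show up.
    func allowedApps(_ apps: [App], ceiling: AgeRating) -> [App] {
        apps.filter { $0.rating <= ceiling }
    }

    let catalog = [App(name: "Kids Browser", rating: .fourPlus),
                   App(name: "500px", rating: .seventeenPlus)]
    let visibleToKids = allowedApps(catalog, ceiling: .twelvePlus)  // only "Kids Browser"

    If the ceiling is set to 12+, a 17+ app never appears in the first place, which is exactly why a 17+ app shouldn't need any further policing.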


     


    There's really no need for a corporate entity to be the arbiter of what constitutes good or bad taste outside its own products.

    Just imagine you buy a butter knife and it comes with an EULA that prohibits you from using that butter knife as a putty spatula, or to spread oil paint on a canvas. Or you go to an art supply warehouse and the oil paint comes with an EULA that prohibits you from using the paint for nude paintings, because the owner of the paint manufacturer is some born-again nut case. If you buy the damn knife, paint, iPhone, whatever, you should be able to do with it as you please, in bad taste or not; that's why you pay for it. It's not a free loaner.

    I'm sure someone, somewhere has wrapped an iPhone in something and used the vibration mode for something other than the intended purpose. Is the Apple police going to knock on their door and confiscate the device?
