After 500px App Store flap, Tumblr update warns users of age-restricted material

Posted in General Discussion, edited January 2014
iOS app developers are seemingly taking notice of the trouble picture-sharing app "500px" faced earlier in January, as App Store veteran Tumblr is now warning users of possible exposure to adult content with its latest update.

Tumblr's new age-restriction warning. Source: Tumblr via the App Store


As noted by The Verge, users downloading Wednesday's release of Tumblr version 3.2.4 were met with Apple's "age-restricted material" pop-up box, forcing them to confirm that they were over the age of 17 before installing the app.

While the update's release notes only mention "small bug fixes," the 17+ content warning was a definite change, as seen in a cached version of the Tumblr version 3.2.3 iTunes Preview page.

It is unclear whether Tumblr was asked by Apple to make the change or did so proactively in light of a recent kerfuffle over artistic nude photographs in the popular image-sharing apps 500px and ISO500. In that case, 500px, which had been available for over 16 months, was flagged for violating App Store policy and pulled. It returned to the App Store on Tuesday with an adult-content warning.

Apple's policy clearly prohibits the distribution of nude or pornographic material through the App Store, meaning any app with access to the internet or certain adult-themed blogs should carry the 17+ rating. Enforcement is inconsistent, however, as evidenced by Tumblr's app, which had until this week gone untouched.

Adding to the issue is the new Twitter-owned app Vine, a short-form video-sharing service that lets users record and post six-second clips. The app was featured as an App Store "Editors' Choice" until it was discovered that hardcore pornography could easily be found through the service's hashtag search function. Vine, which ironically carries the subtitle "Make a scene," remains available in the App Store without an adult-content rating.

Comments

  • Reply 1 of 11
    Are similar actions taken on Google Play?
  • Reply 2 of 11

    Quote:

    Originally Posted by PhilBoogie View Post

    Are similar actions taken on Google Play?

    Yes, Google also enforces a similar policy, and apps have been removed from Google Play for having sexual content.

  • Reply 3 of 11
    I am really confused. Why hasn't Yahoo's Flickr been mentioned in any of this? Is anyone aware that Flickr is filled with objectionable material as well? Or does it not matter for some reason, or is it not subject to the same rules?
  • Reply 4 of 11
    slurpy Posts: 5,384 member

    Quote:

    Originally Posted by crisss1205 View Post

    Yes, Google also enforces a similar policy, and apps have been removed from Google Play for having sexual content.

    But no one gives a shit about that, and I never, ever see headlines being plastered around when this happens. That's the power of anti-Apple article hit-whoring.

  • Reply 5 of 11
    crisss1205 wrote: »
    Yes, Google also enforces a similar policy, and apps have been removed from Google Play for having sexual content.

    Ah, OK, good to hear. Thanks.

    I am really confused. Why hasn't Yahoo's Flickr been mentioned in any of this? Is anyone aware that Flickr is filled with objectionable material as well? Or does it not matter for some reason, or is it not subject to the same rules?

    Is it Flickr that has this material, or a native app? Otherwise they'd need to ban their own Safari as well. Look, the internet is filled with it, and smartphones are internet-connected devices. I'm sure Apple simply does what it can.
  • Reply 6 of 11

    Quote:

    Originally Posted by Slurpy View Post

    But no one gives a shit about that, and I never, ever see headlines being plastered around when this happens. That's the power of anti-Apple article hit-whoring.

    Damn straight. IT'S ONLY EVIL WHEN APPLE DOES IT.

  • Reply 7 of 11


    Soon, any app that has access to the web will be "17+".

    Bigotry has new days of glory ahead...

    The fact that Google does it too shows that it's not Apple being evil, it's society being focused on the wrong (in my opinion) issues.

    I remember reading somewhere on this forum "over a million gun-related deaths over 30 years in the US". That, to me, seems like a very good reason to ban gun-related imagery from any app WAY BEFORE removing sex-related imagery.

  • Reply 8 of 11
    realistic Posts: 1,154 member

    Quote:

    Originally Posted by lightknight View Post

    Soon, any app that has access to the web will be "17+".

    Bigotry has new days of glory ahead...

    The fact that Google does it too shows that it's not Apple being evil, it's society being focused on the wrong (in my opinion) issues.

    I remember reading somewhere on this forum "over a million gun-related deaths over 30 years in the US". That, to me, seems like a very good reason to ban gun-related imagery from any app WAY BEFORE removing sex-related imagery.

    The yearly death rate in car accidents has averaged 43,900 since 1970, so roughly 1,317,000 people were killed in car accidents over that same 30-year span. Using your logic, car-related images should be banned from apps as well.

  • Reply 9 of 11
    "The yearly death rate in car accidents has averaged 43,900 since 1970, so roughly 1,317,000 people have been killed in car accidents in that same 30 year span. Using your logic then car related images should be banned from apps as well"

    What you failed to mention is that people don't drive with the purpose of hurting and killing others. So your point and your logic are moot.

    T.
  • Reply 10 of 11
    Marvin Posts: 15,322 moderator
    The fact that Google does it too shows that it's not Apple being evil, it's society being focused on the wrong (in my opinion) issues.
    I remember reading somewhere on this forum "over a million gun-related deaths over 30 years in the US". That, to me, seems like a very good reason to ban gun-related imagery from any app WAY BEFORE removing sex-related imagery.

    The obvious difference being that gun-related imagery doesn't give you a weapon, with sex-related imagery you already have the weapon, the image just gives you a reason to play with it.

    Violent imagery is censored too, though. You don't see executions broadcast on national TV. Fox accidentally broadcast a suicide last year and later apologized for it.
    While we could all just draw the line at allowing nude imagery, there's a very short step from nude to softcore/hardcore. A beautiful office worker could wander around naked no problem, it's natural. But then someone drops a pen and it's softcore. Maybe she has an itch somewhere intimate and now it's hardcore. I think there are four main levels:

    - clothed
    - tasteful nude
    - softcore
    - hardcore

    Most retailers will aim to be in the fully clothed safe zone. The lingerie shops and magazines sit somewhere between clothed and tasteful. Photography services won't want to be too restrictive on artistic freedom, and they seem to tolerate softcore, but Apple probably wants to stop at tasteful nude. Where the line falls between tasteful and softcore depends on how conservative you are, but retailers will always want to play it safe.

    500px is definitely softcore. If a photo shows only a woman's genitals with her underwear pulled down, that's softcore porn, and there are a few images like that. But they don't comprise the whole service, and providing softcore porn is not the service's purpose, so it makes sense to allow the app but ensure it's restricted; the same goes for similar apps.

    In the case of softcore/hardcore porn, while age restrictions could allow Apple to offer it and society should just be expected not to criticize them for doing it, sex will always have a certain reverence because it's so personal. Nobody would feel embarrassed shooting a gun at a firing range, but having a personal moment at a fun park, that's just wrong. Hot but wrong. Some kid has to get on there after her.

    That's really the big problem: kids. Obviously, if the world were full of adults, then putting this stuff out in the open would be fine, but as Steve Jobs said, "you'll understand when you have kids." It's a phrase many people who don't have kids and don't want kids hate to hear. Why should everyone else be put out just because some selfish individuals choose to have kids and expect the world to accommodate them, right? Well, we were all kids once, and while I actively sought out adult imagery from a young age, it was mortifying to see it in the presence of parents, because you get turned on by it. Who wants to be turned on with their parents in the room? And it's not right that little kids are subjected to it, even accidentally.

    The restrictions, as people well know, aren't enough, because when you are a parent you just can't control everything. Information comes from so many places that companies (which employ many parents) just take steps to make it a little easier on them by default, and I think they appreciate it. Others (including some parents) would appreciate the option to have more adult apps, like an app with videos of sex positions that can be used on an iPad in bed, or a map of all the erogenous zones (third-party maps should be OK). But Apple would have to compromise either security or its decision to keep the store clean, and most of these kinds of apps can be done as webapps anyway.

    Maybe Apple just needs to allow webapps a bit more functionality, like streaming DRM video.
  • Reply 11 of 11
    mactoid Posts: 112 member


    Isn't adult content the REASON for using the Tumblr app?
