Surveillance app stored teens' Apple IDs on unprotected servers

Posted in iPhone, edited May 2018
An app meant to let parents monitor the phone activity of teenagers was saving those teens' Apple ID passwords in unprotected plaintext, a report revealed on Sunday.

The information collected by TeenSafe was hosted on Amazon servers, and also included device identifiers and the email addresses of parents, ZDNet said, crediting the discovery to U.K. researcher Robert Wiggins. Those servers have been temporarily pulled offline, and a TeenSafe representative stated that the company has begun notifying anyone who might be impacted.

At least 10,200 records from the past three months contained customer data, though some were duplicates.

TeenSafe markets itself as a secure, encrypted way for parents to track call, Web, and location histories, as well as read text messages, even deleted ones.

Using the app to track a teen's iPhone requires two-factor authentication to be turned off on the account, however, which means that any attacker who discovered the plaintext passwords could hijack a teen's Apple ID and view private content.

It's not known if any malicious attacks have been launched, but some of the affected customers had already changed their account data prior to being alerted.

Comments

  • Reply 1 of 26
    maestro64 Posts: 5,043 member
    What a stupid company, I am assuming they were charging for this service.
  • Reply 2 of 26
    sflocal Posts: 6,092 member
    This is sloppy, irresponsible work. There should be consequences when companies put their users' private info out for the world to see like this.
  • Reply 3 of 26
    sflocal said:
    This is sloppy, irresponsible work. There should be consequences when companies put their users' private info out for the world to see like this.
    Yes, let's end Google and Amazon; they sell all that data, legally, and everyone thinks it's OK.....
  • Reply 4 of 26
    cornchip Posts: 1,945 member
    What? Fukien idiots! 
  • Reply 5 of 26
    Rayz2016 Posts: 6,957 member
    Why pay for this crap? Just use Find My Friends.

    I use it so I know when Mrs Rayz2016 is approaching the house. Gives me plenty of time to clear up the liquor bottles and start mowing the lawn. 
  • Reply 6 of 26
    bestkeptsecret Posts: 4,265 member

    This is the reason why I hope Apple incorporates this into iOS directly. Using Family Sharing as a base, Apple could allow for more flexibility on the types of users.

    So if I set up my wife as a parent, she will have all her privacy and I will not be able to monitor her. If I set up my kids as children, then it should provide some kind of monitoring ability.

    I am not the kind of parent who would want to read all their kids' messages and emails. I believe in giving them privacy, but if I need to opt for a way to make sure they are not being lured away by something, I'd rather have a stock solution than a third-party one with which I have to share the Apple ID and password.


    I've recently reviewed a few of the monitoring and screen time apps, but I've decided against using most of them since they use their own authentication and cloud servers. I have a hard time trusting them with my information.


  • Reply 7 of 26
    Rayz2016 Posts: 6,957 member

    bestkeptsecret said:
    I've recently reviewed a few of the monitoring and screen time apps, but I've decided against using most of them since they use their own authentication and cloud servers. I have a hard time trusting them with my information.


    And this is precisely why. 

    I see that DayOne (the journaling app), which removed support for iCloud so it could host journals on its own servers (for an extortionate price, in my opinion), recently had its sync service fail.

    They restored backups, got the service running again, and almost immediately began receiving reports that users were seeing entries from other people's journals.

    http://help.dayoneapp.com/day-one-sync/may-2018-day-one-outage-postmortem

    Actually, @StrangeDays might be interested in the explanation as to how this happened. 
  • Reply 8 of 26
    Rayz2016 Posts: 6,957 member
    And I notice that DayOne isn’t using https for its help pages. 

    They clearly don’t have a clue. 


  • Reply 9 of 26
    IreneW Posts: 303 member
    How could they get these passwords in the first place? Isn't all authentication supposed to go via Apple?
  • Reply 10 of 26
    GeorgeBMac Posts: 11,421 member

    bestkeptsecret said:
    This is the reason why I hope Apple incorporates this into iOS directly. Using Family Sharing as a base, Apple could allow for more flexibility on the types of users.

    So if I set up my wife as a parent, she will have all her privacy and I will not be able to monitor her. If I set up my kids as children, then it should provide some kind of monitoring ability.

    I am not the kind of parent who would want to read all their kids' messages and emails. I believe in giving them privacy, but if I need to opt for a way to make sure they are not being lured away by something, I'd rather have a stock solution than a third-party one with which I have to share the Apple ID and password.


    I've recently reviewed a few of the monitoring and screen time apps, but I've decided against using most of them since they use their own authentication and cloud servers. I have a hard time trusting them with my information.


    I agree... 
    This problem was created when third parties started filling a void left by Apple.

    Smartphones and related technology have introduced a new wrinkle into parenting that didn't exist before. Parents have always known that their kids needed to choose their friends wisely and avoid the "bad crowd". But smartphones and the web have opened things up beyond the local neighborhood, and a kid's world now includes the entire world -- the good as well as the bad.

    Very simply, your kid could be accessing an ISIS bomb-making site or neo-Nazi propaganda as easily as checking the weather. In addition, as confirmed by the incoming president of the NRA, kids enjoy a culture of unmitigated personal violence through TV, movies, and video games that is propagating actual, real-life physical violence.

    Just as parents have always monitored the friends their child kept, today they need to be aware of the wide variety of online "friends" they are keeping.

    So, what tools has Apple provided to parents to monitor and regulate their kids' online activities? The bare, ineffective minimum left over from 30-year-old technology.

    Previously, Apple hinted that it will introduce enhanced parental controls -- perhaps with iOS 12. One can only hope....
  • Reply 11 of 26
    nunzy Posts: 662 member
    They sell mostly Android software. But everyone picks on Apple.
  • Reply 12 of 26
    dewme Posts: 5,332 member
    Very sloppy and irresponsible practices by the TeenSafe developer. Requiring the disabling of two-factor authentication should have been a red flag for users.

    I'd really like to see Apple incorporate additional sandbox-like testing in its app approval process to detect whether apps submitted for approval "leak" any Apple-attributed private data as part of their data storage and retrieval services, like sending Apple ID credentials in cleartext over any outbound communication connection. I see no reason why Apple cannot set up a test environment with something akin to Wireshark to monitor all outbound data traffic and see whether anything Apple cares about (like Apple IDs) is leaving the device/machine boundary in unencrypted form.

    Apps that leak any Apple-attributed data should not be approved, and apps that send unencrypted non-Apple-attributed data, i.e., private data owned by the app itself, should be granted approval but only with a Big Red Flag in the App Store that warns users that the app sends its private data in unencrypted form over the internet. Users can then decide whether they want to trust the app with the data it manages.

    What I'm not asking for is for Apple to monitor communication transactions on the device to detect leakage. I'm only asking Apple to establish a test environment (with test IDs and data) and use external analyzers to detect leakage on the wire.
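    A minimal sketch of what that on-the-wire check could look like, assuming a seeded test Apple ID and a hypothetical capture file ("outbound_capture.txt") written out by the external analyzer; the credential strings and file name here are made up for illustration:

```swift
import Foundation

// Sketch only: scan a hypothetical dump of captured outbound traffic for a
// seeded test Apple ID, which should never show up anywhere in cleartext.
let seededAppleID = "reviewer-test@example.com"    // hypothetical seeded credential
let seededPassword = "Test-Passw0rd!"              // hypothetical seeded credential
let captureURL = URL(fileURLWithPath: "outbound_capture.txt")  // hypothetical capture dump

do {
    let capture = try String(contentsOf: captureURL, encoding: .utf8)
    let hits = capture
        .components(separatedBy: .newlines)
        .enumerated()
        .filter { $0.element.contains(seededAppleID) || $0.element.contains(seededPassword) }

    if hits.isEmpty {
        print("No seeded credentials found in cleartext; the app passes this check.")
    } else {
        for (line, payload) in hits {
            print("Cleartext credential at capture line \(line + 1): \(payload.prefix(80))")
        }
    }
} catch {
    print("Could not read capture file: \(error)")
}
```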
  • Reply 13 of 26
    gatorguy Posts: 24,176 member
    nunzy said:
    They sell mostly Android software. But everyone picks on Apple.
    Well, that's not true at all. At $15/mo for monitoring, what Android user could afford it? You know full well that Android users are cheap and won't pay for anything. In fact, everyone knows that.
  • Reply 14 of 26
    macxpress Posts: 5,801 member
    This just seems like a useless service to me. There are things in place on iOS that already do similar things, Find My Friends being one of them, and you can also have the iPhone set to send its last known location if the phone powers down for some reason. Also, I know on Verizon the bill gives a list of every call received and placed for a specific line. I'm not a parent, but if I were, I'd want to give my child a little privacy and not play Big Brother with every single aspect of their life. I can't stand parents who want to be in every single aspect of their child's life. We can't hold their hand and keep them from doing every single little thing. Let them get into trouble, make mistakes. This is how they learn. You did it growing up, and so did I. We're sending our children into the world being hand-held all the way up through life, and then we all of a sudden let go, and some don't know how to handle themselves in this world.

    **gets off soapbox**
  • Reply 15 of 26
    gatorguy Posts: 24,176 member
    macxpress said: I'm not a parent...
    All that needs said.

    Today is not the same as when we were young (if you're older than 40), but I can't tell you where it's gone wrong. I don't know. Maybe we're just trying too hard not to be our parents.
  • Reply 16 of 26
    arlomedia Posts: 271 member
    dewme said:
    I'd really like to see Apple incorporate additional sandbox-like testing in their App approval process to detect whether apps submitted for approval "leak" any Apple product attributed private data as part of their data storage and retrieval services, like sending Apple Id credentials in cleartext over any outbound communication connection.
    Apple already requires that communication between an app and outside services use HTTPS; the network functions in the SDK don't work otherwise. But Apple has no way to control what developers do with the data afterwards, which was the problem in this case.

    Anyway, I'm not sure why a developer would need to collect the Apple ID and password, or save it on an outside server. I've never used iCloud storage in my apps, but I assume the SDK does the authentication from the info saved in the device settings, so developers don't have to touch the login info. But parental controls apps have to rely on various odd workarounds to do what they do.
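    As a rough illustration of that assumption: with CloudKit, the system authenticates against whatever iCloud account is already signed in under Settings, so an app can check availability without ever touching the Apple ID or password (the messages below are just placeholders):

```swift
import CloudKit

// The SDK handles iCloud authentication itself; the app only asks whether an
// account is available and never sees (or stores) the Apple ID password.
CKContainer.default().accountStatus { status, error in
    switch status {
    case .available:
        print("iCloud account available; CloudKit requests are authenticated by the system.")
    case .noAccount:
        print("No iCloud account is signed in on this device.")
    case .restricted, .couldNotDetermine:
        print("iCloud access restricted or undetermined: \(error?.localizedDescription ?? "n/a")")
    @unknown default:
        print("Unhandled account status.")
    }
}
```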
  • Reply 17 of 26
    GeorgeBMac Posts: 11,421 member
    gatorguy said:
    macxpress said: I'm not a parent...
    All that needs said.

    Today is not the same as when we were young (if you're older than 40), but I can't tell you where it's gone wrong. I don't know. Maybe we're just trying too hard not to be our parents.
    Yeh, some things have gone wrong.  Other things are just different....

    One that's different is: when we were kids, we grew up and associated with neighborhood kids. Everyone pretty much knew who the bad ones were (although, by today's standards, they generally weren't all that bad!). But today, kids can easily be associating with pretty much anybody and anything in the world -- from the Dalai Lama to paedophiles. And, often it's pretty difficult to tell saints from sinners...

    But, it's not just "the bad guys" out there that parents must worry about:
    We have learned from Cambridge Analytica how information can be distorted and twisted to one's purpose in order to shape ideas and perceptions.

    Today, even the new president of the NRA is warning about too much violence in the media and its effects on kids...

    Times change and parenting has to change. Letting kids roam the world freely, unaccompanied by an adult, is probably misguided.

    To paraphrase an old commercial:   "Do you know where your kids have been today?" 
    (And, today you don't have to physically be there to be there!)
  • Reply 18 of 26
    boltsfan17 Posts: 2,294 member
    gatorguy said:
    macxpress said: I'm not a parent...
    All that needs said.

    Today is not the same as when we were young (if you're older than 40), but I can't tell you where it's gone wrong. I don't know. Maybe we're just trying too hard not to be our parents.
    The biggest issue now, in my opinion, is the lack of parenting in today's society. Social media hasn't helped either.
  • Reply 19 of 26
    nunzy Posts: 662 member
    gatorguy said:
    macxpress said: I'm not a parent...
    All that needs said.

    Today is not the same as when we were young (if you're older than 40), but I can't tell you where it's gone wrong. I don't know. Maybe we're just trying too hard not to be our parents.
    boltsfan17 said:
    The biggest issue now, in my opinion, is the lack of parenting in today's society. Social media hasn't helped either.
    Apple could help. They could develop a ParentKit.
  • Reply 20 of 26
    cgWerks Posts: 2,952 member
    arlomedia said:
    Anyway, I'm not sure why a developer would need to collect the Apple ID and password, or save it on an outside server. I've never used iCloud storage in my apps, but I assume the SDK does the authentication from the info saved in the device settings, so developers don't have to touch the login info. But parental controls apps have to rely on various odd workarounds to do what they do.
    Exactly. That was my question too. Why the heck would this app be collecting the Apple ID and password, unless it was one incredible hack (which it probably was)?

    Of course, iOS doesn't help anything by popping up dialogs asking for one's Apple ID/password all over the place. It trains users to expect to be asked for it, so they might not even realize when they are giving it to Apple vs. some poorly written app vs. being phished.

    GeorgeBMac said:
    One that's different is: when we were kids, we grew up and associated with neighborhood kids. Everyone pretty much knew who the bad ones were (although, by today's standards, they generally weren't all that bad!). But today, kids can easily be associating with pretty much anybody and anything in the world -- from the Dalai Lama to paedophiles. And, often it's pretty difficult to tell saints from sinners...
    macxpress said:
    This just seems like a useless service to me. There are things in place on iOS that already do similar things, Find My Friends being one of them, and you can also have the iPhone set to send its last known location if the phone powers down for some reason. Also, I know on Verizon the bill gives a list of every call received and placed for a specific line. I'm not a parent, but if I were, I'd want to give my child a little privacy and not play Big Brother with every single aspect of their life. I can't stand parents who want to be in every single aspect of their child's life. We can't hold their hand and keep them from doing every single little thing. Let them get into trouble, make mistakes. This is how they learn. You did it growing up, and so did I. We're sending our children into the world being hand-held all the way up through life, and then we all of a sudden let go, and some don't know how to handle themselves in this world.

    **gets off soapbox**

    I agree a bit with both of these. There absolutely have to be boundaries and a certain amount of 'monitoring' so you know whether the other parenting work you are doing is working or not and can do some course-correction. At the same time, it's impossible to monitor everything, so if the foundations aren't in place, it's a hopeless game of whack-a-mole until they get old enough to just head down the 'bad path' on their own. Plus, if monitoring is done wrong, it just erodes the relationship/trust and encourages a likely more tech-savvy kid into a game of cat-and-mouse they will probably win.

    My own personal take on this is that it's a parent's right/duty to interfere or check on what the kids are doing at any time. Mine is pretty young yet, but he knows that if I hear something questionable, I'm having him show me what he was watching and we talk about it. It's also important for them to understand that with maturity (not necessarily age) comes privileges, and that something 'bad' for a kid might be not as bad, or fine, for an adult. For example, something that scares kiddo so he won't sleep might not scare mom/dad so they don't sleep.

    And, then whenever possible, lay a foundation that would make a kiddo think for themselves that something is inappropriate or should be questioned (and brought to an adult) on their own. Actually talk to them about these things and why it is inappropriate. That can lead into talking about the difference between things that are always inappropriate vs things that are age-appropriate.

    I'm sure it gets trickier as they get older, but I don't think there is any way to keep them in some kind of bubble. And, any efforts I've ever run into to do so tend to lead to worse outcomes. The options aren't..... 1. keep them in a bubble, or 2. let them run wild. Both of those tend to be disasters.

    GeorgeBMac said:
    But, it's not just "the bad guys" out there that parents must worry about:
    We have learned from Cambridge Analytica how information can be distorted and twisted to one's purpose in order to shape ideas and perceptions.
    Yes, and it's *really*, *really* NOT just Cambridge Analytica! It's everyone from our own governments (yes, there is an actual well-funded department of the US-gov't that works with Hollywood/media to get messages and content into our media, TV, movies, etc.), to well-meaning people with un-thought-through worldviews, to all sorts of intentionally malicious stuff.

    gatorguy said:
    I don't know. Maybe we're just trying too hard not to be our parents.
    Yeah, other-ditch-itis seems to be a recurring societal tendency.

    I have a story about the whole 'hands-off' parenting thing, too.

    When my wife was in seminary in the SF Bay area, we were at a birthday party for a friend. Another woman was there with her pre-teen daughter. The conversation turned to what my wife was doing and when the mom heard this, she said something like,
    "Yea, I should probably teach my daughter some of that stuff. We grew up Lutheran, but we figured we'd just let her [daughter] decide for herself what to believe. But, the other day we were in a jewelry store and she noticed a Catholic crucifix and asked what it was. I was shocked that she didn't seem to know anything about Christianity."

    I suppose the atheists out there might think that a positive thing, but to be utterly ignorant and unaware of what like 1/3 of the world believes (over 1/2 w/ Islam), much of the last couple millennia of history, the foundations of the societies in which we live, etc. can't be a good thing. And, it isn't like the girl would now be picking from among the options in a well-informed manner, either. She just gets parented by the world-views of the media, movies, classroom, and culture she happens to run into (no matter how right or wrong they might be).