tylersdad

About

Username: tylersdad
Joined
Visits: 58
Last Active
Roles: member
Points: 2,020
Badges: 2
Posts: 310
  • Apple employees express concern over new child safety tools

    The folks working in privacy and information security were probably told to dummy up. 
  • New iOS 15 and iPadOS 15 developer tool aggressively prioritizes 5G over Wi-Fi

    We have 5G towers all over the place here, but 5G is still pretty flaky. It’s bad enough that I turned it off. With 5G on, my phone rarely worked. With 5G off, it rarely fails. 
  • Apple privacy head explains privacy protections of CSAM detection system

    jdw said:
    After reading all the debate, I have not yet been convinced it is a good idea.  And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...

    What about the mental state of all those Apple employees forced to look at kiddie porn all day long when it comes time for Apple to check all those flagged photos?

    Seriously.  I've read a lot about Facebook moderators who have to look at horrible posts all day long and have that take a mental toll on them.  Imagine how much worse kiddie porn would be.  And yet, Apple has a human review process.  That means your job will be to look at that stuff frequently.  I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process was a child predator getting their jollies from the entire review process!

    I think these are legitimate concerns that few are talking about in this debate.  We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.
    Good lord. What a great point! I didn’t even consider this. 
  • Apple privacy head explains privacy protections of CSAM detection system

    nizzard said:
    Also important to note here: Apple doesn't claim this can't be expanded or abused by governments.  They say they will resist.  They won't have a f*cking choice once it's built.
    Exactly right. If the Chinese government gives Apple the choice of implementing this or getting kicked out of the Chinese market, does anybody really think they’ll give up the Chinese market? Hell no!
  • Apple privacy head explains privacy protections of CSAM detection system

    mknelson said:
    First - they can't put in a generic hash of an AR-15, Snowden, Trump supporters, or gay people. It would have to be a hash of your exact picture or one that is mathematically so close as to be indistinguishable. The hashes are being provided by an independent organization.

    "SHOW ME YOUR WARRANT". Why would they need a warrant? If you put your "stash" in my house, I can search it all I want. It's IN MY HOUSE! And Apple isn't even searching it in iCloud - the hashes are run on your device.

    And yes, Apple totally mishandled this. This is a bad solution.
    I’m not sure I’m clear on this warrant thing. Are you suggesting Apple has the right to scan the content on the device that I paid for?