bulk001

About
- Username: bulk001
- Joined:
- Visits: 451
- Last Active:
- Roles: member
- Points: 1,256
- Badges: 1
- Posts: 828

Reactions
WhatsApp latest to pile on Apple over Child Safety tools
What nefarious purposes? The government is tracking you to Target? You have a subscription to a legal porn site? You are having an affair? Nobody cares! They already have access to all your financial information, and to your phone metadata. The ones who would use this against you can get at it by simply arresting you and forcing you to hand over the data, or by hacking into it, as Saudi Arabia is doing to murder dissidents.
Open letter asks Apple not to implement Child Safety measures
darkvader said:
crowley said:
iadlib said: This is admirable and I like the intention behind it. But to play devil's advocate, I'll bring up that this technique could, maybe, possibly, be adapted for other kinds of data: say, emails or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked by a hacker, or a government, to search for anti-state speech in China, or even used for industrial espionage? This is a Pandora's box that could never be shut.
It's a private method for comparing an image hash against a specific database of image hashes (a simplified sketch of that kind of check follows below).
No indication that it'll become a generic API, or a feature for anyone else to use.
No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
No indication that it couldn't be shut down any time Apple wanted to shut it down.

It's not FUD. There is no indication that it can't be used for scanning any other type of data, and in fact it's obvious that it absolutely can be. Saying things like "oh, but they're just going after evil kiddy porn, think of the children" is a wilful misunderstanding of the technology. This is incredibly dangerous, and it should absolutely be scrapped before it's ever implemented.
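For context, here is a minimal sketch of the kind of "compare an image's hash against a specific database of hashes" check crowley describes, using an ordinary SHA-256 set lookup purely as an illustration. Apple's actual system uses a perceptual hash (NeuralHash) with a blinded on-device matching protocol, so everything below is a simplification, and the sample database contents are made up.

```python
import hashlib

def image_matches_database(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Hash the image and test membership in the known-hash database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# Hypothetical database: for the demo it contains the hash of one sample blob.
sample = b"example image bytes"
known_hashes = {hashlib.sha256(sample).hexdigest()}

print(image_matches_database(sample, known_hashes))          # True: in the database
print(image_matches_database(b"other image", known_hashes))  # False: no match
```

Note that an exact cryptographic hash like SHA-256 changes completely on any single-pixel edit, which is why a real system uses a perceptual hash instead; that difference is also what the collision arguments later in this thread are about.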
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
dewme said:
thrang said: My take is this.
Apple saw the writing on the wall regarding government-mandated back doors, doors that would in all likelihood be far more wide open, with unfettered access to much if not all of your information. Perhaps, behind the scenes, the pressure was growing immense.
Perhaps they decided to develop a concept and technology around very narrow, focused one-way beacons (to avoid a mandated back door), initially to identify illegal and abhorrent possession and behavior. Perhaps evidence of murders, rapes, extortion, or terrorism may trigger other beacons in the future.
I know, there will be discussions over who makes the decisions, how it is vetted, errors, misuse, hacking, etc. But essentially, to me, Apple is seeking to control the process around what it sees as inevitable laws it would need to comply with, laws with much worse outcomes for users. One-way beacons that must be further vetted to rule out false positives are an effort to ameliorate law-enforcement concerns while still protecting legitimately private information, and that is perhaps a very good approach.
While it feels icky upon first hearing about it, the more you think about it this way (and about what government-enforced alternatives might look like), the more it may begin to make sense.
In terms of what defines when these future beacons will be sent out, Apple will likely ask for pertinent laws to govern such beaconing, leaving it up to elected leaders to clarify/legislate/vote on what kind of content is considered severely harmful and how it is to be legally obtained, and ultimately leaving it up to voters to support or oust those politicians in future elections. So in this case, there is a well-defined hash database for this particular category. Apple then implements an on-device methodology that is designed to keep the rest of your data protected and unavailable for unrelated sniffing about, while beaconing when there is a match.
As for other categories of hash matching, governments will need to figure that out, which would be subject to immense scrutiny and public debate, I'm sure...
There are caveats of course, but in principle this is how I see what has happened.

I think it'll buy Apple some time, but not much. As more and more countries, including the United States, continue to radicalize against not only remote outsiders but also fellow citizens whom they now consider outsiders because they don't embrace reality-bending authoritarian despots, the requests for back doors will transform into demands for back doors that cannot be denied. I'm very much in favor of what Apple is proposing, but I'm equally concerned that what they are proposing will not be enough to keep the bigger issue of massive government intrusion through mandated back doors at bay. At some point we'll all have to assume that privacy as we used to know it no longer exists. Nothing Apple is doing will change the eventual outcome if the embrace of authoritarianism and the demonization of fellow citizens are allowed to grow.
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
darkvader said: What you REALLY need to know: Apple's iCloud Photos and Messages child safety initiatives
1. Spying on your customers is EVIL, even when it's done for ostensibly noble purposes.
2. If this technology is implemented, it WILL be used for purposes other than what it was originally intended to do.
3. This can easily be used to target people, by both state and non-state actors, simply by sending your target texts or emails with matchable images.
4. This WILL be used by authoritarian governments for things other than its original design purpose.
5. It must be stopped.

What I'd propose, if Apple continues down this dark path, is that the defeat is going to come from overwhelming the system. The hash database exists, therefore it's going to be obtainable. What needs to happen is the creation of a large number of image files that, while being innocuous cat memes or even just line noise, will match the hashes (a toy sketch of why such collisions are findable appears below). Then those harmless images should be so widely distributed that systems like this are utterly overwhelmed with false positives, making them useless.
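To make the collision claim concrete, here is a toy sketch of forging a "second preimage" by hill-climbing against a deliberately weak 64-bit average hash. This is not Apple's NeuralHash; the hash function, image size, and search loop are all illustrative assumptions, and the point is only that weak perceptual hashes admit cheap collisions.

```python
import numpy as np

HASH_SIZE = 8   # 8x8 grid of blocks -> 64-bit toy hash
BLOCK = 8       # image is 64x64, so each hash bit covers an 8x8 pixel block

def average_hash(img: np.ndarray) -> np.ndarray:
    """Toy perceptual hash: mean of each block, thresholded at the global mean."""
    blocks = img.reshape(HASH_SIZE, BLOCK, HASH_SIZE, BLOCK).mean(axis=(1, 3))
    return (blocks > blocks.mean()).ravel()

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(0)
target = rng.integers(0, 2, HASH_SIZE * HASH_SIZE).astype(bool)  # stand-in "database" hash

img = rng.uniform(0.0, 255.0, (64, 64))  # start from pure line noise
dist = hamming(average_hash(img), target)

for _ in range(200_000):
    if dist == 0:
        break
    cand = img.copy()
    by, bx = rng.integers(0, HASH_SIZE, size=2)  # nudge one block's brightness
    cand[by*BLOCK:(by+1)*BLOCK, bx*BLOCK:(bx+1)*BLOCK] += rng.normal(0.0, 30.0)
    cand = np.clip(cand, 0.0, 255.0)
    d = hamming(average_hash(cand), target)
    if d <= dist:  # accept sideways moves to slide across plateaus
        img, dist = cand, d

print(f"Hamming distance to the target hash after search: {dist}")
```

Against a real system this is much harder: NeuralHash is a neural network, and the collisions researchers reported forging for it in 2021 used gradient-based attacks rather than this kind of random search. That difficulty is part of what bulk001 pushes back on in the reply that follows.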
and there are many more, no doubt, out there) this is nonsense. I am sure that democratic governments can too. In China and Saudi Arabia they are not going to politely ask you for permission to access your phone or iCloud data and, if you decline, let you go with an apology! Child pornographers would be thrilled to have the database corrupted, and your suggestion helps them and no one else.
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
Mike Wuerthele said:
elijahg said:
Mike Wuerthele said:
elijahg said: Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts; it's 1 per 1 trillion photos. I have 20,000 photos, which brings the chance that I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
And even if it were, one in 50 million is still pretty spectacularly long odds against.
Also, it's massively more likely that someone will get their password phished than that a hash collision will occur; probably 15-20% of people I know have been "hacked" through phishing. All it takes is a couple of photos to be planted, dated a few years back so they aren't at the forefront of someone's library, and someone's in very hot water. You claim someone could defend against this in court, but I fail to understand how? "I don't know how they got there" isn't going to wash with too many people. And unfortunately, "good security practices" are practised only by the likes of us anyway; most people use the same password, with their date of birth or something equally insecure, for everything.
One in a trillion tried a trillion times does not guarantee a match, although, as you're saying, a match is likely; there may even be two or three (the arithmetic is sketched below). You're welcome to believe what you want, and you can research it with statisticians if you are so inclined. This is the last I will address this point here.
And in regards to the false positive: somebody will look at the image and say something like, "Oh, this is a palm tree. It just coincidentally collides with the hash. All good. Story over."
In regards to your latter point, this is addressed in the article.
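Both halves of this exchange check out numerically. Here is a quick sketch of the arithmetic, assuming (as the thread does, and as is itself contested) that the advertised 1-in-a-trillion figure is an independent per-photo false-positive rate:

```python
from math import expm1, log1p

p = 1e-12  # advertised per-photo false-positive rate, taken at face value

# elijahg's library: chance that at least one of 20,000 photos is falsely flagged.
n = 20_000
p_lib = -expm1(n * log1p(-p))  # 1 - (1 - p)^n, computed without underflow
print(f"P(>=1 false flag in {n:,} photos) = 1 in {1 / p_lib:,.0f}")  # ~1 in 50 million

# Mike's point: a trillion trials at 1-in-a-trillion makes a match likely, not certain.
n = 10**12
p_any = -expm1(n * log1p(-p))  # approaches 1 - 1/e, about 63%
print(f"P(>=1 match in a trillion trials) = {p_any:.0%}")
```

So elijahg's 1-in-50-million figure for a 20,000-photo library follows directly from the per-photo rate, and Mike is right that even a trillion trials leave roughly a one-in-three chance of no match at all.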