OutdoorAppDeveloper
About
- Banned
- Username: OutdoorAppDeveloper
- Joined:
- Visits: 86
- Last Active:
- Roles: member
- Points: 1,898
- Badges: 1
- Posts: 1,293
Reactions
Apple removes 'iDOS 2' from iOS App Store
Always remember: this is all about restricting what YOU can do with the computing devices YOU own. It is not about security. It is not about safety. It is about Apple's power over what users are allowed to do with the devices it sells. You should be angry. You should demand that Apple butt out of your business. Instead, most of you will continue fawning over Apple, never realizing how much more awesome your iOS devices could be if you had the power instead of Apple. Downloaded any good third-party watch faces recently? No? Me neither. Now why is that?
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
This is what is really going on:
- Apple sold itself as the data-privacy company.
- This angered governments, who feel they should be able to snoop on anyone's data.
- Those governments argued that refusing access to private data enables people to harm children by sharing explicit images of them.
- Apple knew it would lose the political argument and would eventually be legally forced to introduce back doors into its encryption.
- Apple came up with a plan to scan iCloud data itself for illegal photos, using a government-supplied pattern-matching database (a toy sketch of this matching flow follows below).
- If the plan works, Apple could then implement strong encryption on iCloud, with the pattern matching operating on the encrypted data.
Unfortunately, it appears that Apple has opened a big can of worms and is now on a slippery, sloping razor blade into a pool of lemon juice (how's that for hyperbole?).
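Here is a minimal sketch of the matching-and-threshold flow described above, assuming a plain hash-set lookup plus a reporting threshold (Apple reportedly set the initial threshold at roughly 30 matching images). The real design uses a perceptual hash (NeuralHash) and cryptographic blinding so the device never learns which photos matched; that machinery is omitted here, and every name below is hypothetical:

```python
# Toy sketch: hash-set matching with a human-review threshold.
# All names and values are hypothetical stand-ins.
from typing import Iterable, Set

REPORT_THRESHOLD = 30  # reported matches required before human review

def count_matches(photo_hashes: Iterable[int], csam_db: Set[int]) -> int:
    """Count how many of a library's photo hashes appear in the database."""
    return sum(1 for h in photo_hashes if h in csam_db)

def flag_for_review(photo_hashes: Iterable[int], csam_db: Set[int]) -> bool:
    """An account is surfaced to human reviewers only past the threshold."""
    return count_matches(photo_hashes, csam_db) >= REPORT_THRESHOLD

# A library with a single accidental hit stays far below the threshold.
csam_db = {0xDEADBEEF, 0xCAFEF00D}        # stand-in for the hash database
library = [0x1111, 0x2222, 0xDEADBEEF]    # stand-in for a user's photos
print(flag_for_review(library, csam_db))  # False: 1 match < 30
```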
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
Mike Wuerthele said: Re: 1 and 2, write a proof of concept and send it to Apple. Maybe they'll give you a bug/vulnerability reward. The CSAM hash database is not publicly available.
I'm not sure how 3 is an argument against this.
4. This system is only in the US for now. I'm sure its legality will be assessed before it is rolled out in other countries. As for arrest in other countries: you are subject to the laws of whatever country/state you travel in, regardless of your citizenship. Given how the system works, it would be hard to hash "people hugging," since photos are not judged on their visual content beyond matching the CSAM hash database.
5. I'm glad you're keeping an open mind, and aren't resorting to hyperbole.
If you don't know why workers getting PTSD is an argument against this, then you have no compassion. Are you by any chance a narcissist?
That's the point. We have no idea how this will be rolled out in other countries but we can be certain it will be. The other countries will simply demand that Apple provide them with the same access to personal data provided in the USA. As to how the algorithm works, algorithms change all the time. The dumb pattern matching that Apple is using now could be replaced with an AI trained to recognize certain types of photos.
Are you keeping an open mind when it comes to Apple? You are posting all this Apple information verbatim, without any editorial comment on the extreme danger of a program like this. You posted all those glowing articles about how great Apple is for promoting privacy (and you were right), so why not be extremely critical, or at least extremely skeptical, of a plan to scan our photos without our permission, then have humans look at our photos without our permission, and then send our photos to law enforcement without our permission?
Edit: That didn't take long: Apple now says that this system will be implemented in other countries on a country-by-country basis.
What you need to know: Apple's iCloud Photos and Messages child safety initiatives
Unintended consequences are guaranteed if Apple starts scanning all photos in iCloud. Here are a few possibilities:
- A trillion sounds like a big number, but modern GPUs perform trillions of operations a second. That means someone with the CSAM database of photos could craft false photos that fool the pattern-recognition system. They could even modify your existing photos to push them past Apple's threshold and get them looked at by humans (see the collision-search sketch after this list).
- There are a very large number of iOS and macOS apps with access to your photos. We know this because those apps have to ask for permission to access them, and they can view, modify, and save photos. We also know there are plenty of very sketchy apps on the App Store. What would prevent an app from modifying your photos to trigger the CSAM pattern recognition, or worse, actually writing CSAM photos into your photo library, from which they would automatically be copied to iCloud?
- The humans reviewing flagged photos will be exposed to them. We know from content moderators at Facebook that they suffer post-traumatic stress disorder from constant exposure to hateful and disturbing content. How will Apple prevent the same thing from happening to its workers? We know that Apple treats its contractors very poorly compared to its full-time employees.
- Why did Apple state repeatedly that people deserve privacy and then create a program that is the opposite of privacy? Your photos are scanned by an algorithm that no one outside of Apple has checked for errors. Humans will review the photos the algorithm flags as illegal. The entire contents of your iCloud storage will be sent to law enforcement if those humans decide it is illegal. What is legal and illegal varies from country to country, and Apple is an international company operating all over the planet. If you travel to a different country, or your iCloud photos are stored on a server in another country, what stops that country from arresting you on arrival because you had photos of people hugging, or of women with uncovered faces driving cars?
- Why did Apple return to 1984?
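To make the first bullet concrete, here is a toy collision search against a simple 64-bit "average hash." This is NOT Apple's NeuralHash, and every name here is hypothetical; a real perceptual hash needs smarter (e.g., gradient-based) search, but the attack shape is the same once an attacker can evaluate the hash locally:

```python
# Toy second-preimage search: perturb a "benign" image until its
# perceptual hash matches a target hash. Hypothetical stand-ins only.
import random

def average_hash(pixels):
    """64-bit perceptual hash: bit i is 1 iff pixel i of an 8x8
    grayscale image is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

random.seed(1)
target = [random.randrange(256) for _ in range(64)]  # stand-in flagged image
benign = [random.randrange(256) for _ in range(64)]  # attacker's cover image
target_hash = average_hash(target)

# Greedy hill climb: push one pixel at a time toward the extreme that
# matches the target's bit, keeping any change that does not increase
# the hash distance. On this toy hash it typically reaches distance 0
# in well under 10,000 proposals.
best = hamming(average_hash(benign), target_hash)
for step in range(10_000):
    if best == 0:
        break
    i = random.randrange(64)
    old = benign[i]
    benign[i] = 255 if (target_hash >> (63 - i)) & 1 else 0
    d = hamming(average_hash(benign), target_hash)
    if d <= best:
        best = d
    else:
        benign[i] = old  # revert changes that make things worse

print(f"hash distance after search: {best}")  # typically 0
print(average_hash(benign) == target_hash)    # forged match
```

A GPU running trillions of operations per second can afford vastly more hash evaluations than this loop needs, which is why "one in a trillion" odds stop meaning much once the inputs are adversarial rather than random.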
Apple has contributed over $1 billion to California affordable housing