tylersdad
About
- Username
- tylersdad
- Joined
- Visits
- 58
- Last Active
- Roles
- member
- Points
- 2,020
- Badges
- 2
- Posts
- 310
Reactions
Internal Apple memo addresses public concern over new child protection features
Beats said:
mknelson said:
Beats said: Some of this sounds like PR BS. I don’t see how this helps children like Tim claims. And Apple collaborating with the government is embarrassing.
From the article: “In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal…” -Tim Cook
Maybe I’m reading it wrong, but “legal” is a government entity. I would bet some loonies from Congress or the FBI twisted Apple’s arm to provide this.
Internal Apple memo addresses public concern over new child protection features
leehericks said:
Um, how do they look at the pixels if they don't examine the image? Do you have any idea how this technology even works? It doesn't appear so. To look at the pixels and compare them to pixels in reference images, they must open both images. If they are opening the image and reading the pixels, then they are examining personal pictures.

To explain this quickly: this scanning applies only (for now) to photos before they are uploaded to iCloud Photos. If you don’t use iCloud Photos, then this will never happen (for now).

Your device scans your photos and calculates a photo “fingerprint” ON YOUR DEVICE. It downloads fingerprints of bad materials and checks ON YOUR DEVICE. If by some chance you had a match, it would put a kind of safety token with the media while uploading it to iCloud Photos. If you reach some number of safety tokens, someone will be tasked with checking up on your media (presumably in a very secure way which logs everything they do during the check).

The big question is who creates/validates the media fingerprints that get compared. Example “what if” concerns include a government like China giving Apple fingerprints for anti-government images.
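To make the flow in that post concrete, here is a minimal Swift sketch of the steps as described there: a fingerprint computed on-device, checked against a downloaded set of known fingerprints, a "safety voucher" attached on a match, and human review only once a threshold is crossed. Everything in it (the toy hash, the threshold of 30, the type and function names) is an illustrative assumption, not Apple's actual NeuralHash or safety-voucher system.

```swift
import Foundation

// Hypothetical sketch of the on-device matching flow described above.
// Not Apple's real algorithm or cryptography; names and values are assumed.

typealias Fingerprint = UInt64

// Stand-in for the fingerprint database the device would download.
let knownFingerprints: Set<Fingerprint> = [0x1F3A_5C7E_9B2D_4F60, 0x8E21_44AA_0C5B_7D93]

// A "safety token" attached to an upload whose fingerprint matched.
struct SafetyVoucher {
    let photoID: String
    let fingerprint: Fingerprint
}

// Assumed number of matches before anything is surfaced for human review.
let reviewThreshold = 30

// Computed on-device for each photo before it is uploaded to iCloud Photos.
// A real system would use a perceptual hash that survives resizing and
// re-encoding; this djb2-style toy hash only keeps the example runnable.
func fingerprint(of data: Data) -> Fingerprint {
    data.reduce(UInt64(5381)) { hash, byte in
        (hash << 5) &+ hash &+ UInt64(byte)
    }
}

// Runs locally: match, attach a voucher if needed, then check the threshold.
func processUpload(photoID: String, data: Data, vouchers: inout [SafetyVoucher]) {
    let fp = fingerprint(of: data)
    if knownFingerprints.contains(fp) {
        vouchers.append(SafetyVoucher(photoID: photoID, fingerprint: fp))
    }
    if vouchers.count >= reviewThreshold {
        print("Threshold reached: account flagged for human review")
    } else {
        print("Uploaded \(photoID); vouchers so far: \(vouchers.count)")
    }
}

// Example usage with dummy photo data.
var vouchers: [SafetyVoucher] = []
processUpload(photoID: "IMG_0001", data: Data([0x10, 0x20, 0x30]), vouchers: &vouchers)
```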
Internal Apple memo addresses public concern over new child protection features
Wesley Hilliard said:
tylersdad said: It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?
Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."

Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology.
Internal Apple memo addresses public concern over new child protection features
mknelson said:
Beats said: Some of this sounds like PR BS. I don’t see how this helps children like Tim claims. And Apple collaborating with the government is embarrassing.
This is essentially a warrantless search of a person's personal property.
We've already seen that the government is colluding with social media to censor posts which they deem unfavorable to them. We've already seen the government asking people to rat out their family members and friends for what they perceive as "extremist" behavior. Is it really that much of a stretch to think the government would ask tech companies to monitor emails and messages for extremist content?