tylersdad
About
- Username
- tylersdad
- Joined
- Visits
- 58
- Last Active
- Roles
- member
- Points
- 2,020
- Badges
- 2
- Posts
- 310
Reactions
Pay up or get out: Apple's options for South Korea's App Store law
loopless said: Only people who have never developed an app for sale want this to happen. Apple takes care of everything for you; money just appears in your bank account. It's worth every penny.
Pay up or get out: Apple's options for South Korea's App Store law
rob53 said: 22july2013 said: Exciting times. I've been arguing for this for years, and that's why so many people hate me on this forum.
Of course, if you want to have your own payment system, then be prepared for Apple to start charging you a hosting fee for every download and install of that app. That's only fair, isn't it?
It's as simple as that. I won't use their infrastructure, and they won't get any money from me (other than my developer subscription).
New iOS 15 and iPadOS 15 developer tool aggressively prioritizes 5G over Wi-Fi
Internal Apple memo addresses public concern over new child protection features
loopless said:
...snip...
Using the guise of child porn as a Trojan horse (anyone who objects is tarred and feathered as being in favor of child abuse), we are having our privacy invaded in the most evil way.
The ethics of this are an absolute mess.
Internal Apple memo addresses public concern over new child protection features
tylersdad said: It starts with examining personal pictures, ostensibly to prevent child exploitation, but where does it lead? Where does it end?
Wesley Hilliard said: Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."
tylersdad said: Your answer to "where it ends" is beyond ridiculous. There is always a next. There are always enhancements. It's the nature of technology.
Mike Wuerthele said: The iCloud pictures are mathematically hashed. They are then compared to a hash database provided to Apple by the NCMEC. Apple does not have the source pictures; it has the database hashes.
There's no assessment of the image as a whole. You could feed a Word file into the hasher, and it would still generate a hash.
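The point about hashing arbitrary input can be sketched in Python. This is a generic cryptographic hash (SHA-256) for illustration only; Apple's actual system uses a perceptual image hash (NeuralHash) matched on-device, which this sketch does not reproduce, and the "flagged" digest below is entirely hypothetical.

```python
import hashlib

def hash_bytes(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw bytes.

    The hasher never "views" the content: it consumes bytes, so an
    image, a Word document, or plain text all hash the same way,
    producing a fixed-length digest.
    """
    return hashlib.sha256(data).hexdigest()

# Matching against a database is a lookup of digests, not a
# comparison of pictures (hypothetical digest set shown here).
known_digests = {hash_bytes(b"some flagged file contents")}
print(hash_bytes(b"vacation photo bytes") in known_digests)  # False: no match
```

The design point being argued over is exactly this: the comparison happens between digests, so possession of the digest database does not imply possession of, or visibility into, the source files.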
They are opening the file. They have to. Sure, it's happening on your phone, but it's still happening. It may not be a human looking at your file, but it's being opened by Apple code that generates the hash.
You cannot generate a hash without some sort of input. In this case, the input is the 1's and 0's from the file...which they MUST read to generate the hash.
I'm not entirely sure what your argument is. Seeing the 0s and 1s in a file without knowing how to decode them isn't the same as knowing what the file is.
Today they're only looking at 1's and 0's. Tomorrow? Who knows...
The bottom line is that the code MUST open the file. The code MUST read the contents of the file.
Would you consent to law enforcement searching your phone without a warrant? Some random cop stops you on the street and wants to look at your pictures to see if you've broken the law. Would you consent? Probably not. I know I wouldn't.
What Apple is doing isn't much different except that they aren't looking at the physical representation of the file.
A rose by any other name is still a rose. This is a massive violation of privacy that has the potential to go very wrong in the wrong hands.