mfryd
About
- Username
- mfryd
- Joined
- Visits
- 57
- Last Active
- Roles
- member
- Points
- 726
- Badges
- 1
- Posts
- 274
Reactions
-
Crash-prone HBO Max Apple TV app will be overhauled by end of 2021
22july2013 said: Can someone tell me why companies like this require a standalone app instead of using the built-in Apple TV app? I really don't know the reason, but I presume it's to spy on and track their users better than Apple allows.
- Integration with the Apple TV app only works on Apple TV; they would still need to maintain apps for other streaming boxes. An independent app reduces development time and maintenance costs, since they can maintain a common code base across all platforms.
- Having the same app across all platforms makes it easier to provide customer service: your agents only need to be trained on one app.
- Having a separate app lets them offer features not available in the Apple TV app, which can set them apart from competing services.
- The Apple TV app includes content from competing services. This can lead to viewers preferring those services and canceling this one.
-
Apple details user privacy, security features built into its CSAM scanning system
Apple has previously taken the reasonable position that if it is technically possible to abuse a system, then it will be abused.
Consider the idea of a "back door" on iPhone encryption. Obviously, it would only be used in cases where there was a court order. This would require a sworn law enforcement officer to present legally obtained evidence to a judge showing reasonable cause to believe that a crime had been committed and that a search warrant should be issued. There are a number of checks and balances in this system, yet Apple knows that it can be abused.
Therefore Apple has refused to create a back door, so that if a court orders them to unlock a phone, they can legitimately claim that this is not something they are capable of doing. Apple knows that it is technically possible for a back door to be abused, and therefore Apple understands that if it exists, it will be abused at some point.
When it comes to a back door for searching private photo libraries, Apple is trying to create their own system of checks and balances for this new "back door". The problem is that once the back door exists, they can no longer refuse a government order on the grounds that the request is not possible. If Apple receives a court order to add additional items to the blacklist, they cannot legally refuse. Thus any government can order Apple to scan private photo libraries for forbidden images. Imagine if a Muslim country ordered Apple to add hashes of images of Mohamed to the blacklist. If Apple has a presence in that country, they are bound to follow the laws in that country.
Apple is wrong if they think they can create a back door that only Apple controls, and that they can prevent governments from tapping into it.
-
Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'
NYC362 said: Come on already. Google and Facebook have been doing this for years. Suddenly when Apple wants to do the same thing, everyone gets twisted.
It is just like the 1,000 gas-powered cars that catch fire getting not a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world.
On the other hand, Apple prides itself on protecting the privacy of their customers. A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data. They even fight court orders requiring them to add back doors to iPhone local encryption.
Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images. If you have too many blacklisted images, you will be reported to the authorities.
Initially, the blacklist will only contain child porn images. I can easily imagine a narcissistic leader ordering Apple to add to that list images which are critical of the government. Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie. Such a President would have a strong incentive to add those photos to the list. Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even for a good cause).
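The matching scheme described above (fingerprint each uploaded image, compare against a blacklist of hashes, and only report once a match threshold is crossed) can be sketched roughly as follows. This is a deliberate simplification: Apple's actual system uses a perceptual hash (NeuralHash) and a cryptographic private set intersection protocol, not a plain cryptographic hash or a visible blacklist, and the `REPORT_THRESHOLD` value here is purely illustrative.

```python
import hashlib

# Illustrative only: Apple's real system uses NeuralHash and private
# set intersection; plain SHA-256 stands in for a perceptual hash here.
REPORT_THRESHOLD = 30  # hypothetical number of matches before reporting


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash ('digital fingerprint') of an image."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(library: list, blacklist: set) -> int:
    """Count how many images in a library match blacklisted fingerprints."""
    return sum(1 for img in library if fingerprint(img) in blacklist)


def should_report(library: list, blacklist: set) -> bool:
    """An account is only flagged once matches reach the threshold."""
    return count_matches(library, blacklist) >= REPORT_THRESHOLD
```

Note that with a cryptographic hash like SHA-256, changing a single pixel defeats the match; a perceptual hash is used precisely so that visually similar images still collide, which is also what makes the blacklist's contents impossible for outsiders to audit.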
Apple fights to keep out back doors because they know that once those tools exist, at some point they will be used inappropriately by the government. While I applaud Apple's goal of fighting child abuse, I know that once this tool exists, at some point it will be used inappropriately by the government.
One of the key beliefs in the US Constitution is a healthy distrust of government. We have three branches of government, each to keep watch on the other two. We have a free press and free speech so that the public can stay informed as to what government is doing. We have free and fair elections so the people can act as the ultimate watchdog by voting out representatives that are not doing what they should be doing.
I strongly urge Apple to protect the privacy of our images by killing this feature. At best, it will merely push criminals to use other services for sharing their photos. At worst, it is a tool that can be used by the government to suppress free speech.
-
Get ready for your $3 settlement check over FaceTime needing iOS 7
-
First cannabis delivery app lands on App Store following policy changes
I wonder how the app handles payments? Most banks are federally chartered and are not allowed to knowingly service customers running an illegal business. Under federal law, pot is illegal everywhere in the USA, so banks can't take on pot sellers as customers. This is why many pot businesses are cash only. It's true that the Feds tend not to enforce the pot laws in states that allow pot, but that is not the same as it being legal.