crowley

I don't add "in my opinion" to everything I say because everything I say is my opinion.  I'm not wasting keystrokes on clarifying to pedants what they should already be able to discern.

About

Banned
Username: crowley
Joined:
Visits: 454
Last Active:
Roles: member
Points: 11,743
Badges: 2
Posts: 10,453
  • Kanye West to hold third 'Donda' event on August 26 in Chicago

    What's Kanye going to do?  No one knows, least of all Kanye!
    AI_lias
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    crowley said:
    chadbag said:
    I’ll admit to not reading all 138 (as of now) replies.  However, this article is misleading when it claims this is not new tech and Apple is just catching up with others who have been doing this for a decade.  The fact is, this IS new technology and Apple is the first to use it.  Google, Dropbox, Microsoft, etc. are NOT scanning on my private device.  They scan on their servers when you upload to them.  The difference is that Apple is putting this on device.  And once on device, it can be updated to check anything they want.  All the safeguards Apple claims are just policies, and policies can easily be changed.
    The same technology on device versus on-server is still the same technology, just in a different place.

    radarthekat said:
    So simply stated.  How can so many still be so confused?  Boggles the mind. 
    No, it is NOT as simple as that. There is a difference, and it is an important one. Apple conducting a search for CSAM in iCloud is "searching for illegal content within their property". Apple conducting a search for CSAM on an end user's phone is "invasion of the privacy of the individual". The scope of the former CANNOT be expanded in the future to other content not uploaded to iCloud. The scope of the latter CAN be expanded in the future to other content on the phone.
    You do know it’s applied only to images to be uploaded to iCloud.  Which is an important distinction in two respects: 1) a user can avoid this by not uploading images to iCloud, and 2) it allows Apple to ensure that it’s not hosting illegal child abuse images on its servers.
    Agreed, it is applied only to images to be uploaded to iCloud - FOR NOW. It still has the possibility of "scope creep" (scanning other content on the phone which is not marked for upload to iCloud) IN THE FUTURE (particularly in China/Russia/Saudi Arabia). Which is why it is called a "backdoor" by the people who can think critically and understand how these features start off (a noble cause) and how they expand in scope (surveillance) in the long run. If Apple were to scan for CSAM content only in iCloud, there would be NO possibility of "scope creep" (scanning of other content on users' phones) in the future. That is an important distinction on this topic.
    "OMG, what if!" literally applies to every possible thing you can think of. What if Russia makes Apple upload Spotlight databases for all Russian users! Sounds scary, right? I can make up a ton of these. It doesn't mean it'll happen.
    So, are you of the type "I don't need privacy because I have nothing to hide. Like me, no one needs privacy, because if they have something to hide, they deserve to be punished anyway"?

    Apple can search within their own property (iCloud) all they want. But they should NOT search within end users' phones WITHOUT a warrant.
    The photo is my property whether it's on my phone or in iCloud.  But if I want to place it in iCloud Photos then Apple have the right to check it to make sure it's something they're willing to host.  And it makes no difference whether that check is before or after the transfer; it's Apple's service, I'm requesting to use it, and they're negotiating the contract.
    No, it DOES make a difference, which is what you guys are NOT getting. If someone enters an airport with illegal drugs/weapons, the police have every right to arrest that person because the search happens on the authority's premises. But the same person cannot be arrested by the police for possession of illegal drugs/weapons in his own house without a warrant. The police cannot go and check everyone's house for possession of illegal drugs/weapons without a warrant. Same person, same contraband - but depending on where the search is done, a court-issued warrant becomes relevant or irrelevant.

    Apple searching for CSAM on people's phones is similar to this - the police searching for illegal items in everyone's house without a warrant. When you say "it makes no difference whether the check is before or after the transfer", that is incorrect. There IS a difference, and it is an important one. One is a blatant violation of privacy; the other is not. You guys would be better off thinking through this distinction before commenting. The good news is that most people DO get it and are upset about it, and rightly so.
    You're right, I'm not getting it.  For one, Apple are not the police, and everything is done by agreement with Apple.  If you don't want this to happen then turn off iCloud Photos and Apple will not hashcheck any of your photos.  Moreover, this only happens when a photo is in the process of being transferred to iCloud Photos, i.e. Apple are not scanning generic data on your phone; they are, explicitly as part of the transfer process, doing some routine checks, akin to validating datatypes, checksums, or scanning for viruses or otherwise malicious code.  The "warrant", as you put it, is invoked by your request to put something into iCloud Photos, and that'll be within Apple's terms and conditions for using the iCloud Photos service.

    And ultimately there is no practical difference.  You want the photo in iCloud, the photo gets checked, the photo is in iCloud.  That's the exact same effect whether the hashcheck was on device or on server.  The ruleset is the same, the technology would be pretty much the same; it's just the location that is different.  Zero user impact, absolutely no difference in impact to privacy.  This theoretical distinction is completely arbitrary; it makes no difference.
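
    For what it's worth, the check being argued about is just a membership test against a database of known hashes. A minimal sketch in Python (all names hypothetical; this is not Apple's actual NeuralHash pipeline, which also involves blinded hashes, threshold secret sharing, and human review) shows why the logic is identical wherever it runs:

    import hashlib

    # Stand-in only: a real system uses a perceptual hash so that visually
    # similar images map to the same value; SHA-256 is used here purely to
    # keep the sketch self-contained and runnable.
    def image_hash(image_bytes):
        return hashlib.sha256(image_bytes).hexdigest()

    # The identical function could be called on the device before upload or
    # on the server after upload; only the call site changes, not the check.
    def matches_blocklist(image_bytes, known_hashes):
        return image_hash(image_bytes) in known_hashes

    # Hypothetical usage - the same result from either call site:
    blocklist = {image_hash(b"known-flagged-image-bytes")}
    matches_blocklist(b"known-flagged-image-bytes", blocklist)  # True
    matches_blocklist(b"my-holiday-photo", blocklist)           # False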
    fastasleep
  • Apple to pursue copyright claims against Corellium in appellate court

    lkrupp said:
    centaur said:
    Is Apple being sincere or are they trying to hide things?
    This outfit is profiting off of Apple’s IP and not paying for it. Even if their intentions are good, it’s still Apple’s IP.
    It's not at Apple's expense; in fact, it's rather to Apple's benefit.  This hostile legal pursuit would be far better resolved with a licensing agreement.
    xyzzy-xxx
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    So, are you of the type "I don't need privacy because I have nothing to hide. Like me, no one needs privacy, because if they have something to hide, they deserve to be punished anyway"?

    Apple can search within their own property (iCloud) all they want. But they should NOT search within end users' phones WITHOUT a warrant.
    The photo is my property whether it's on my phone or in iCloud.  But if I want to place it in iCloud Photos then Apple have the right to check it to make sure it's something they're willing to host.  And it makes no difference whether that check is before or after the transfer; it's Apple's service, I'm requesting to use it, and they're negotiating the contract.
    fastasleep
  • Apple giving iOS 15 users choice between new and old Safari design

    I hope that Safari in macOS also allows you to turn the tinting off.  Every time Apple adds this dumb UI flair to anything, whether it be the menu bar, unnecessary animations, or transparency, I look to turn it off straight away.  I want a computer, not a visual feast.

    And stop hiding functions behind menus too; buttons are useful.
    elijahgbaconstang