Tim Cook talks the need for privacy and exciting AI, AR


Comments

  • Reply 21 of 38
    Speaking out of both sides of their mouth in 2021.
  • Reply 22 of 38
    GG1 Posts: 483 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    lkrupp said: As long as you understand you have no place to go for better privacy and security. NOWHERE!
    crowley said: He could not use a smartphone. Or use Android, but not use the cloud storage. Or carry on using iOS without iCloud Photos; he didn't even say he wouldn't use it, just that he didn't trust Apple. There are always options, despite your creepy and weirdly threatening ALL CAPS.
    Last year, for fun, I tried to find a dumbphone that supports 4G. I couldn't find anything. Just to be clear, to me a dumbphone runs an embedded OS, not some version of Android (like GrapheneOS, LineageOS, etc.). For such a dumbphone, a browser would be your only window to the internet (besides messaging), so you would give up a lot of convenience with no app capability. Not to mention the much better battery life with an embedded OS.
  • Reply 23 of 38
    bluefire1 said: Am I one of the few who's not excited about AR?
    lkrupp said: I put AR right up there with 5G, both nothing burgers at the present time, solutions in search of future problems, marketing at its finest.
    I liken it more to 3D (especially on Blu-ray / TV). They (the media and tech companies) certainly tried to market it as the next "big thing", but it was a complete flop... and deservedly so. Little more than a gimmick with little (if any) real value. I might even throw 60fps movies into the bin with AR and 3D. As I live in a cross between rural and suburban areas, 5G is not likely to ever come my way, and I have no first-hand experience with it.
  • Reply 24 of 38
    gatorguy Posts: 24,213 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    lkrupp said: As long as you understand you have no place to go for better privacy and security. NOWHERE!
    https://daringfireball.net/linked/2020/01/21/android-encrypted-backups
    Yes, Google Android cloud backups can be more private and secure than they are on Apple's iCloud.

    Won't help with photos, though. While those are encrypted once they hit Google's servers, they aren't E2EE.
    edited September 2021
  • Reply 25 of 38
    crowley Posts: 10,453 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    lkrupp said: As long as you understand you have no place to go for better privacy and security. NOWHERE!
    crowley said: He could not use a smartphone. Or use Android, but not use the cloud storage. Or carry on using iOS without iCloud Photos; he didn't even say he wouldn't use it, just that he didn't trust Apple. There are always options, despite your creepy and weirdly threatening ALL CAPS.
    GG1 said: Last year, for fun, I tried to find a dumbphone that supports 4G. I couldn't find anything. Just to be clear, to me a dumbphone runs an embedded OS, not some version of Android (like GrapheneOS, LineageOS, etc.). For such a dumbphone, a browser would be your only window to the internet (besides messaging), so you would give up a lot of convenience with no app capability. Not to mention the much better battery life with an embedded OS.
    There are a fair few Nokia feature phones with 4G available now, where I live at least, if that's what you really want.
  • Reply 26 of 38
    From the very beginning, one of the best aspects of VR was that it encouraged developers to try new things. The early demos/experiments included walking out along a plank high on a skyscraper, experiencing a scene from a famous anime, and a Nintendo GameCube emulator. See the problem? Apple's highly restrictive nature will not tolerate this kind of reckless creativity. They must control the fun!

    AR/VR with an Apple App Store will be like Apple TV. I have heard it said that there is just something a bit off about Apple TV. It is like Apple is watching over the shoulders of the producers, making little "suggestions" about things to avoid. It is why there will never be a show like Breaking Bad on the network. Want to be "edgy"? Then have a character swear a lot. That's edgy, right? Discuss socially sensitive issues? Not a chance.

    Apple's need to control everything hurts everyone, including customers and developers, but most of all it hurts Apple. If Apple wants to get into TV, AR, and other new, highly expressive forms of entertainment, it needs to let go. Let the creative people do what they want, even if you feel in your bones it is a mistake. Ban apps or shows after they have gone off the rails if you have no choice, but give them a chance to succeed. No one can predict which creative, edgy, distasteful, disruptive media will be the next big thing.
  • Reply 27 of 38
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
  • Reply 28 of 38
    pb Posts: 4,255 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    It is more than clear that the CSAM scan is the excuse no one would refuse. The real goal is to put in place a technology that can be used by authorities for other purposes too. Besides, how does it protect against child abuse and exploitation if it can be turned off? Now that everyone knows how this system works, real protection is going to be the exception.

    And having Apple say that they will refuse government demands to scan for other content is just ridiculous:

    Flat out liars.

    The idea of on-device scanning technology has to be completely abandoned. A corporation should never get involved in such matters.

  • Reply 29 of 38
    crowley Posts: 10,453 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    Which a government can do at any time.  Apple's iCloud Photos solution has no bearing on that.
  • Reply 30 of 38
    crowley Posts: 10,453 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    pb said: It is more than clear that the CSAM scan is the excuse no one would refuse. The real goal is to put in place a technology that can be used by authorities for other purposes too. Besides, how does it protect against child abuse and exploitation if it can be turned off? Now that everyone knows how this system works, real protection is going to be the exception.

    And having Apple say that they will refuse government demands to scan for other content is just ridiculous:

    Flat out liars.

    The idea of on-device scanning technology has to be completely abandoned. A corporation should never get involved in such matters.

    Apple never said that they wouldn't remove apps if the laws of the country required it.  Quite the opposite in fact.
  • Reply 31 of 38
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    crowley said: Which a government can do at any time. Apple's iCloud Photos solution has no bearing on that.
    Are you saying you believe a government (body) can mandate on-device image scanning?
  • Reply 32 of 38
    pb Posts: 4,255 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    pb said: It is more than clear that the CSAM scan is the excuse no one would refuse. The real goal is to put in place a technology that can be used by authorities for other purposes too. Besides, how does it protect against child abuse and exploitation if it can be turned off? Now that everyone knows how this system works, real protection is going to be the exception.

    And having Apple say that they will refuse government demands to scan for other content is just ridiculous:

    Flat out liars.

    The idea of on-device scanning technology has to be completely abandoned. A corporation should never get involved in such matters.
    crowley said: Apple never said that they wouldn't remove apps if the laws of the country required it. Quite the opposite in fact.
    Apple generally complies with local laws, and rightfully so. This is exactly the reason why they should never develop this kind of on-device scanning technology. Governments in some countries where they do business can force them to put this technology to a use different from CSAM detection, for example to track down (and persecute) dissidents, people belonging to groups considered "dangerous" or illegal (but quite legal in Western societies), etc. What would Apple do then? Go and cry to the UN about human rights? Not comply and withdraw all business from such countries? Or cave and leave human lives to the mercy of authoritarian regimes?
  • Reply 33 of 38
    crowley Posts: 10,453 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    pb said: It is more than clear that the CSAM scan is the excuse no one would refuse. The real goal is to put in place a technology that can be used by authorities for other purposes too. Besides, how does it protect against child abuse and exploitation if it can be turned off? Now that everyone knows how this system works, real protection is going to be the exception.

    And having Apple say that they will refuse government demands to scan for other content is just ridiculous:

    Flat out liars.

    The idea of on-device scanning technology has to be completely abandoned. A corporation should never get involved in such matters.
    crowley said: Apple never said that they wouldn't remove apps if the laws of the country required it. Quite the opposite in fact.
    pb said: Apple generally complies with local laws, and rightfully so. This is exactly the reason why they should never develop this kind of on-device scanning technology. Governments in some countries where they do business can force them to put this technology to a use different from CSAM detection, for example to track down (and persecute) dissidents, people belonging to groups considered "dangerous" or illegal (but quite legal in Western societies), etc. What would Apple do then? Go and cry to the UN about human rights? Not comply and withdraw all business from such countries? Or cave and leave human lives to the mercy of authoritarian regimes?
    As I said earlier and have said multiple times, such government action is entirely possible now; it is not at all dependent on Apple deploying a CSAM detection technology. So the "reason why they should never" that you give is not a reason at all.
  • Reply 34 of 38
    crowley Posts: 10,453 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    crowley said: Which a government can do at any time. Apple's iCloud Photos solution has no bearing on that.
    Are you saying you believe a government (body) can mandate on-device image scanning?
    Sure.  Why wouldn't they be able to?
  • Reply 35 of 38
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    crowley said: Which a government can do at any time. Apple's iCloud Photos solution has no bearing on that.
    Are you saying you believe a government (body) can mandate on-device image scanning?
    crowley said: Sure. Why wouldn't they be able to?
    In some countries, I am sure they can (North Korea and China are two excellent examples). In the USA, a mandate could certainly be made, but it would not be enforceable. Even if it came in the form of a law (which is quite different from a mandate, as demonstrated by the plethora of "health" mandates issued these last 20 months), there would almost certainly be an immediate injunction, as the Supreme Court (in 2014, I believe) has essentially declared cellular phones private and exempt from search without a warrant.
  • Reply 36 of 38
    pb Posts: 4,255 member
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    pb said: It is more than clear that the CSAM scan is the excuse no one would refuse. The real goal is to put in place a technology that can be used by authorities for other purposes too. Besides, how does it protect against child abuse and exploitation if it can be turned off? Now that everyone knows how this system works, real protection is going to be the exception.

    And having Apple say that they will refuse government demands to scan for other content is just ridiculous:

    Flat out liars.

    The idea of on-device scanning technology has to be completely abandoned. A corporation should never get involved in such matters.
    crowley said: Apple never said that they wouldn't remove apps if the laws of the country required it. Quite the opposite in fact.
    pb said: Apple generally complies with local laws, and rightfully so. This is exactly the reason why they should never develop this kind of on-device scanning technology. Governments in some countries where they do business can force them to put this technology to a use different from CSAM detection, for example to track down (and persecute) dissidents, people belonging to groups considered "dangerous" or illegal (but quite legal in Western societies), etc. What would Apple do then? Go and cry to the UN about human rights? Not comply and withdraw all business from such countries? Or cave and leave human lives to the mercy of authoritarian regimes?
    crowley said: As I said earlier and have said multiple times, such government action is entirely possible now; it is not at all dependent on Apple deploying a CSAM detection technology. So the "reason why they should never" that you give is not a reason at all.
    OK, then please explain how they can force mass surveillance without having automatic on-device content scans.
  • Reply 37 of 38
    xyzzy-xxx said: I don't believe anything Apple says about privacy until on-device image scanning (CSAM) is finally declared dead.
    It has already been explained that you have to opt in to iCloud Photos syncing to enable CSAM scans.
    Until such time as some government mandates the scanning of images on the device, irrespective of whether they're bound for iCloud Photos or not. Apple will then "comply with all local laws and regulations", privacy be damned.
    pb said: It is more than clear that the CSAM scan is the excuse no one would refuse. The real goal is to put in place a technology that can be used by authorities for other purposes too. Besides, how does it protect against child abuse and exploitation if it can be turned off? Now that everyone knows how this system works, real protection is going to be the exception.

    And having Apple say that they will refuse government demands to scan for other content is just ridiculous:

    Flat out liars.

    The idea of on-device scanning technology has to be completely abandoned. A corporation should never get involved in such matters.
    crowley said: Apple never said that they wouldn't remove apps if the laws of the country required it. Quite the opposite in fact.
    pb said: Apple generally complies with local laws, and rightfully so. This is exactly the reason why they should never develop this kind of on-device scanning technology. Governments in some countries where they do business can force them to put this technology to a use different from CSAM detection, for example to track down (and persecute) dissidents, people belonging to groups considered "dangerous" or illegal (but quite legal in Western societies), etc. What would Apple do then? Go and cry to the UN about human rights? Not comply and withdraw all business from such countries? Or cave and leave human lives to the mercy of authoritarian regimes?
    crowley said: As I said earlier and have said multiple times, such government action is entirely possible now; it is not at all dependent on Apple deploying a CSAM detection technology. So the "reason why they should never" that you give is not a reason at all.
    In the US, Apple has previously used "we don't have that capability" as a defense, and the courts have upheld their right not to be forced to develop additional capabilities to comply with government requests. The case where the FBI completely horked up that terrorist's phone and wanted Apple to fix things comes to mind. As long as Apple didn't have a specific ability, the government couldn't force them to develop it. I'm not sure what would happen if some country like China told Apple they had to provide on-device scanning of images.

    Now that they have developed such a capability, however, utilizing it in a way not originally intended is well within Apple's ability. And it seems exceedingly likely, in my opinion, that it's only a matter of when, not if, some government will tell them to do it.
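
The capability-repurposing argument running through the last several comments can be made concrete with a small sketch. What follows is a hypothetical, heavily simplified illustration of generic on-device hash-list matching, not Apple's actual CSAM pipeline (which, per its published design, uses NeuralHash perceptual hashing, a blinded on-device hash database, private set intersection, a match threshold, and human review). The HashDatabase type, the file paths, and the use of SHA-256 as a stand-in hash are all assumptions made purely for illustration; the only point the sketch makes is the one the commenters raise: the matching code is content-agnostic, so what gets flagged is determined entirely by whichever hash database the device is handed.

    import Foundation
    import CryptoKit

    // Hypothetical sketch only: generic matching of local files against a supplied
    // hash database. Real systems use perceptual hashes and cryptographic protocols;
    // SHA-256 here is just a stand-in so the example runs.
    struct HashDatabase {
        // Whoever supplies this set decides what the device flags.
        let targetHashes: Set<String>
    }

    func fileHash(at url: URL) throws -> String {
        let data = try Data(contentsOf: url)
        return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    func scan(_ urls: [URL], against db: HashDatabase) -> [URL] {
        // The matching logic never changes; only the database contents do.
        urls.filter { url in
            (try? fileHash(at: url)).map(db.targetHashes.contains) ?? false
        }
    }

    // Hypothetical usage: swap in a different database and the same code flags
    // entirely different material.
    let photos = [URL(fileURLWithPath: "/tmp/example.jpg")]
    let db = HashDatabase(targetHashes: [])   // populated by whoever controls the list
    print("Flagged \(scan(photos, against: db).count) of \(photos.count) items")

For context, Apple's published design layers a match threshold, cryptographic safety vouchers, and human review on top of this basic matching step; the comments above are concerned with the on-device matching step itself, which is what the sketch isolates.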