xyzzy-xxx

About

Username: xyzzy-xxx
Visits: 65
Roles: member
Points: 850
Badges: 1
Posts: 222
  • Outdated Apple CSAM detection algorithm harvested from iOS 14.3

    Mass-flooding the system with recognized hashes produced from manipulated images of Apple employees would be very funny 😆 (a toy sketch of such a hash collision follows this list)
  • Apple giving iOS 15 users choice between new and old Safari design

    That's a good thing. Now give users a CSAM-scan-free version of iOS 15 as well, and I will update and also buy the iPhone 13 :)
  • German government wants Tim Cook to reconsider CSAM plans

    bluefire1 said:
    No matter how laudable Apple's motives may be, this is an idea which should never have come to pass.
    Totally agree. If it's really about the cloud, they would need to scan in the cloud, not put spyware onto a billion devices.

    Since scanning data on users' devices is prohibited in many countries, Apple is also in legal trouble (even if the rollout is officially US-only), because a company acts as the gatekeeper of this spyware and could change things at any time (even for specific users)!
  • German journalism association stokes fear over Apple CSAM initiative

    The German government / Bundestag just asked Apple to drop this feature!
    I am sure more will follow from other countries.

    Apple should just put the mess in the cloud; then it would not scan private data on user devices (which is prohibited in many countries).

    Currently it's just spyware / a backdoor completely controlled by a company. That is simply illegal (and dangerous if it gets hacked: think Pegasus).
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    radarthekat said:
    chadbag said:
    I'll admit to not reading all 138 (as of now) replies. However, this article is misleading when it claims this is not new tech and Apple is just catching up with others who have been doing this for a decade. The fact is, this IS new technology and Apple is the first to use it. Google, Dropbox, Microsoft, etc. are NOT scanning on my private device. They scan on their servers when you upload to them. The difference is that Apple is putting this on device. And once on device, it can be updated to check anything they want. All the safeguards Apple claims are just policies. And can easily be changed.
    The same technology on device versus on-server is still the same technology, just in a different place.
    So simply stated.  How can so many still be so confused?  Boggles the mind. 
    No, it is NOT as simple as that. There is a difference, and it is an important one. Apple searching for CSAM in iCloud is "searching for illegal content within their property". Apple searching for CSAM on an end user's phone is "invasion of the individual's privacy". The scope of the former CANNOT be expanded in the future to content not uploaded to iCloud. The scope of the latter CAN be expanded in the future to other content on the phone.
    You do know it's applied only to images about to be uploaded to iCloud, which is an important distinction in two respects. 1. A user can avoid this by not uploading images to iCloud. And 2. this allows Apple to ensure that it's not hosting illegal child abuse images on its servers.
    Until it is changed or hacked, this is applied only to images uploaded to iCloud. If you want to scan images uploaded to a cloud, just do it in the cloud!
    But NEVER on my devices. (A sketch of the same check running on device or on a server follows this list.)
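
A toy Python sketch of the hash-collision flooding idea from the first comment above. The average hash ("ahash") here is a deliberately simple stand-in for Apple's NeuralHash, and every pixel and hash value is invented for illustration; the point is only that visually different images can produce an identical perceptual hash, so crafted images could flood the matching system with false positives.

    # Toy stand-in for a perceptual hash: NOT NeuralHash, just an
    # average hash over 64 grayscale pixel values (0-255).
    def ahash(pixels):
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p >= avg else 0)
        return bits

    def hamming(a, b):
        # Number of differing bits between two 64-bit hashes.
        return bin(a ^ b).count("1")

    # Invented blocklist entry, standing in for the CSAM hash database.
    blocklist = {ahash([200] * 32 + [50] * 32)}

    # A "manipulated image": every pixel differs from the original, but
    # each stays on the same side of its image's average brightness, so
    # the perceptual hash comes out bit-for-bit identical.
    manipulated = [210, 190] * 16 + [60, 40] * 16

    h = ahash(manipulated)
    print(any(hamming(h, k) == 0 for k in blocklist))  # True: false match

Flooding would then just mean generating many such colliding images, each landing a "recognized" hash in the review pipeline without containing anything illegal; real collisions against the NeuralHash model extracted from iOS 14.3 were in fact demonstrated, which is what the comment alludes to.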
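And a minimal sketch of the on-device versus on-server placement argument from the last thread. All names here are illustrative assumptions, not Apple's actual API; in particular the real design uses private set intersection, so the device never holds a readable blocklist like this. The detection step is the same function either way; only where it runs differs.

    # Illustrative stand-ins only; none of these names are Apple's API.
    KNOWN_HASHES = {0x1F2E3D4C}  # invented stand-in for the hash database

    def matches_known_hash(image_hash):
        # The detection step itself: identical wherever it runs.
        return image_hash in KNOWN_HASHES

    def upload_from_device(image_hash):
        # On-device placement (Apple's design): check before upload.
        flagged = matches_known_hash(image_hash)  # runs on the phone
        print("uploaded, flagged =", flagged)

    def scan_on_server(image_hash):
        # Server-side placement (the Google/Microsoft model): check
        # after upload, inside the provider's own data center.
        flagged = matches_known_hash(image_hash)
        print("stored, flagged =", flagged)

    upload_from_device(0x1F2E3D4C)  # uploaded, flagged = True
    scan_on_server(0x00000001)      # stored, flagged = False

That the code is identical is chadbag's point; where it executes, and who can audit or repurpose it there, is the commenters' objection.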