exceptionhandler
About
- Username
- exceptionhandler
- Joined
- Visits
- 964
- Last Active
- Roles
- member
- Points
- 344
- Badges
- 0
- Posts
- 381
Reactions
-
T-Mobile secretly records iPhone screens and claims it's being helpful
seltzdesign said: I used to work in UX, and lots of companies use tools that let you track users' interactions. I could basically watch recordings in real time of people interacting with the app and see what they click on, where they browse to, etc. It isn't recording video as such, just playing back the interactions in real time. I forget the name of the tool, but a lot of companies use it, and it will be covered in their terms and conditions under the umbrella of tracking user data, not much different from Google Analytics and the like. We were not able to see anything outside of the app, but you would (or probably wouldn't, actually) be surprised how much detailed information most apps collect. -
Apple has a month to comply with EU antisteering mandate, or get fined again
-
On-device Apple Intelligence training seems to be based on controversial technology
The CSAM detection system preserved user privacy, data encryption, and more
I suppose this isn't technically wrong. The data would be encrypted, but when a threshold was met, the system would allow it to be decrypted, reviewed, and potentially then sent to authorities. I have posted several comments on this topic.
Apple's CSAM detection is not end-to-end encrypted. True end-to-end encryption requires asymmetric keys to ensure that the sender and the receiver are the only parties privy to the contents. Introducing any other mechanism to enable review by a man in the middle is essentially a backdoor into the algorithm.
But as some may say, the scanning was on device, so what's the issue with that? On-device scanning is a very useful tool; it makes finding things easier on your device. What I do take issue with is the reporting part. It's a form of surveillance of what's on your device, something that should be private.
Yes, it only scanned when a photo was sent via iCloud and only reported when a threshold was met, but that's written in software, and software can change. So when that reporting gets triggered can change. What if a government liked the results and required Apple to be stricter, to help find more positive matches?
The only correct way to think of iCloud with CSAM on-device scanning was to view your photos as being in a semi-public (effectively public) space.
On-device data should be private; communication through Apple should be considered semi-public (the data would still be encrypted in transit to Apple, but Apple would technically have full access), unless Apple specifies otherwise that it is end-to-end encrypted and that claim has been verified by a third party (true E2E, not wish-it-were E2E).
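For anyone unfamiliar with what "true E2E" means in practice, here is a minimal sketch (not Apple's actual implementation; the key names and message are made up for illustration) using CryptoKit key agreement: each end derives the same symmetric key from its own private key plus the other party's public key, so any server relaying the message only ever sees ciphertext. Add a mechanism that lets a middleman derive that key too, and it is no longer end to end.
```swift
import CryptoKit
import Foundation

// Hypothetical illustration of end-to-end encryption: private keys never
// leave the devices, only public keys are exchanged, and no intermediary
// can decrypt the payload unless a review mechanism (a backdoor) is added.

// Each party generates its own key pair locally.
let senderPrivateKey = Curve25519.KeyAgreement.PrivateKey()
let receiverPrivateKey = Curve25519.KeyAgreement.PrivateKey()

// Only the public keys are exchanged over the network.
let senderPublicKey = senderPrivateKey.publicKey
let receiverPublicKey = receiverPrivateKey.publicKey

// Sender derives a shared symmetric key from its private key and the
// receiver's public key.
let senderSharedSecret = try senderPrivateKey.sharedSecretFromKeyAgreement(with: receiverPublicKey)
let senderSymmetricKey = senderSharedSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)

// Encrypt the message; only a holder of the derived key can open it.
let message = Data("Private photo metadata".utf8)
let sealedBox = try AES.GCM.seal(message, using: senderSymmetricKey)

// Receiver derives the *same* key from its private key and the sender's
// public key, then decrypts. A relay server sees only the sealed box.
let receiverSharedSecret = try receiverPrivateKey.sharedSecretFromKeyAgreement(with: senderPublicKey)
let receiverSymmetricKey = receiverSharedSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
let decrypted = try AES.GCM.open(sealedBox, using: receiverSymmetricKey)
print(String(decoding: decrypted, as: UTF8.self))
```
-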
New Magic Mouse may listen for voice commands
xyzzy-xxx said: Reminds me of Star Trek IV
link for those who don’t know:
https://m.youtube.com/watch?v=hShY6xZWVGE
-
Apple's study proves that LLM-based AI models are flawed because they cannot reason
12Strangers said: hexclock said: Of course they can't reason. It's not a living mind. It's the illusion of intelligence.