jdw
About
- Username: jdw
- Visits: 261
- Roles: member
- Points: 2,965
- Badges: 1
- Posts: 1,472
Is it safe to leave your Mac plugged in and always on?
I've personally come to think it is most prudent to shut down my mid-2015 15" MBP and leave it unplugged throughout the day at home while I am at work, for the following two reasons:
1. Battery bloat is real, which I talk about in my video here.
2. If there is ever a battery fault (arguably extremely rare), a fire could be triggered in your absence, which could have catastrophic results for your home.
Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'
randominternetperson said:
crowley said:
jdw said:
tedz98 said:
The general public has no understanding of what a file hash is.
That really is the entire point, which many who are defending Apple's move are ignoring. Nothing else matters, certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated at the very least to DELAY the release until it can do PR damage control and try to win more public support. It cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and delayed until at least iOS 16. And if Apple never wins public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.
This is Tim Cook's call now. It's no doubt a hard call for him because he has played social justice warrior at times in the past, and this would no doubt seem like a step back for him. But it's a call he has to make, and make soon.
Apple details user privacy, security features built into its CSAM scanning system
Apple is still not making the majority of people feel comfortable with its plan, and as such Apple has the moral obligation at the very least to DELAY its plan until it can better convey to the public why they will be 100% protected. That remains true even if some contend this is being blown out of proportion.
Apple needs to explain IN DETAIL how it will proactively help anyone falsely accused of engaging in illicit activities, seeing that Apple would be primarily responsible for putting those falsely accused people in that situation in the first place. That discussion must include monetary compensation, among other things.
Apple needs to explain how it intends to address the mental toll on its own employees or contracted workers who will be forced to frequently examine kiddy porn to determine whether the system properly flagged an account. THIS IS HUGE and must not be overlooked! On some level it is outrageous that the very content we wish banned from the world will be forced upon human eyes in order to determine whether a machine made a mistake.
Only when these points have been adequately addressed should Apple begin implementing its plan or a variant of it. The next operating system release is too soon.
Fed expansion of Apple's CSAM system barred by 4th Amendment, Corellium exec says
wwinter86 said:
Shocking how many people do not want this feature and seem keen to protect the rights of pedophiles
Apple privacy head explains privacy protections of CSAM detection system
After reading all the debate, I have not yet been convinced it is a good idea. And here's another reason to be highly cautious about it -- a reason you'd think Apple would care about...
What about the mental state of all those Apple employees forced to look at kiddy porn all day long when it comes time for Apple to check all those flagged photos?
Seriously. I've read a lot about Facebook moderators who have to look at horrible posts all day long and suffer a mental toll from it. Imagine how much worse kiddy porn would be. And yet, Apple has a human review process. That means your job will be to look at that stuff frequently. I can only imagine the mental havoc it will wreak on good people who know the evils of that stuff, but imagine if someone in the review process were a child predator getting their jollies off the entire review process!
I think these are legitimate concerns that few are talking about in this debate. We need to consider the mental toll on Apple employees (or contractors) forced to constantly review flagged photos.