IreneW

About

Username
IreneW
Joined
Visits
75
Last Active
Roles
member
Points
786
Badges
1
Posts
319
  • Security of Quebec vaccine passport app's QR codes questioned

    hexclock said:
    DAalseth said:
    That is disconcerting. They are implementing a Vaccine Passport here in BC next month. I hope they use a more secure system. It's a sad thought that there are people who will work hard to make themselves a fake VP, or buy one from someone, rather than just getting the shot for free. Reminds me of the people who spend a hundred dollars' worth of time and hassle to build a system to save thirty dollars on their taxes.
    I live in Quebec and what is sad is the implementation of a vaccine passport that will create two classes of citizens. Call it segregation, apartheid. This is 1938 Nazi Germany nothing less. BC and Ontario will follow. Canada is now known as Chinada. Shame.


    That's one of the dumbest comparisons I've ever run across. Sadly, you're not the only one to make it. Read a book. Find out what TRUE oppression and victimization is. This is not Nazism or apartheid or segregation or China. You are just flat out wrong on that.
    What about people who can’t get the vaccine, for medical reasons? Are they just shit out of luck? 
    If you can't get the vaccine due to allergy or something else, it sounds like a super-bad idea to go to a concert or other big event -- even more so if you don't know whether the others are vaccinated. Or, what do you think should be the solution?
  • Apple evaluating Apple Watch end link as basis for universal connector standard

    Isn't that the kind of broad description we would normally attribute to patent trolls? I mean, the general case they are defining is a connector.
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away: commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? If you're not using their commercial service, there's no issue. Is that not reasonable? One needn't use commercial hosting services, especially if using them for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple's attempt to adopt a self-declared law-enforcement role to “prevent crimes before they occur” is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it cannot be interpreted in such an Orwellian sense. Citizens will fight, courts will interpret.
    No idea what you're even talking about.  

    You said criminals are "not stupid enough to use iCloud", which is obviously untrue, since they're stupid enough to use Facebook.

    You said Apple are attempting to "prevent crimes before they occur", which doesn't seem to be true or even relevant.  Images of child abuse are definitely crimes that have already occurred.

    Stop using Orwellian like a trump word.  It isn't.
    This is why preventive Orwellian surveillance is not a solution. How will you distinguish a mother's baby shower photo from a child abuse photo? Not AI, I mean human interpretation. You need a context to qualify it as child abuse. The scheme as described will not provide that context. "Images of child abuse are definitely crimes that have already occurred", agreed, but if and only if they are explicit enough to provide an abuse context. What about innocent-looking, non-explicit photos collected as a result of long abusive practices? So, the number of cases Apple can flag will be extremely limited, since such explicit context will mostly reside elsewhere, on the dark web or some other media.
    Have you even bothered to read these articles? Like even bothered? They do NOT evaluate the subject of your photos. They are specific hash matches to *known* child pornography, cataloged in the CSAM database. 

    Seriously fucking educate yourself before clutching your pearls. If you can’t read the article you’re commenting on, try this one:

    https://daringfireball.net/2021/08/apple_child_safety_initiatives_slippery_slope
    Apparently you fucking educated yourself enough to still not understand that an innocent-looking photo may still point to child abuse, but Apple's scheme will miss it, thus it is ineffective. Crime is a very complex setup; it cannot be reduced to a couple of hashes.
    No one is claiming that this system will solve the problem of child abuse. 
    Well, Apple _could_ improve even further by implementing more intelligent, AI based, scanning. Something along the lines of what Google and others are using. Or even just use Google's implementation:

    "How does Google identify CSAM on its platform?

    We invest heavily in fighting child sexual exploitation online and use technology to deter, detect and remove CSAM from our platforms. This includes automated detection and human review, in addition to relying on reports submitted by our users and third parties such as NGOs, to detect, remove and report CSAM on our platforms. We deploy hash matching, including YouTube’s CSAI match, to detect known CSAM.  We also deploy machine-learning classifiers to discover never-before-seen CSAM, which is then confirmed by our specialist review teams. Using our classifiers, Google created the Content Safety API, which we provide to others to help them prioritise abuse content for human review. 

    Both CSAI match and Content Safety API are available to qualifying entities who wish to fight abuse on their platforms. Please see here for more details."
    (From https://support.google.com/transparencyreport/answer/10330933#zippy=,how-does-google-identify-csam-on-its-platform)

    Why stop at known hashes, if this approach is so good?
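    [Editor's note: the hash-matching scheme the two quoted posts describe — flagging only exact matches against a database of *known* images, never interpreting the image content — can be sketched roughly like this. All names here (KNOWN_HASHES, is_known) are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashes (PhotoDNA, NeuralHash, CSAI Match) that real systems use so that resized or recompressed copies still match.]

    ```python
    import hashlib

    # Hypothetical database of digests of *known* flagged images.
    # This entry is the SHA-256 of the bytes b"foo", used purely for illustration.
    KNOWN_HASHES = {
        "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
    }

    def file_hash(data: bytes) -> str:
        """Return the SHA-256 digest of an image's raw bytes."""
        return hashlib.sha256(data).hexdigest()

    def is_known(data: bytes) -> bool:
        """Flag only exact matches against the known-hash database.
        The image content itself is never looked at or classified."""
        return file_hash(data) in KNOWN_HASHES

    print(is_known(b"foo"))         # True: digest is in the database
    print(is_known(b"baby photo"))  # False: any unlisted image is never flagged
    ```

    [This is also why the "baby shower photo" objection cuts the other way: a photo not already in the database can never match, no matter what it depicts. The trade-off is that a cryptographic hash like SHA-256 changes completely if a single pixel changes, which is why production systems use perceptual hashes instead.]
    
    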
  • Apple warns against AirTag replacement batteries with bitter coatings

    macgui said:
    IreneW said:
    This just seems like a stupid oversight from Apple's side. I guess there will be a quick rev 2.0. No big deal, just hope no one is getting hurt in the meantime.
    Not a stupid oversight on Apple's part. A stupid overreaction on Oz's part. No rev 2 with a screw or glue. Just use common sense (an oxymoron for some people) and keep small choking hazards away from children.

    Warning: AirTag, the battery cover, and the battery might present a choking hazard or cause other injury to small children. Keep these items away from small children.

    You _really_ don't get it, do you? The tag is specifically made to be attached to bags and other stuff that are routinely placed on the floor, on chairs and by the bedside where small children can easily reach them. Or on cats and dogs - shouldn't the children be allowed to come near the kittens?
    Companies, including Apple, need to follow some good design rules - not necessarily because you might be sued otherwise (which, I admit, is a stupid practice, but almost exclusively relevant in the USA), but because it makes sense to reduce the risk of accidents. This exact problem has been solved in thousands of products by a tiny screw. I bet Apple will find a more elegant solution, but if they can't, just add the damn screw.
    Why make such a fuss about Apple f@cking this up?
  • Apple warns against AirTag replacement batteries with bitter coatings

    crowley said:
    columbia said:
    Here we go again, paying the price for irresponsible people. That's why we have all those labels, kill switches and safety switches all over our lawnmowers, washing machines and everything else. It's now found its way to our AirTags.
    It very specifically hasn't found its way to AirTags, because you can't use the bitterant batteries with AirTags, according to Apple.
    You don't get it.  Some retailers in Australia have pulled AirTags from their shelves because of fear of the regulators.  This is the "price" columbia is referring to.
    Well, it seems the fears are well founded - it _is_ easy to open the battery compartment, and the tag is specifically made to be fastened to bags and other stuff that are lying around. Considering how similar products are usually designed, this just seems like a stupid oversight from Apple's side. I guess there will be a quick rev 2.0. No big deal, just hope no one is getting hurt in the meantime.