Rayz2016

About

Banned
Username: Rayz2016
Joined:
Visits: 457
Last Active:
Roles: member
Points: 18,421
Badges: 2
Posts: 6,957
  • Samsung's new Galaxy Watch 4 models are not iOS compatible

    I want the Galaxy Watch 4 to be great and affordable. It seems to be those things. But their website doesn't even talk about the features that are important to me, like privacy. Does my health data ever leave my watch? Where is my health data stored? In Korea? North or South Korea?
    https://www.samsung.com/uk/info/privacy/

    They keep their privacy stuff in one place by the looks of it. Concerning your specific question:

    Your use of our Services will involve the transfer, storage, and processing of your personal information within and outside of your country of residence, consistent with this policy. In particular, your personal information will be transferred to the Republic of Korea. Please note that the data protection and other laws of countries to which your information may be transferred might not be as comprehensive as those in your country.

    [For European Economic Area (EEA) Residents Only]
    In addition, your use of the Services may also involve the transfer, storage, and processing of your personal information to other countries; such countries include, without limitation, countries in the European Economic Area, the United Kingdom, the United States of America, China, Singapore, Vietnam, India, Canada, the Philippines, and Japan. We will take appropriate measures, in compliance with applicable law, to ensure that your personal information remains protected. Such measures include the use of Standard Contractual Clauses to safeguard the transfer of data outside of the EEA. To request more information or to obtain a copy of the contractual agreements in place, contact us. See the CONTACT US section.

    (I added the highlight)

    Which, if I'm reading this correctly, means they can send it pretty much anywhere. Now, this is unusual, because most services keep your data on servers in the country where you signed up. For the EEA, they say they will take appropriate measures to ensure your data remains protected when it's transferred outside the EEA. That sounds very woolly to me.



  • Apple privacy head explains privacy protections of CSAM detection system

    At least Apple is discussing the solution. Facebook (as admitted by a former head of privacy) detected 4.5 million CSAM-related images, and nobody knew about it or complained about it.
    Actually Facebook made a pretty big fuss about it. We just weren’t listening. 

  • Apple privacy head explains privacy protections of CSAM detection system

    Old privacy chief:

    "Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.

    As the Telegraph reports, Apple chief privacy officer Jane Horvath, speaking at the Consumer Electronics Show in Las Vegas this week, said that this is the way that it’s helping to fight child exploitation, as opposed to breaking encryption."

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    New privacy chief:

    "The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos" The TechCruch interview quoted in the article.

    What do you think?

    I think she’s a little out of the loop. 
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    elijahg said:
    Rayz2016 said:
    elijahg said:
    Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they’re going to refuse? Of course they won’t. If any government says the data it provides are hashes of CSAM material, who’s Apple to say they’re not?
    That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

    Let’s not forget that US law requires tech companies to report incidents of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don’t need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
    So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone it wants to find in China and, under the guise of CSAM, obtain all the photos it wants of this person?

    There is literally no point in encrypting backups if Apple has betrayed the trust of its customers by inserting this spyware. What's the point of end-to-end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time?
    That isn’t how hashes work. Hashes find the exact same photograph, not a photograph that is similar. So, your imagined scenario where the US government uploads a hash of a photo of someone they are looking for and in return get all photos of that person is not how it works. The uploaded hash would only help to find positive matches of that exact same photo.

    Also, as has been mentioned several times already, everyone can opt out.
    Try reading up on what you're trying to defend, because otherwise you make yourself look pretty stupid. The matching is fuzzy: it looks for similar photos, ones that have been cropped, mirrored, had a couple of pixels changed, blurred, etc.
    Which is why Apple needs a universal key to decrypt your image file so it can be checked manually by someone you neither know nor trust.
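    The exact-match vs fuzzy-match distinction above is easy to demonstrate. Here is a minimal Python sketch using a toy 8x8 "image" and a simple average hash (a deliberately simplified stand-in for NeuralHash or PhotoDNA, not their actual algorithms): a one-pixel tweak leaves the perceptual hash essentially unchanged, while the cryptographic hash changes completely.

```python
import hashlib

# Toy 8x8 "grayscale image" (values 0-255); a real system would downscale a photo.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]

def average_hash(pixels):
    # Perceptual hash: one bit per pixel, set if the pixel is brighter than the mean.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def sha256_hex(pixels):
    # Cryptographic hash of the raw pixel bytes.
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

# Nudge one pixel slightly, as re-compression or light editing might.
tweaked = [row[:] for row in img]
tweaked[0][0] = min(255, tweaked[0][0] + 3)

hamming = bin(average_hash(img) ^ average_hash(tweaked)).count("1")
print(hamming)                                 # 0: the perceptual hashes still match
print(sha256_hex(img) == sha256_hex(tweaked))  # False: the cryptographic hashes do not
```

    Real perceptual hashes use far more robust features than a brightness threshold, but the contrast is the point: they are designed so that near-duplicates collide, which is exactly the "fuzzy" matching described above.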


    Hany Farid, one of the people who helped develop PhotoDNA, wrote an article for Wired saying:

    Recent advances in encryption and hashing mean that technologies like PhotoDNA can operate within a service with end-to-end encryption. Certain types of encryption algorithms, known as partially or fully homomorphic, can perform image hashing on encrypted data. This means that images in encrypted messages can be checked against known harmful material without Facebook or anyone else being able to decrypt the image. This analysis provides no information about an image’s contents, preserving privacy, unless it is a known image of child sexual abuse.


    Sounds like a much better idea. What it has in common with Apple’s idea is that if you don’t want to be scanned, then you don’t send it up to the server. What it has over Apple’s idea is that you know for sure there’s nothing nefarious going on, because it’s not running spy software on the client.
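    The "match without anyone decrypting the image" idea Farid describes can be illustrated with a much simpler toy: a Diffie-Hellman-style blinded comparison, where each side raises hashes to a secret exponent so a match is detectable without either side seeing the other's raw hash. This is a hypothetical sketch of the general principle only; it is not PhotoDNA, not homomorphic encryption proper, and not Apple's actual protocol, and all names and parameters here are made up for the illustration.

```python
import hashlib
import secrets

# Toy blinded matching: the server learns nothing about the photo's hash unless
# it matches an entry on the server's list. Not a production protocol.
P = 2**127 - 1  # a Mersenne prime; toy-sized, not real-world parameters

def image_hash(data: bytes) -> int:
    # Stand-in for a perceptual hash; here just SHA-256 reduced mod P.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

def blind(value: int, secret: int) -> int:
    # Exponentiation mod P commutes, so blinding order doesn't matter.
    return pow(value, secret, P)

client_key = secrets.randbelow(P - 2) + 1
server_key = secrets.randbelow(P - 2) + 1

user_photo = b"holiday snap"
known_bad = b"known harmful image"

# The client sends only a blinded hash; the server re-blinds it with its key.
token = blind(blind(image_hash(user_photo), client_key), server_key)

# The server publishes its list blinded with its own key; the client re-blinds.
server_list = [blind(image_hash(known_bad), server_key)]
client_view = {blind(t, client_key) for t in server_list}

print(token in client_view)  # False: the photo matches nothing on the list
```

    Because g^(ab) = g^(ba) mod P, the doubly blinded values are equal exactly when the underlying hashes are equal, so a match is detected while a non-matching hash stays hidden behind the client's secret exponent.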


  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    elijahg said:
    Rayz2016 said:
    elijahg said:
    Bollocks. So when the Chinese government tells Apple to add a heap of CCP-provided hashes, they’re going to refuse? Of course they won’t. If any government says the data it provides are hashes of CSAM material, who’s Apple to say they’re not?
    That's the great thing about the CSAM material; it's just hashes. In some countries it could be kiddie porn; in other countries it could be photos taken by the police at a protest march. And in those countries, Apple won't be the only ones checking the pictures.
    CSAM is not just hashes. Where did you get that idea? The hashes that Apple will compare against come from NCMEC, where the actual images are stored. The hashes are created from the images. Are we supposed to believe that NCMEC will now just accept a hash from any government that feels like sending it over without a corresponding image to go along with it?

    Let’s not forget that US law requires tech companies to report incidents of CSAM. Also, using iCloud Photo Library is opt-in, so people who are worried about their photos being matched to a hash don’t need to opt in.

    Gruber posits that doing the check client-side, rather than server-side, will allow them to fully encrypt iCloud backups.
    So you think China will be happy with Apple using hashes from NCMEC? Where the US government could insert hashes of someone it wants to find in China and, under the guise of CSAM, obtain all the photos it wants of this person?

    There is literally no point in encrypting backups if Apple has betrayed the trust of its customers by inserting this spyware. What's the point of end-to-end encryption if the spyware is already on the device pre-encryption? How long until it scans all files on your phone before syncing to iCloud? How long before it scans all files all the time?
    Quite. 

    It’s still end-to-end, but we just scan and log it before the end begins … so to speak.