maestro64

About

Username
maestro64
Joined
Visits
158
Last Active
Roles
member
Points
4,923
Badges
2
Posts
5,043
  • Ex-Facebook security chief says Apple should do more to protect children

    Coming from a company that was just caught not protecting kids from its own product, and that knew it was harmful to teen girls, this sounds like "don't look here, look over there, they're the bad ones." When someone accuses you of bad things, it's often because they're the one doing the bad things.
  • Apple adds original Apple Watch to list of vintage products

    Still have my SS Series 0 and have been debating what to do with it, since I recently bought a Series 6 after the wife and I wanted to use the new feature the Series 0 did not support. 

    I guess I need to put it on display. 
  • EFF protesting Apple CSAM identification programs on Monday evening

    jungmark said:
    maestro64 said:
    Anyone who thinks Apple scanning your phone for images is a good idea fails to see the bad downstream effects this can have. 

    I've shared this before: it only works for images that are already known and stored in the CSAM database, the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it can catch new images not in the database, then it is looking for features, which could also catch other, innocent photos of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

    No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the smart and dangerous people who are creating the original images.

    You are free to give away your own privacy, but you're not free to give away everyone else's. I am glad to see the EFF standing up.
    The hashes are obtained from at least two separate groups. The hashes will need to be pushed to your phone in a software update. A human team will review flagged images only if you reach a threshold. Let's be honest: if you reach that threshold, you're not innocent. Only then will a special team refer you to the authorities. The authorities will investigate, since hashes cannot be used as evidence. 
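    The threshold mechanic described above can be sketched roughly as follows. This is a hypothetical illustration, not Apple's implementation: the hash values and the threshold number are made-up placeholders, since Apple did not publish the real parameters.

```python
# Hypothetical sketch of threshold-based flagging: an account is
# referred for human review only once the number of database matches
# reaches a threshold. All names and values here are assumptions.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the CSAM hash database
THRESHOLD = 2                            # stand-in for the unpublished threshold

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the database."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def flag_for_review(photo_hashes, threshold=THRESHOLD):
    """Flag the account only once the match count reaches the threshold."""
    return count_matches(photo_hashes) >= threshold

print(flag_for_review(["a1b2", "zzzz"]))          # one match: not flagged
print(flag_for_review(["a1b2", "c3d4", "zzzz"]))  # two matches: flagged
```

    A single stray match therefore does nothing; only an accumulation of matches trips the review step.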

    You give up some piece of your privacy every time you walk out of your house. Your neighbors may have Ring doorbells. Traffic cams. ATMs have cameras. Businesses have cameras. People have cell phones. 
    No one has an expectation of privacy outside their home, whether there are cameras there or not. However, you do have an expectation of privacy inside your home and on your property. This includes your car, the things in your house, and the things on your person. You also have an expectation of privacy on the phone, and even when talking to someone in public. This is why you cannot record a third party's conversation when you are not part of it. You can listen to their conversation, but if you try to report to the police that they were talking about something illegal, it may be considered hearsay, since you were not involved in the conversation. The fact that you do not know this is concerning; please stop trying to give other people's privacy away.

    You claim there will be a special team reviewing images; this implies that someone will have access to your phone and will be able to get the pictures off it. Let's not talk about how many laws would be broken there. Also, the system is only comparing the hash of the file, so there is nothing for a special team to review; the system is just reporting that someone has an image whose hash matches a hash in the database.
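    The hash-only comparison described above can be sketched like this. SHA-256 is used purely for illustration (Apple's NeuralHash is a different, perceptual scheme whose internals were not public), and the byte strings are invented placeholders; the point is that the match step yields only a yes/no answer, never the image content.

```python
# Sketch of file-hash matching: the matcher sees only a digest,
# not the pixels. Hash scheme and data here are illustrative.
import hashlib

def digest(data: bytes) -> str:
    """Return a hex digest standing in for a database-style image hash."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for digests distributed from the CSAM database.
KNOWN_DIGESTS = {digest(b"bytes of a known, catalogued image")}

photo = b"bytes of an ordinary user photo"

# The comparison produces only True/False; no image is exposed.
print(digest(photo) in KNOWN_DIGESTS)  # False: not in the database
```

    Under this model, a "match" is just digest equality, which is why a human review step would need some other source of image data.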

    If you think this is not complicated, go check why the music industry stopped going after individuals it claimed had illegal music downloaded onto their computers, on the theory that an IP address used by a specific house had downloaded the music. Someone finally sued the music industry, arguing that it could not prove the owner of that household actually downloaded the files just because they had that IP address. You could have had a visitor using your internet access, or someone outside could have accessed your wireless network without your permission and downloaded content. The industry also got sued for invading people's privacy by accessing their computers without permission.

     

  • EFF protesting Apple CSAM identification programs on Monday evening

    This is a complicated issue, and most people fail to see the unintended consequences, or do not care about them until they show up on their doorstep. We live in a country where we err on the side of personal privacy; however, many here think it's okay to invade privacy and cause innocent people to be questioned. This system casts a wide net and can be used for more than just catching bad people. I am all for catching bad guys, but I do not want to be swept up in the process, even innocently, since it could cost you more than you think.

    Keep in mind Apple is going to scan iCloud and your phone, not just the server side, and Apple has every right to make sure its servers are not contributing to the problem. However, it only works if there is an original source hash in the database. Yes, there are idiots who trade and share these photos, which have been in circulation for a long time. Keep in mind this does not catch the abusers, who are the people we really want to catch, since they are the ones creating the content. These are the truly bad people; catching the viewers of the content does not stop the human trafficking. The people who make this content do not store their images on public servers, and they encrypt the files so they cannot be seen or analyzed. They have been doing this for a long time, and they know how not to get caught. 

    If someone is using their phone to take images of kids, the CSAM hash will not catch these, since they are new and the database has no hash of the image to compare against. There has been no statement that the system will be able to tell someone taking an innocent baby photo apart from someone involved in human trafficking.

    Also, the images contained in the database can be modified to the point that the image hash will no longer match the one stored in the database. Take any photo you have and create a hash for it, then open the image in Photoshop, run it through a filter, and regenerate the hash: they will not match. Likewise, go in and strip all the metadata from the image, and the hash will not match.
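    The experiment described above is easy to reproduce with a cryptographic hash; no Photoshop needed, since changing even one byte of a file (as a filter or metadata edit would) produces a completely different digest. Note this models plain file hashes; a perceptual hash like Apple's NeuralHash is specifically designed to tolerate some such edits, and its robustness is a separate question.

```python
# Demonstration of the point above: a one-byte change to a file
# breaks a cryptographic hash match. Models file hashes, not
# perceptual hashes; the bytes here stand in for a real photo.
import hashlib

original = b"fake image bytes standing in for a real photo"
filtered = original.replace(b"fake", b"Fake")  # a single-byte edit

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(filtered).hexdigest()

print(h1 == h2)  # False: the hashes no longer match
```

    This is why exact-hash matching only catches unmodified copies of already-catalogued files.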

    Also, as was pointed out, any government can place the hash of any image it likes in the database and have the system report who has those images on their phone. Keep in mind the organizations doing this work get funding from governments, so they are beholden to the people giving them money.



  • EFF protesting Apple CSAM identification programs on Monday evening

    Anyone who thinks Apple scanning your phone for images is a good idea fails to see the bad downstream effects this can have. 

    I've shared this before: it only works for images that are already known and stored in the CSAM database, the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it can catch new images not in the database, then it is looking for features, which could also catch other, innocent photos of young kids, which is hardly acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

    No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the smart and dangerous people who are creating the original images.

    You are free to give away your own privacy, but you're not free to give away everyone else's. I am glad to see the EFF standing up.