Good luck with the dawn of AI and the ability to manipulate and create millions of pictures with no context or intelligence behind them: just press the button and go. Within 10 years, who will be able to tell what's real and what's not?
10 years? We already have deepfakes now, and it's only going to get worse as time goes on and the technology gets better and easier to use.
The other issue many need to be thoughtful about is the growing community of naturists, aka NUDISTS, and adults are not its only participants. Back in the '80s and '90s (I was born in 1980), I was personally drawn to that lifestyle, mostly because of my autism: I have (or had) a texture issue and hated the feel of fabric and most other clothing materials on my skin. I would rather have walked around naked, and that was when I was a CHILD! Imagine, today, other children or teens who struggle the same way, or who simply take pride in being nude or in their bodies. Not everyone is inclined to wear clothing for the sake of the "moronic modesty" originally enforced by religious nutcases hundreds or thousands of years ago. Over the past 50+ years, nudist communities have been growing at a steady and increasing rate. If CSAM detection is enforced as wantonly and indiscriminately as HEAT wants, most families who live the nudist lifestyle will be arrested within minutes or days of uploading photos of their lives and vacations.
Also, I am completely against this kind of CSAM scanning, not because I do not care about the safety of children, but because too many things that are innocent or lack context can easily be misused as "material" to convict someone of child sexual abuse.
Now, another issue to consider. Before I comment on it, let me first say that I agree with the 17+/18+ laws for what counts as a consenting adult. That said, the other issue is that teens will be teens, and they will sext or do the things they want to do within their own age groups. However, though we as a society have settled on 18+ as the primary age of adulthood, we must remember that medically and biologically, females are technically adults the moment they have their first menstrual cycle (somewhere between ages 9 and 13), plain and simple. Cultures all over the planet taught this for hundreds to thousands of years before religion and social morality got involved. Girls, women, females mature faster than males; they always have and almost always will. The 18+ laws were not made to protect the bodies of those who have matured for reproduction, but to give both females and males the opportunity to mature mentally for those few to several years. But technology, the loss of privacy, and the reach of interaction beyond their own community have made it impossible to let them grow both physically and mentally over the long game. Instead, it pushes them to mature mentally sooner rather than later, and unfortunately that also means that when they think they are physically mature enough, they seek out those willing to help them gain experience.
Again, females tend to mature faster, but not always. They do need time, and they should have it. Technology does not allow for that anymore, and that means they want to share, learn, and experience the world and their bodies. Males, on the other hand, lmfao, will never be mentally adult; we all know we males will remain mentally immature for all of eternity. Again, thanks to technology, there is no such thing as privacy anymore, and no such thing as keeping those under 18 from seeking out what is so readily available. We can try to impose restrictions, laws, and whatever else is out there, but that will not solve the inherent curiosity of the teenage brain. Look at all of us before we had the internet: we still used every means at our disposal to explore the "forbidden" and the "taboo." There always were, and always will be, ways to share these things, learn these things, and experience these things.
As for CSAM: even with technology added to protect our children, these kinds of tools and protective parameters on any device will only push children to go outside their devices to do what they want, or, if they are bold enough to keep using their devices for it, they will eventually find loopholes, build better apps, write their own code, and protect themselves FROM the very protections that so-called agencies like HEAT try to implement and enforce. On top of that, HEAT is only pushing privacy into a corner until it disappears completely. And at some point, the teens and children who happen to have higher mental maturity (much like I had at a very young age, due to my Asperger's, aka autism) will still seek out people their own age, or older, of their own volition, and in doing so will get flagged by these so-called CSAM protective protocols and end up thrown in jail or incarcerated for nothing, all because they are teens who are beyond curious, insightful, and persistent about getting what they want or need.
“Apple doesn't "scan" user photos stored on device or in iCloud in any way.”
Well, this is a patently false statement. If Apple weren't scanning photos in any way, how could the For You Memories feature in the Photos app exist? All our photos are being scanned for content, period. This is such a farce it's sickening.
A new article just popped up on AppleInsider about Apple's use of AI (the sorta scarier kind).
In part it talks about the use of AI in the Photos app: “Facial recognition in images is possible thanks to machine learning. The People album allows searching for identified people and curating images.
An on-device knowledge graph powered by machine learning can learn a person's frequently visited places, associated people, events, and more. It can use this gathered data to automatically create curated collections of photos and videos called 'Memories.'”
You don’t have “curated collections” without content scanning.
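For anyone who doubts that, here is a minimal sketch of the kind of on-device analysis a People album or a Memories collection implies. It uses Apple's public Vision framework purely as an illustration; it is not Apple's internal Photos code, and the helper names (PhotoAnalysis, analyzePhoto, curate) are hypothetical. The point is that you cannot group photos by faces or themes without reading the pixel content of every photo.

```swift
import Foundation
import Vision

// Hypothetical result type: what one on-device pass over a single photo can yield.
struct PhotoAnalysis {
    let url: URL
    let faceCount: Int        // raw ingredient of a People-style album
    let topLabels: [String]   // raw ingredient of theme-based Memories ("beach", "snow", ...)
}

// Analyze one image file with two standard Vision requests: face detection
// and whole-image classification. Both run entirely on the device.
func analyzePhoto(at url: URL) throws -> PhotoAnalysis {
    let faceRequest = VNDetectFaceRectanglesRequest()
    let classifyRequest = VNClassifyImageRequest()

    // The handler decodes the image and runs the requests against its pixels.
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([faceRequest, classifyRequest])

    let faceCount = faceRequest.results?.count ?? 0
    let topLabels = (classifyRequest.results ?? [])
        .compactMap { $0 as? VNClassificationObservation }
        .filter { $0.confidence > 0.8 }   // keep only confident labels
        .prefix(3)
        .map { $0.identifier }

    return PhotoAnalysis(url: url, faceCount: faceCount, topLabels: topLabels)
}

// "Curation" is then just grouping by what the analysis found, e.g. every
// photo that contains at least one face and matches a theme label.
func curate(_ analyses: [PhotoAnalysis], theme: String) -> [URL] {
    analyses
        .filter { $0.faceCount > 0 && $0.topLabels.contains(theme) }
        .map { $0.url }
}
```

Whether that analysis runs on the device (as the quoted article says it does) or in the cloud is a separate question; either way, the "curated collections" described above require exactly this sort of content analysis.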