I'm surprised the EU doesn't start fining these companies, as this is surely a violation of GDPR. Personal data should only be used for the purposes that were stated up front. I assume video of someone in their home is considered personal data, and training AI is not the intent most people have for their security cameras. If anything, a user should be alerted that this specific 30-second clip is considered a false alarm, and a request made to use it for training. People need to be given full control of their data.
Right. Maybe alert the user, ask for permission to use the clip for training, and then let the user access a training tool that they can complete themselves. Is it really so difficult to use the training software that some stranger needs to review the clips?
We don't know.
Apple feels it is more qualified, or in a better position, to review users' questionable or failed voice commands than the owners are. Apple could alert the user and ask them to access a training tool to complete the review themselves, but that's presumably not a good option.
Maybe it’s not at all on the same level, but I “train” the Photos app to learn people’s faces. It’s pretty easy, and once it gets going, Photos generally gets to know which face belongs to which person. I can also use the search function in Photos to find items in photos, no reviewer necessary, all processed on device. Not only that, but I can have Photos find videos of certain things. For instance, I can search “beach videos” and I’m presented with videos I took at or near the beach. Again, on device, no reviewer necessary.
I feel like it must be possible to do something similar with videos from a security camera. Maybe there are different categories that would need to be covered, but, again, it doesn’t seem to be that difficult. Perhaps this will be an option going forward.
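For a rough idea of what's already doable on device, here's a minimal Swift sketch using Apple's Vision framework to classify a single frame locally, which is the same kind of processing Photos search relies on. The file path and the 0.5 confidence cutoff are placeholders I made up, not anything a vendor actually ships:

    import Foundation
    import Vision

    // Classify one camera frame entirely on device; nothing is uploaded
    // and no human reviewer is involved.
    let frameURL = URL(fileURLWithPath: "/tmp/camera-frame.jpg")  // placeholder path

    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: frameURL, options: [:])

    do {
        try handler.perform([request])
        let observations = (request.results ?? [])
            .compactMap { $0 as? VNClassificationObservation }
        // Keep only reasonably confident labels; 0.5 is an arbitrary cutoff.
        for obs in observations where obs.confidence > 0.5 {
            // Prints e.g. "beach: 0.87" from Vision's built-in label taxonomy.
            print("\(obs.identifier): \(obs.confidence)")
        }
    } catch {
        print("Vision request failed: \(error)")
    }

That obviously isn't a full "was this a false alarm?" classifier, but it shows the on-device building blocks exist without sending clips to a stranger.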