Apple has announced a series of new features targeted at child safety on its devices. None of the features is live yet; they are expected to arrive later this year.
Apple's intentions behind these features are almost universally seen as good ones: protecting children and limiting the spread of Child Sexual Abuse Material (CSAM).
Apple's Head of Privacy shared details about the features, as well as the tactical and strategic issues surrounding them.
CSAM detection in iCloud Photos – A detection system called NeuralHash creates identifiers for images that can be compared against identifiers of known CSAM provided by the National Center for Missing and Exploited Children (NCMEC) and other entities, in order to detect known CSAM content in iCloud Photo libraries. Most cloud providers already scan user libraries for this material; Apple's system is different in that it does the matching on device rather than in the cloud.
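Conceptually, the on-device step boils down to comparing a locally computed image identifier against a set of known-CSAM identifiers before upload. The Swift sketch below is only an illustration of that idea under assumptions: the type and function names (`ImageHash`, `loadKnownHashes`, `shouldFlagForReview`) are hypothetical, and the real NeuralHash algorithm and Apple's private matching protocol are far more involved.

```swift
import Foundation

// Hypothetical stand-in for a perceptual hash such as NeuralHash. The real
// NeuralHash output and Apple's matching protocol are far more involved;
// this only sketches the on-device flow.
struct ImageHash: Hashable {
    let bytes: [UInt8]
}

// Placeholder loader: in the announced design, the database of known-CSAM
// identifiers from NCMEC and other organizations ships with the system.
func loadKnownHashes() -> Set<ImageHash> {
    return []
}

let knownHashes = loadKnownHashes()

// Stand-in hash function: a real perceptual hash is robust to resizing,
// cropping, and re-encoding; truncating bytes is not.
func perceptualHash(of imageData: Data) -> ImageHash {
    return ImageHash(bytes: Array(imageData.prefix(32)))
}

// The key point from the article: the comparison happens on the device,
// before upload to iCloud Photos, rather than on Apple's servers.
func shouldFlagForReview(_ imageData: Data) -> Bool {
    return knownHashes.contains(perceptualHash(of: imageData))
}
```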
Communication Safety in Messages – A feature that a parent can opt to turn on for a minor on their iCloud Family account. It will alert children when an image they are about to view has been detected as explicit, and it tells them that the parent will also be alerted.
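The described flow amounts to a simple decision sequence: classify the image on device, warn the child first, and only then notify the parent if the child chooses to proceed. The sketch below is an assumed illustration of that sequence; the names (`ImageVerdict`, `classify`, `handleIncomingImage`) are hypothetical, and the real feature's classifier and parental-notification mechanics are not public.

```swift
import Foundation

// Hypothetical verdict from an on-device classifier; Apple's actual model
// and its outputs are not public.
enum ImageVerdict {
    case ok
    case explicit
}

// Stand-in for on-device ML classification of an incoming image.
func classify(_ imageData: Data) -> ImageVerdict {
    return .ok
}

// Sketch of the announced flow: the child is warned first and told that
// viewing the image will also alert the parent on the iCloud Family
// account. Returns whether the image should be displayed.
func handleIncomingImage(_ imageData: Data,
                         childChoosesToView: () -> Bool,
                         notifyParent: () -> Void) -> Bool {
    guard case .explicit = classify(imageData) else {
        return true // Nothing explicit detected; display normally.
    }
    // Warn the child and disclose that the parent will be alerted.
    if childChoosesToView() {
        notifyParent()
        return true
    }
    return false // Child declined; the image stays hidden.
}
```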
Interventions in Siri and Search – A feature that will intervene when a user tries to search for CSAM-related terms through Siri or Search, informing the user of the intervention and offering resources.
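In spirit, this intervention is a check on the query before results are returned. The following is an assumed sketch only; the term list, matching logic, and resource text are placeholders, not Apple's implementation.

```swift
import Foundation

// Placeholder term list; Apple's actual matching logic is not public.
let flaggedTerms: Set<String> = ["example-flagged-term"]

// Returns intervention text instead of results when a query is flagged,
// mirroring the described behavior of informing the user and offering help.
func handleSearchQuery(_ query: String) -> String {
    let normalized = query.lowercased()
    if flaggedTerms.contains(where: { normalized.contains($0) }) {
        return "This topic can be harmful. Resources and help are available."
    }
    return "(normal search results)"
}
```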