Apple Inc. (NASDAQ: AAPL) is launching new child safety features across its Messages app, iOS, iPadOS, and macOS to help law enforcement fight child sexual abuse.
According to Apple, the new child safety features will expand protections for children “against predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material (CSAM), which depicts sexually explicit activities involving a child.”
Apple to report images of sexual abuse stored in iCloud to law enforcement
Apple said one of the new features uses detection technology in iOS and iPadOS to report CSAM images stored in iCloud Photos to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement agencies across the United States.
The tech giant’s CSAM detection technology uses a process called hashing to maintain user privacy. Hashing is a process in which any piece of content or material is converted into a unique number, called a hash.
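Apple’s actual system relies on a proprietary perceptual hashing algorithm (NeuralHash), which is not publicly available. As a simplified sketch of the general idea of hashing, the example below uses Python’s standard SHA-256 function: identical content always produces the same hash, so two files can be compared by their hashes without inspecting the content itself. The function name `hash_content` is illustrative, not part of any Apple API.

```python
import hashlib

def hash_content(data: bytes) -> str:
    # Convert arbitrary content into a fixed-length hash value.
    # Identical input always yields the identical hash, so matching
    # can be done on hashes alone, without viewing the content.
    return hashlib.sha256(data).hexdigest()

a = hash_content(b"photo-bytes")
b = hash_content(b"photo-bytes")
c = hash_content(b"different-photo")

print(a == b)  # True: same content, same hash
print(a == c)  # False: different content, different hash
```

Note that a cryptographic hash like SHA-256 changes completely if even one byte of the input changes, whereas a perceptual hash such as NeuralHash is designed so that visually similar images produce similar hashes; the sketch above only illustrates the matching-by-number principle.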