Apple will launch new features to help law enforcement fight child sexual abuse

NCMEC and other child safety organizations provided Apple with a database of known CSAM image hashes. The tech giant does not scan images in the cloud; it performs on-device matching against that database, which is first transformed into an unreadable set of hashes securely stored on users' devices.
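
The matching step can be pictured with a short sketch. This is not Apple's implementation (Apple describes a perceptual hashing and blinding scheme of its own); the `perceptual_hash` helper and the plain-set database below are assumptions used only to keep the example self-contained and runnable.

```python
# Minimal sketch, not Apple's system: match an image's hash against a
# set of known CSAM hashes entirely on the device.
import hashlib

def perceptual_hash(image_bytes: bytes) -> bytes:
    # Placeholder: a real system uses a perceptual hash that stays stable
    # under resizing or re-encoding; SHA-256 is used here only so the
    # sketch runs without extra dependencies.
    return hashlib.sha256(image_bytes).digest()

# In the real system the database is distributed in a transformed,
# unreadable form; here it is just a plain set of hashes.
known_hashes = {perceptual_hash(b"example-known-image")}

def matches_database(image_bytes: bytes) -> bool:
    return perceptual_hash(image_bytes) in known_hashes

print(matches_database(b"example-known-image"))   # True
print(matches_database(b"some-unrelated-photo"))  # False
```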

Apple’s matching process is powered by a cryptographic technology called private set intersection. The technology determines whether an image matches a known CSAM hash without the device learning the result. Instead, the device creates a cryptographic safety voucher that encodes the match result together with additional encrypted data about the image. The voucher is uploaded to iCloud Photos along with the image.
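
A rough way to picture how a voucher can stay unreadable unless the hash matches is sketched below. This is not Apple's private set intersection protocol, which relies on blinded hashes and stronger cryptography; the `make_voucher` and `try_open_voucher` helpers and the toy stream cipher are illustrative assumptions, and the sketch also omits the threshold layer described next.

```python
# Conceptual sketch only: the device encrypts a "safety voucher" under a
# key derived from the image's hash. The server can re-derive that key
# only for hashes it already knows, so vouchers for non-matching images
# stay unreadable, and the device never learns whether a match occurred.
import hashlib
from typing import Optional

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration; a real system would use an
    # authenticated cipher such as AES-GCM.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def make_voucher(image_hash: bytes, payload: bytes) -> dict:
    key = hashlib.sha256(b"voucher-key" + image_hash).digest()
    return {"tag": hashlib.sha256(b"tag" + image_hash).digest(),
            "ciphertext": keystream_xor(key, payload)}

def try_open_voucher(voucher: dict, known_hashes: set) -> Optional[bytes]:
    # The server can only derive the decryption key for hashes in its database.
    for h in known_hashes:
        if hashlib.sha256(b"tag" + h).digest() == voucher["tag"]:
            key = hashlib.sha256(b"voucher-key" + h).digest()
            return keystream_xor(key, voucher["ciphertext"])
    return None  # non-matching vouchers remain opaque

known = {hashlib.sha256(b"known-image").digest()}
v1 = make_voucher(hashlib.sha256(b"known-image").digest(), b"image metadata")
v2 = make_voucher(hashlib.sha256(b"other-image").digest(), b"stays private")
print(try_open_voucher(v1, known))  # b'image metadata': hash matches
print(try_open_voucher(v2, known))  # None: server learns nothing
```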

The tech giant also uses another technology called threshold secret sharing to ensure that it cannot interpret the contents of a safety voucher unless an iCloud Photos account crosses a threshold of known CSAM content. If that happens, the cryptographic technology allows the tech giant to review the contents of the safety vouchers associated with the matching CSAM images, disable the user’s account, and report it to NCMEC or law enforcement. Users can file an appeal if they believe their accounts have been disabled by mistake.
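
Threshold secret sharing itself is a standard technique, and Shamir's scheme is the classic construction; a minimal sketch of the idea follows. The field size, threshold, and share count are arbitrary illustration values rather than Apple's parameters; in the real system each matching voucher would carry a share of an account-level key.

```python
# Illustrative Shamir secret sharing: the secret (e.g. an account-level
# decryption key) only becomes recoverable once the number of shares
# reaches the chosen threshold.
import random

PRIME = 2**127 - 1  # toy field modulus

def make_shares(secret: int, threshold: int, count: int):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)
shares = make_shares(key, threshold=3, count=10)
# Below the threshold, reconstruction yields garbage (with overwhelming
# probability); at the threshold, the key is recovered exactly.
print(reconstruct(shares[:2]) == key)  # False
print(reconstruct(shares[:3]) == key)  # True
```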

Children and parents will receive warnings when sending or receiving sexually explicit photos

Apple said its Messages app will have a new feature that warns children and their parents when a child sends or receives sexually explicit photos.