Apple announces service to auto-scan iCloud images for child porn
Yesterday, Apple announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).
Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers had already begun expressing concerns about how Apple’s new image scanning protocol could be used in the future, as the Financial Times noted.