Apple finally discontinues CSAM detection plans

Apple is said to have dumped its ill-conceived plan to scan users’ photos stored in iCloud for child sexual abuse material, or CSAM, amid a renewed privacy push.


These safety tools, announced in August 2021, were meant to flag illicit content while preserving privacy. But the plans drew widespread criticism from digital rights groups, which argued that the surveillance capabilities were ripe for abuse.

Apple put the plans on pause a month later. Now, more than a year after its announcement, the company has no plans to move forward with the CSAM-detection tool…

Apple says the best way to prevent the online exploitation of children is to interrupt it before it happens. The company pointed to new safety features it rolled out in December 2021 that take this approach.

The company also announced Wednesday that it will offer full end-to-end encryption for nearly all of the data its users store in its global cloud-based storage system, making it more difficult for hackers, spies, and law enforcement agencies to access sensitive user information.

Via: Fox Business
