Apple installs backdoors into iPhones
Apple has announced plans to change its operating systems in ways that sound to many security researchers like a massive privacy nightmare and an open invitation to unintended consequences.
The company argues it is doing so to protect children and limit the spread of Child Sexual Abuse Material (CSAM), but the move has raised concerns across the industry.
The two main points of concern are:
• Apple plans to scan all photos as they are uploaded to iCloud Photos and check them against a database of known CSAM maintained by the National Center for Missing & Exploited Children (NCMEC), as illustrated in the sketch after this list.
• It will also scan all iMessage images sent or received by child accounts (accounts designated as owned by a minor) for sexually explicit material. If such a user tries to send or receive sexually explicit photos, Apple will warn them and notify the parent.
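For readers curious about the mechanics, here is a minimal sketch of the general idea behind database matching: compute a fingerprint of each photo and look it up in a set of known-image fingerprints. This is a deliberate simplification, not Apple's implementation; the real system reportedly uses a perceptual hash (NeuralHash) that survives resizing and re-encoding, plus cryptographic techniques so that matches are not revealed on the device itself, whereas the plain SHA-256 digest and the `knownDigests` set below are hypothetical stand-ins.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the NCMEC known-image database.
// Real entries would be perceptual hashes, not cryptographic digests.
let knownDigests: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

/// Returns true if the photo's SHA-256 digest appears in the known set.
/// A real scanner would use a perceptual hash so that resized or
/// re-encoded copies of the same image still match.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}

// Example: check a photo's bytes before upload.
let photo = Data("example image bytes".utf8)
print(matchesKnownDatabase(photo) ? "match found" : "no match")
```

Even this toy version makes the researchers' core worry visible: whoever controls the contents of `knownDigests` controls what the device flags, and nothing in the matching code itself is specific to CSAM.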
However, security researchers, while supportive of efforts to combat CSAM, are concerned that Apple is effectively giving governments worldwide access to user data, which could go beyond what Apple currently plans, as is the case with all backdoors. While the system is purported to detect child sexual abuse material, it could be adapted to scan for other text and imagery without the user's knowledge.
Security researchers around the globe have been writing about why this effectively ends privacy at Apple, since every Apple user is now presumed guilty until proven innocent.
Via: WCCFTech