Center for Democracy & Technology: Apple’s backdoor scanning of Photos and Messages threatens users’ security and privacy
The reception to Apple’s backdoor scanning of Photos and Messages is clearly not the warm welcome for which Apple executives may have naively hoped when they couched it in a laughably contrived Think of the Children™ introduction: the Center for Democracy & Technology has plainly stated that the move “will threaten the security and privacy of its users and ultimately imperil secure messaging around the world.”
Apple on Thursday said it will implement a system that checks photos on iPhones in the United States for matches against known images of child sexual abuse before they are uploaded to its iCloud storage service. Some privacy advocates said the system could open the door to monitoring of political speech or other content on iPhones. The Electronic Frontier Foundation (EFF) said in a statement that “Apple is planning to build a backdoor into its data storage system and its messaging system.”
[On Thursday], Apple announced that it is planning to make several changes to its messaging and photo services in the United States which the Center for Democracy & Technology (CDT) believes will threaten the security and privacy of its users and ultimately imperil secure messaging around the world.
Apple describes these new policies as an effort to protect children, which is unquestionably an important and worthy goal. Proliferation of child sexual abuse material (CSAM) is an abhorrent crime against which firm action is required. However, CDT is deeply concerned that Apple’s changes in fact create new risks to children and all users, and mark a significant departure from long-held privacy and security protocols.
“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
The changes Apple announced today create a backdoor, so that iMessage will no longer provide end-to-end encryption. These changes also create a dangerous precedent for allowing one account to essentially conduct surveillance of another. More specifically, Apple will add a feature to iOS that scans images in iMessages sent to and from users if they are on a family account. On these accounts, Apple will conduct machine learning-based “client-side scanning” in an attempt to detect sexually explicit imagery.
When the system detects a suspected “explicit” image sent to or from a child user on a family account, it will warn the user that the image is sensitive and notify them that a notice may be sent to the parent if the young person chooses to send or view the image.
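Apple has not published the classifier or any API for this feature, but the decision flow described above can be sketched. In the hedged Swift sketch below, `classifyAsExplicit`, `screenImage`, and the `parentalNotificationsEnabled` flag are all hypothetical stand-ins for Apple’s unpublished on-device model and notification rules:

```swift
import Foundation

enum ScreeningResult {
    case deliverNormally
    case warn(notifyParentIfViewed: Bool)
}

/// Stub for Apple's on-device machine-learning classifier; the real model
/// and its interface were not made public, so this always returns `false`.
func classifyAsExplicit(_ image: Data) -> Bool {
    false
}

/// Screens an image on a child account: non-flagged images pass through,
/// while suspected explicit images trigger a warning and, if the child
/// chooses to view or send anyway, possibly a notice to the parent account.
func screenImage(_ image: Data,
                 isChildAccount: Bool,
                 parentalNotificationsEnabled: Bool) -> ScreeningResult {
    guard isChildAccount, classifyAsExplicit(image) else {
        return .deliverNormally
    }
    return .warn(notifyParentIfViewed: parentalNotificationsEnabled)
}

// Usage: with the stub classifier, this image is delivered normally.
let outcome = screenImage(Data("photo-bytes".utf8),
                          isChildAccount: true,
                          parentalNotificationsEnabled: true)
print(outcome)
```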
The company also announced changes to its photo storage policies. Apple will store a database of hashes (small strings of data that serve as a fingerprint for an image) of child sexual abuse material (CSAM) on users’ phones. For users who have enabled iCloud photo storage, the operating system will check a user’s photos against the database before uploading them to iCloud. If a given account reaches a pre-set threshold of images that match the database, and a human reviewer agrees that the images depict CSAM, Apple will submit a report to the National Center for Missing and Exploited Children (NCMEC). Apple will also suspend the account, subject to appeal by the account owner.
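To make the threshold mechanism concrete, here is a minimal, hedged Swift sketch. It is not Apple’s implementation: the announced design uses a perceptual “NeuralHash” with cryptographic private set intersection rather than the exact-match SHA-256 used here, the pre-set threshold was not disclosed at announcement time, and `UploadScanner`, `imageHash`, and the threshold value of 3 are all illustrative assumptions.

```swift
import Foundation
import CryptoKit

/// Stand-in for a perceptual hash. Apple's design uses "NeuralHash", which
/// is meant to survive resizing and recompression; SHA-256 matches exact
/// bytes only and is used here purely to keep the sketch runnable.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

struct UploadScanner {
    let knownHashes: Set<String>  // on-device database of known-CSAM hashes
    let threshold: Int            // hypothetical value for the pre-set threshold

    /// Counts photos whose hashes appear in the database and reports whether
    /// the account crosses the threshold that would trigger human review.
    func evaluate(_ photos: [Data]) -> (matches: Int, flagged: Bool) {
        let matches = photos.filter { knownHashes.contains(imageHash($0)) }.count
        return (matches, matches >= threshold)
    }
}

// Demo: one of two uploads matches the database; the account stays below
// the threshold, so no report would be filed.
let knownImage = Data("known-image-bytes".utf8)
let scanner = UploadScanner(knownHashes: [imageHash(knownImage)], threshold: 3)
let (matches, flagged) = scanner.evaluate([knownImage, Data("vacation".utf8)])
print("matches: \(matches), flagged for human review: \(flagged)")
```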
These new practices mean that Apple will no longer be offering fully end-to-end encrypted messaging through iMessage and will be undermining the privacy previously offered for the storage of iPhone users’ photos. The mechanism that will enable Apple to scan images in iMessages is not an alternative to a backdoor — it is a backdoor. Client-side scanning on one “end” of the communication breaks the security of the transmission, and informing a third party (the parent) about the content of the communication undermines its privacy. Organizations around the world have cautioned against client-side scanning because it could be used as a way for governments and companies to police the content of private communications.
The changes to iMessage’s privacy for users under 18 are particularly concerning because there is no guarantee that the parent-child account structure Apple has devised will be used as intended by actual parents and young people. The same tool that Apple intends to be used to fight predators seeking to “groom” potential victims could expose sensitive information about young people’s sexual identities to unsympathetic adults. And machine-learning classifiers for detecting nudity and sexually explicit content are notoriously error-prone; it’s almost certain that Apple’s new tool will mistakenly flag health information, memes, art, and advocacy messages as “explicit” and send alarming notifications to parents that imply their child is sexting.
Nojeim says, “The changes Apple announced are extremely disappointing, given the leadership and commitment to user privacy and communications security it has long demonstrated. Apple’s retreat from providing secure end-to-end encrypted services opens the door to privacy threats for all users, while creating new threats for young people. In particular, LGBTQ youth and children in abusive homes are especially vulnerable to injury and reprisals, including from their parents or guardians, and may inadvertently expose sensitive information about themselves or their friends to adults, with disastrous consequences.”
“Images attached to messages that were previously protected by end-to-end encryption will now be searched routinely using algorithms that have not been revealed to the public. And users who expect privacy in the photos they take and share with their iPhones can no longer have that expectation when those photos are backed up to iCloud. Instead, they should know that Apple will scan those photos,” Nojeim adds.
Via: Center for Democracy & Technology