Month: August 2021

Center for Democracy & Technology: Apple’s backdoor scanning of Photos and Messages threatens users’ security and privacy

The reception to Apple’s backdoor scanning of Photos and Messages is clearly not the warm welcome for which Apple executives may have naively hoped when they couched it in a laughably contrived Think of the Children™ introduction: the Center for Democracy & Technology has plainly stated that the move “will threaten the security and privacy of its users and ultimately imperil secure messaging around the world.”

Apple on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage services to ensure the upload does not match known images of child sexual abuse, but some privacy advocates said the system could open the door to monitoring of political speech or other content on iPhones. The Electronic Frontier Foundation (EFF) said in a statement that, “Apple is planning to build a backdoor into its data storage system and its messaging system.”

[On Thursday], Apple announced that it is planning to make several changes to its messaging and photo services in the United States, which the Center for Democracy & Technology (CDT) believes will threaten the security and privacy of its users and ultimately imperil secure messaging around the world.

Apple describes these new policies as an effort to protect children, which is unquestionably an important and worthy goal. Proliferation of child sexual abuse material (CSAM) is an abhorrent crime against which firm action is required. However, CDT is deeply concerned that Apple’s changes in fact create new risks to children and all users, and mark a significant departure from long-held privacy and security protocols.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

The changes Apple announced today create a backdoor, so that iMessage will no longer provide end-to-end encryption. These changes also create a dangerous precedent for allowing one account to essentially conduct surveillance of another. More specifically, Apple will add a feature to iOS that scans images in iMessages sent to and from users if they are on a family account. On these accounts, Apple will conduct machine learning-based “client-side scanning” in an attempt to detect sexually explicit imagery.

When the system detects a suspected “explicit” image to or from a child user on a family account, it will warn the user that the image is sensitive and notify them that a notice may be sent to the parent if the young person chooses to send or view the image.
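
For concreteness, the warn-then-notify flow CDT describes can be read as a simple gate in front of image delivery. The Swift sketch below is purely illustrative and uses hypothetical function and parameter names (deliverImage, notifyParent, and so on); Apple has not published how its implementation is actually structured.

```swift
// A loose sketch of the warn-then-notify gating described above. The
// classifier result, account flags, and notification hook are hypothetical
// placeholders, not Apple's implementation.
func deliverImage(isFlaggedExplicit: Bool,
                  recipientIsChildOnFamilyAccount: Bool,
                  childConfirmsViewing: () -> Bool,
                  notifyParent: () -> Void) -> Bool {
    // Ordinary delivery when nothing is flagged or the recipient is not a child account.
    guard isFlaggedExplicit && recipientIsChildOnFamilyAccount else { return true }
    // Warn the child; the image is shown only if they choose to proceed.
    guard childConfirmsViewing() else { return false }
    // If the child proceeds, a notice may be sent to the parent.
    notifyParent()
    return true
}
```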

The company also announced changes to its photo storing policies. Apple will store a database of hashes (small strings of data that serve as a fingerprint for an image) of child sexual abuse material (CSAM) on users’ phones. For users that have enabled iCloud photo storage, the operating system will check a user’s photos against the database before uploading them to iCloud. If a given account reaches a pre-set threshold of images that match the database, and a human reviewer agrees that the images depict CSAM, Apple will submit a report to the National Center for Missing and Exploited Children (NCMEC). Apple will also suspend the account, subject to appeal by the account owner.
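
In conceptual terms, the photo-scanning pipeline described here is a fingerprint-and-threshold check. The Swift sketch below is a rough illustration only: it substitutes an ordinary SHA-256 digest for Apple’s perceptual NeuralHash and a plain set lookup for the private set intersection protocol Apple describes, so it captures the threshold idea rather than the real cryptographic design, and every name in it is made up.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only: placeholder fingerprinting and matching, not
// Apple's NeuralHash / private-set-intersection system.
struct UploadScanner {
    let knownHashes: Set<String>   // on-device database of known-CSAM fingerprints (placeholder)
    let reportThreshold: Int       // number of matches required before human review

    // Placeholder fingerprint; the real system uses a perceptual hash so that
    // visually identical images match, which a cryptographic hash does not provide.
    func fingerprint(of photo: Data) -> String {
        SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    }

    // True if the account crosses the threshold and would be escalated to a
    // human reviewer before any report to NCMEC.
    func exceedsThreshold(photosQueuedForUpload: [Data]) -> Bool {
        let matches = photosQueuedForUpload
            .map { fingerprint(of: $0) }
            .filter { knownHashes.contains($0) }
            .count
        return matches >= reportThreshold
    }
}
```

The point the paragraph above makes is that no single match triggers anything on its own; only the aggregate count crossing the pre-set threshold sends an account to human review.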

These new practices mean that Apple will no longer be offering fully end-to-end encrypted messaging through iMessage and will be undermining the privacy previously offered for the storage of iPhone users’ photos. The mechanism that will enable Apple to scan images in iMessages is not an alternative to a backdoor — it is a backdoor. Client-side scanning on one “end” of the communication breaks the security of the transmission, and informing a third-party (the parent) about the content of the communication undermines its privacy. Organizations around the world have cautioned against client-side scanning because it could be used as a way for governments and companies to police the content of private communications.

The changes to iMessage’s privacy for users under 18 are particularly concerning because there is no guarantee that the parent-child account structure Apple has devised will be used as intended by actual parents and young people. The same tool that Apple intends to be used to fight predators seeking to “groom” potential victims could expose sensitive information about young people’s sexual identities to unsympathetic adults. And machine-learning classifiers for detecting nudity and sexually explicit content are notoriously error-prone; it’s almost certain that Apple’s new tool will mistakenly flag health information, memes, art, and advocacy messages as “explicit” and send alarming notifications to parents that imply their child is sexting.

Nojeim says, “The changes Apple announced are extremely disappointing, given the leadership and commitment to user privacy and communications security it has long demonstrated. Apple’s retreat from providing secure end-to-end encrypted services opens the door to privacy threats for all users, while creating new threats for young people. In particular, LGBTQ youth and children in abusive homes are especially vulnerable to injury and reprisals, including from their parents or guardians, and may inadvertently expose sensitive information about themselves or their friends to adults, with disastrous consequences.”

“Images attached to messages that were previously protected by end-to-end encryption will now be searched routinely using algorithms that have not been revealed to the public. And users who expect privacy in the photos they take and share with their iPhones can no longer have that expectation when those photos are backed up to iCloud. Instead, they should know that Apple will scan those photos,” Nojeim adds.

Via: Center for Democracy & Technology

Apple announces service to auto-scan iCloud images for child porn

Image by: Apple, Inc.

Yesterday, Apple announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).

Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers had already begun expressing concerns about how Apple’s new image-scanning protocol could be used in the future, as noted by the Financial Times.

To read the rest of the article, click here.

Apple announces updates to Apple Pay

Apple sent an email to Apple Pay customers announcing updates to the Apple Cash service.

They are as follows:

We’ve made a couple of updates to Apple Cash.

  • Starting today, you can use Instant Transfer with Mastercard debit cards in addition to Visa debit cards. Instant Transfer is the fastest way to transfer money from your Apple Cash balance to your bank account.
  • Beginning August 26, 2021, the cost to make an Instant Transfer will change to 1.5% of the transfer amount, with a minimum fee of $0.25 and a maximum fee of $15. The Apple Cash Terms & Conditions have been updated effective today, August 5, 2021, to reflect the new pricing.
  • You can also transfer money to your bank account using ACH and receive it within 1 to 3 business days with no fee.

To make an Instant Transfer, go to your Apple Cash card in the Wallet app and tap the (...) icon. Tap Transfer to Bank, enter an amount, and select Instant Transfer.
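
For reference, the new Instant Transfer pricing is just a clamped percentage. The Swift snippet below is a small illustrative calculation of the fee described in Apple’s email (1.5% of the amount, $0.25 minimum, $15 maximum); the function name and use of Decimal are assumptions made for the example, not anything Apple has published.

```swift
import Foundation

// Illustrative fee calculation: 1.5% of the transfer amount, clamped to a
// $0.25 minimum and a $15 maximum, per the pricing effective August 26, 2021.
func instantTransferFee(for amount: Decimal) -> Decimal {
    let rate = Decimal(string: "0.015")!       // 1.5%
    let minimumFee = Decimal(string: "0.25")!  // $0.25 floor
    let maximumFee = Decimal(15)               // $15 cap
    return min(max(amount * rate, minimumFee), maximumFee)
}

// A $10 transfer pays the $0.25 minimum, $100 pays $1.50, and $2,000 hits the cap.
print(instantTransferFee(for: 10))    // 0.25
print(instantTransferFee(for: 100))   // 1.5
print(instantTransferFee(for: 2000))  // 15
```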
