Apple responds to concerns about CSAM detection and message scanning

Apple has published an FAQ titled “Expanded Protections for Children” that aims to allay users’ privacy concerns about the new CSAM detection in iCloud Photos and the communication safety feature in Messages that the company announced last week.

Apple and the FAQ on CSAM and message scanning

“Since we announced these features, many stakeholders, including children’s privacy and safety organizations, have expressed their support for this new solution. Some have asked us questions,” reads the FAQ. “This document serves to address these questions and provide more clarity and transparency in the process.”

Some discussions have blurred the distinction between the two features, and Apple takes care in the document to differentiate them. The company explains that communication safety in Messages “only works on images sent or received in the Messages app for child accounts set up in Family Sharing,” while CSAM detection in iCloud Photos “only has an impact on users who have chosen to use iCloud Photos to store their photos. There is no impact on any other data on the device.”

The FAQ makes clear that the two features are not the same thing and do not use the same technology.

Some information from the FAQ

Apple explains that communication safety in Messages is designed to give parents and children additional tools to help protect children from sending and receiving sexually explicit images in the Messages app. It only works on images sent or received in the Messages app for child accounts set up in Family Sharing.

The tool analyzes images on the device, so it does not change the privacy guarantees of Messages. When a child’s account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured. A message will tell them it’s okay if they don’t want to view or send the photo. As an added precaution, young children may also be told that, to keep them safe, their parents will get a message if they do view it.
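To make this flow concrete, here is a minimal sketch in Swift of the kind of on-device decision the article describes. Everything in it is an assumption for illustration: the type names, the explicit-content score, the 0.9 threshold, and the under-13 rule for parental notifications are invented, not Apple’s actual API or policy values.

```swift
// Hypothetical sketch of the on-device Messages flow described above.
// Not Apple's API: ChildAccount, ImageAction, and the thresholds are
// illustrative assumptions.
import Foundation

struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool
}

enum ImageAction {
    case showNormally
    case blurWithWarning(notifyParents: Bool)
}

/// Decide how to present an image. The analysis producing `score`
/// happens on the device, so the image itself never leaves it for
/// this check.
func action(forExplicitScore score: Double,
            account: ChildAccount,
            threshold: Double = 0.9) -> ImageAction {
    guard score >= threshold else { return .showNormally }
    // Younger children may additionally trigger a parental
    // notification if they choose to view the image anyway.
    let notify = account.age < 13 && account.parentalNotificationsEnabled
    return .blurWithWarning(notifyParents: notify)
}
```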

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM out of iCloud Photos without providing Apple with information about any photos other than those that match known CSAM images.

CSAM images are illegal to possess in most countries, including the United States. This feature affects only users who have chosen to use iCloud Photos to store their photos; it has no impact on users who have not, and no impact on any other data on the device. It does not apply to Messages.
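The iCloud Photos feature works differently from the Messages one: rather than analyzing image content, it compares a derived hash of each photo against a database of known CSAM hashes. The toy Swift sketch below shows only this matching-and-threshold idea; Apple’s real system uses NeuralHash with a blinded on-device database and threshold secret sharing, none of which is modeled here, and the names and the match threshold are assumptions.

```swift
// Toy illustration of hash-database matching -- not Apple's
// implementation. The real database is blinded so the device cannot
// read it; the names and numbers here are illustrative assumptions.
import Foundation

typealias PerceptualHash = String  // stand-in for a NeuralHash-style value

/// Simplified known-CSAM hash database shipped to the device.
let knownCSAMHashes: Set<PerceptualHash> = []

/// Hypothetical number of matches required before human review.
let matchThreshold = 30

/// Count how many photos queued for iCloud upload match a known hash.
/// Photos that do not match contribute no information at all.
func matchCount(for photoHashes: [PerceptualHash]) -> Int {
    photoHashes.filter { knownCSAMHashes.contains($0) }.count
}

/// Only accounts whose match count crosses the threshold are flagged;
/// below it, nothing is learned about the user's library.
func shouldFlagAccount(photoHashes: [PerceptualHash]) -> Bool {
    matchCount(for: photoHashes) >= matchThreshold
}
```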

For more information, you can consult the PDF published by Apple as well as the official website.
