Snowden against Apple’s CSAM move: “it’s a tragedy”


Apple’s CSAM detection feature is the main topic of the latest editorial by Edward Snowden on Substack, in which the former CIA computer scientist turned journalist calls the strategy a “tragedy”. In the latest installment of his newsletter, Snowden gets straight to the point, saying the solution “will permanently redefine what belongs to you and what belongs to them”.

Snowden against Apple’s new CSAM detection feature

The feature, which is expected to roll out with iOS 15 this fall, is a system that aims to curb the dissemination of child sexual abuse material. Apple will be able to detect sexually explicit images involving children when they are stored in iCloud Photos. The feature will hash user photos uploaded to iCloud and match them against a database of known CSAM (Child Sexual Abuse Material) hashes sourced from at least two different entities. Importantly, unlike existing systems, Apple’s variation performs all processing on the device itself. This, according to Snowden, “will erase the line that divides which devices work for you and which devices work for them.”
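The hash-and-match step described above can be sketched roughly as follows. This is not Apple’s actual protocol (which relies on a perceptual hash called NeuralHash plus cryptographic private set intersection); as a stand-in, this sketch uses a plain SHA-256 digest, and all function names are hypothetical:

```python
# Hedged sketch of on-device hash-and-match screening.
# NOT Apple's real system: SHA-256 stands in for the NeuralHash
# perceptual hash, and there is no private set intersection here.
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of the image contents."""
    return hashlib.sha256(image_bytes).hexdigest()


def build_known_db(known_images: list[bytes]) -> set[str]:
    """Hash database, in practice supplied by at least two child-safety entities."""
    return {image_hash(img) for img in known_images}


def matches_on_device(upload: bytes, known_db: set[str]) -> bool:
    """Check performed on the device for photos being uploaded to iCloud."""
    return image_hash(upload) in known_db


# Usage: only uploads whose hash appears in the known database are flagged.
db = build_known_db([b"known-image-a", b"known-image-b"])
print(matches_on_device(b"known-image-a", db))   # True  (hash is in the database)
print(matches_on_device(b"vacation-photo", db))  # False (unknown image)
```

Note that an exact-digest scheme like this only matches byte-identical files; a perceptual hash is designed to also match resized or re-encoded copies, which is why Apple uses one.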

“Once the precedent is established that it is appropriate even for a ‘pro-privacy’ company like Apple to make products that betray its users and owners, Apple itself will lose all control over how that precedent is applied,” writes Snowden.

He also claims that Apple’s implementation of CSAM detection capabilities has more to do with branding than with child protection or regulatory compliance. In fact, he notes that the feature can be trivially avoided by disabling uploads to iCloud. The idea that Apple is introducing the measure in preparation for end-to-end encryption on iCloud is also ludicrous, according to Snowden. Implementing such encryption would hardly matter, he says, because the iPhone would already have a built-in surveillance capability.

Snowden’s Fears

Like others, Snowden fears that governments will abuse the system by forcing Apple to expand the on-device CSAM function or by requiring it to be always on for all users. “There is no fundamental technological limit to how far the precedent that Apple is creating can be pushed. Which means that the only brake is Apple’s overly flexible corporate policy, something governments understand all too well,” writes Snowden.

Apple, for its part, claims that it will not bow to government demands to expand the system beyond its original mandate. In recent years, the company has positioned itself as a champion of privacy and security, investing in advanced hardware and software capabilities to achieve those goals. Critics argue that the CSAM features, especially photo hash scanning, will not only tarnish that reputation but will pave the way for a new era of digital surveillance.