The access of young people to social platforms, with all the risks this can entail, has been much debated recently.
For example, since Monday 7 November, Instagram has enforced a two-step age-verification procedure in several countries, including Italy. It is a procedure that leaves us with some doubts: one of the two ways a young person can prove they are of age is to upload an identity document, but the other, which strikes us as far less reliable, relies on a technology that estimates the user's age from a short selfie video recorded and sent to Meta.
Apple is now also entering the very delicate field of child protection, introducing a new iMessage feature, soon to be available in Italy, that can recognize nude images so that children do not see them. Let's find out what this tool is and how it works.
Apple and child protection on iMessage
The new service is called “Communication Safety in Messages”; in Italy it will be offered under a literal translation of that name.
It will arrive in Italy shortly, but it is already present in beta 2 of iOS 16.2, the update due for release in mid-December.
The feature is already active in several countries. It arrived in the United States last April, then landed in other English-speaking countries, as well as in France and Germany. Now it is the turn of Italy and Spain. But how does Communication Safety in Messages work?
How Apple’s Child Protection Works
Apple, as we said, has introduced a protection to shield minors from viewing explicit images.
To activate it, you will first need to set up parental controls. The feature sits under Screen Time and will be available for all accounts in the family group that are registered as belonging to minors.
We remind you that in Italy the creation of an independent Apple ID is reserved for those over 14 years old.
Once the service is activated on a child's phone, Communication Safety in Messages will recognize nude images that the child is trying to share or is receiving.
Machine-learning technology intercepts the image while safeguarding privacy: it analyzes the content on the device itself, without transferring it to any external server, and thus respects iMessage's end-to-end encryption.
Blurred images and warning messages
Nude images, incoming or outgoing, will appear blurred, accompanied by messages intended to educate the child. Three examples: “It's not your fault, but nude photos and videos can be used to hurt you.” “The person who appears here may not want to be seen; this photo may have been shared without their permission.” “Don't share anything you don't want to. Talk to someone you trust if you feel pressured.”
After that, the child has several options: continue anyway and send or receive the image; stop; block the contact (in the case of a received image); or notify an adult.
There is no automatic forwarding of the explicit image to parents.
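The flow described above can be modeled as a short sketch. To be clear, this is purely illustrative and not Apple's implementation: the function name `classify_nudity` is a hypothetical stand-in for the opaque on-device machine-learning model, and the warning and option strings simply mirror the examples quoted earlier.

```python
from dataclasses import dataclass, field


@dataclass
class ScreeningResult:
    """Outcome of screening one image on the child's device."""
    blurred: bool
    warnings: list = field(default_factory=list)
    options: list = field(default_factory=list)


def classify_nudity(image_bytes: bytes) -> bool:
    """Stub for the on-device classifier: the image never leaves the device.

    Placeholder heuristic for this sketch only; the real model is opaque.
    """
    return image_bytes.startswith(b"NUDE")


def screen_incoming_image(image_bytes: bytes) -> ScreeningResult:
    # Benign images pass through untouched.
    if not classify_nudity(image_bytes):
        return ScreeningResult(blurred=False, options=["view"])
    # Flagged images are blurred and paired with educational warnings.
    # The choice stays with the child: nothing is forwarded to parents.
    return ScreeningResult(
        blurred=True,
        warnings=[
            "It's not your fault, but nude photos and videos "
            "can be used to hurt you.",
        ],
        options=["view anyway", "don't view", "block contact", "notify an adult"],
    )
```

The key design point the sketch captures is that classification and blurring happen locally, and every option, including viewing anyway, is left to the minor.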
Doubts
We have no wish to step into the uncomfortable shoes of the moralist, but this procedure too leaves us with doubts, just like the Instagram age check mentioned above.
There are two weaknesses: the lack of direct communication with parents, and the fact that very young users can still choose to send or receive the nude image. The latter effectively nullifies any genuine attempt to stem the phenomenon, because it remains to be seen whether a blurred image and a pseudo-pedagogical message really act as a deterrent, or whether, on the contrary, they inadvertently heighten children's morbid curiosity.
Apple’s first filter attempt (with controversy)
Apple had already introduced a child-protection feature last year, which involved sending a notification to parents.
But that aroused a double controversy: over its limited usefulness on the one hand, and over the violation of privacy on the other. The balance between the effectiveness of such measures and respect for privacy is indeed very thorny.
Of course, the idea of leaving young people the freedom to choose is appealing: but in doing so we treat children as adults, assuming they possess an adult's maturity.
Let's repeat what we said when discussing age verification on Instagram: better than these measures, incomplete by their very nature, would be continuous and non-invasive oversight by parents of their children's virtual “life”.