Never has the adage that no technology is good or bad in itself, but only in how it is used, been more fitting than for the topic we are dealing with today.
We have already discussed deepfakes in several articles, illustrating, so to speak, their vices and virtues.
First of all, let us briefly recall what this is about. In short, a deepfake is based on artificial intelligence that "feeds" on images and videos of a specific person and is in turn capable of producing extremely realistic videos of them.
As we said, then, everything depends on the use that is made of the deepfake.
The deepfake and its possible uses
For example, we wrote that deepfakes could make dubbed films far more convincing, by matching the actors' lip movements to the sentences spoken by the voice actors.
A controversy also arose around the actor Bruce Willis: did Willis, no longer able to act after being struck by aphasia, transfer the rights to his image to a deepfake company or not?
But if we want an example of a perverse use of deepfakes, we need only recall the early days of the Russian-Ukrainian conflict, when AI was used to make a fake Zelensky announce his country's surrender.
FaceMega's deepfake porn
A deepfake, however you look at it, artfully manipulates someone's likeness.
So the temptation to use it in a less than chivalrous way is, we imagine, strong.
In this sense, what happened in recent days with the FaceMega app is emblematic: it involved actresses such as Scarlett Johansson and Emma Watson (Harry Potter's Hermione, no less).
Let's find out what happened, first explaining what FaceMega is.
What is FaceMega
FaceMega is an app (which, at the time of writing, is not available on the app stores) along the lines of FaceApp.
That is, an app that uses artificial intelligence to generate faces and bodies inspired, let's say, by those of real people. In short, FaceMega uses deepfake technology casually.
Too casually, judging by the advertising campaign launched in recent days.
In fact, 127 videos of Emma Watson and 74 of Scarlett Johansson, in provocative or explicit poses, were circulated on Facebook and Instagram thanks to FaceMega.
To be clear: these were the faces of the two actresses, manipulated with AI and superimposed on the bodies of porn performers.
Emma Watson, moreover, had already been the victim of a deepfake porn video in 2020.
Not content with such cheek, the app's managers even circulated a short tutorial on Meta's social networks showing how to create similar "hybrids".
And to think that Zuckerberg’s company announced at the beginning of 2020 that it would block the spread of manipulated content.
FaceMega, however, allows users (for 8 dollars a week) to freely create thoroughly convincing videos by swapping anyone's face onto anyone's body. It goes without saying what risks such a trade in deepfake porn videos entails.
Non-consensual pornography
Unless such tools are blocked or regulated, deepfake porn will fuel reckless non-consensual pornography.
This could affect unsuspecting, psychologically fragile people, and it could open a disturbing new chapter of gender-based violence in the digital age.
It goes without saying that this technology, if left unchecked, could target not only celebrities but also, for example, minors.
Artificial intelligence is not a game to be watched passively, nor a challenge that spurs us to always push a little further. We need to regulate it; legal constraints must remind us of the ethical ones. In short, what appears to be a harmless pastime (deepfake porn) can easily degenerate into cyberbullying, non-consensual pornography, and child pornography.
Deepfake porn in numbers
Just to frame the problem in numerical terms: in 2019, DeepTrace found that 96% of deepfake videos circulating on the Internet were pornographic.
The number of deepfake porn videos has doubled every year since 2018, and February 2023 saw the highest monthly number of deepfake porn videos produced and uploaded to the Internet.
We need, we repeat, clear and strict laws on the subject, because these figures show that hoping for a collective change of heart would be truly naive.