NGO Humans vs Artificial Intelligence

NUNC AUT NUNQUAM.

24/9/23

The danger of Artificial Intelligence that undresses girls.

ClothOff generates the fake image without any kind of safeguard

(a notice such as "this image is fake," for example).

It merely places a watermark with the company logo on the fake image it generates.

The Spanish Police opened an investigation after images of girls, altered with artificial intelligence to remove their clothes, circulated in a town in the south of the country.

A group of mothers from Almendralejo, in the Extremadura region, reported that their daughters had received images of themselves in which they appeared naked.

One of the mothers, Miriam, contacted Dr. Javier Miglino, Founder of the first NGO on the planet already working worldwide against the abuses of A.I. The NGO Humans vs Artificial Intelligence interviewed the desperate mother via Zoom; she explained the situation, which, being the first of its kind, is one that no one can yet offer a solution to.

The testimonies of the victims:

The affected girls say that someone used an application to create images of them in which they appeared to be naked.

The young perpetrators don't understand the damage they have caused.

"Although the Police identified some of the young people who could be involved in the production of the images, the age of the authors (12 to 15 years) makes it impossible to prosecute them criminally in Spain. However, the damage they have done to the girls , especially taking into account that they are just young ladies between 12 and 14 years old and all this happens in a small town where everyone knows each other. The mothers ask us for help because they don't know what to do or what they are facing. As I write this statement they are emails arriving stating that the same thing has happened in Argentina, France, South Korea and Mexico," said Dr. Javier Miglino, Expert in Human Rights and Ethics and Founder of the NGO Humans vs Artificial Intelligence.

We call for solidarity:

"Those who have access to staff who work on the most well-known pornographic sites in the world, such as Pornhub, Xhamster and responses in this case and on top of that, there would be similar situations in other parts of Europe, Latin America and Asia," Miglino concluded.

The ClothOff company 'washes its hands' of the use of the images:

Part of the International Multidisciplinary Team of the NGO Humans vs Artificial Intelligence contacted ClothOff, the provider of the software. The company told us that it is not responsible for the use made of the images it generates (such as extorting someone). The Internet user "is responsible for not using the application for illegal or unauthorized purposes," the website says, explicitly declining legal responsibility for what can be done with its technology.

ClothOff declares that it does not retain images:

When asked what they do with the images that are uploaded or built on the page before the content is generated, they told us that they do not store any type of data. This implies that once the input image has been used by the AI, it is deleted from their database.

Alert in Buenos Aires, Argentina, over a young man who undressed girls with ClothOff.

The ClothOff software that undresses girls is a real danger.