August 20th, 2024

Popular AI "nudify" sites sued amid rise in victims globally

San Francisco's city attorney is suing 16 websites for creating non-consensual intimate imagery, seeking fines and shutdowns to protect victims amid rising harassment linked to AI-generated content.


San Francisco's city attorney, David Chiu, has filed a lawsuit against 16 popular websites and apps, often referred to as "nudify" sites, that let users create non-consensual intimate imagery (NCII) of women and girls. The lawsuit alleges these platforms are designed to facilitate the creation of fake nude images without consent, fueling a sharp rise in harassment and exploitation that has affected victims globally, including celebrities and minors. The suit seeks fines of $2,500 for each violation of California consumer protection laws and aims to shut the sites down entirely. Chiu emphasized the need for accountability and for leveraging existing laws against deepfake pornography and related offenses. The lawsuit also calls for restrictions on domain registrars and payment processors to prevent the operators from launching new sites. With over 200 million visits to these sites recorded in the first half of 2024, the city attorney aims to raise awareness of how generative AI technologies are being exploited for sexual abuse.

- San Francisco's city attorney is suing 16 "nudify" sites for creating non-consensual intimate imagery.

- The lawsuit seeks to impose fines and shut down these websites to protect victims.

- There has been a significant increase in harassment and exploitation linked to AI-generated imagery.

- The lawsuit aims to hold site operators accountable under existing laws against deepfake pornography.

- Over 200 million visits to these sites were recorded in the first half of 2024.
