July 25th, 2024

AOC's Deepfake AI Porn Bill Unanimously Passes the Senate

The Senate passed the DEFIANCE Act, allowing victims of deepfake pornography to sue creators and distributors. The bill aims to provide legal recourse and address psychological harm from such abuse.

The Senate has unanimously passed the DEFIANCE Act, a bipartisan bill aimed at providing legal recourse for victims of deepfake pornography, which involves the creation of non-consensual sexually explicit images using artificial intelligence. Spearheaded by Senators Dick Durbin and Lindsey Graham, along with Representative Alexandria Ocasio-Cortez, the legislation amends the Violence Against Women Act to allow individuals to sue those who produce or distribute such deepfakes if they knew or recklessly disregarded the lack of consent.

Durbin emphasized the need for victims to have legal tools to combat this form of abuse, which current laws do not adequately address. Senate Majority Leader Chuck Schumer highlighted the widespread nature of the issue, noting that deepfake images can severely impact victims' lives. Ocasio-Cortez, who has personal experience with this type of abuse, described deepfakes as a form of digital humiliation akin to physical assault.

The bill, which includes a refined definition of "digital forgery" and provisions for victim compensation, aims to address the psychological harm caused by deepfakes, including anxiety and depression. If passed by the House, the DEFIANCE Act would establish the first federal civil cause of action for deepfake victims, allowing both adults and minors to seek justice. Previous legislative attempts to regulate deepfakes focused on criminal penalties, while this bill emphasizes civil legal recourse.

Related

New York governor signs bill regulating social media algorithms, in a US first

New York's governor has signed laws regulating social media algorithms and children's data. The legislation targets addictive app features, mandates chronological feeds for users under 18, and limits late-night notifications, amid debate over First Amendment rights.

Google's Nonconsensual Explicit Images Problem Is Getting Worse

Google is struggling with the rise of nonconsensual explicit image sharing online. Despite some efforts to help victims remove content, advocates push for stronger measures to protect privacy, citing the company's demonstrated capability in its actions against child sexual abuse material.

Spain sentences 15 schoolchildren over AI-generated naked images

Fifteen Spanish schoolchildren have received probation for creating AI-generated deepfake images of classmates, sparking concerns about technology misuse. They must also attend education on gender equality and responsible technology use, and the families stress the need for societal reflection.

COPIED Act would make removing AI digital watermarks illegal

The COPIED Act aims to protect creators by regulating AI-generated content through authentication standards and legal repercussions for unauthorized use, garnering support from industry groups for transparency and accountability.

Deepfake Porn Prompts Tech Tools and Calls for Regulations

Deepfake pornography has prompted a new protection industry. Startups are developing tools such as visual and facial recognition to combat the issue, while advocates push for legislative changes to safeguard individuals from exploitation.

20 comments
By @svieira - 3 months
The text of the House's version of the bill: https://www.congress.gov/bill/118th-congress/house-bill/7569...

* 10 year statute of limitations

* $150,000 limit on damages plus court costs and attorney fees

The crime:

> The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual, regardless of whether the visual depiction indicates, through a label or some other form of information published with the visual depiction, that the visual depiction is not authentic ... [and]

> (I) the identifiable individual did not consent to such production, disclosure, solicitation, or possession;

> (II) the person knew or recklessly disregarded that the identifiable individual did not consent to such production, disclosure, solicitation, or possession; and

> (III) such production, disclosure, solicitation, or possession is in or affects interstate or foreign commerce or uses any means or facility of interstate or foreign commerce

By @jrm4 - 3 months
So, people should read the bill.

Roughly, my take is: It does feel true that this is hasty and responsive; there's an obvious "demand" for this kind of legislation.

That being said, I don't find the bill itself to be particularly problematic. The actual changes to the law are kind of minimal. "Digital forgery" as they use it, sure, could apply to Photoshops as well.

Big picture here, I have no qualms with this, even if it is more "show" than "substance."

By @codekisser - 3 months
If you read the actual law, it looks like this law pertains to "digital forgery" - computer-generated material that "falsely appears to be authentic". Does this mean that AI-generated porn is OK as long as it is explicitly labelled?

South Carolina is currently in a funny state where it's illegal to distribute non-consensual AI-generated porn of someone, but there aren't any laws against non-consensual revenge porn. Good news is that revenge porn laws have been adopted almost country-wide.

By @jsheard - 3 months
I wonder if CivitAI will stop hosting LoRAs of real people out of fear of liability now. I don't know if they explicitly condone deepfake porn, but they do host NSFW baseline models and LoRAs of real people's faces under the same roof, so it's not hard to put two and two together.
By @techload - 3 months
By @vincnetas - 3 months
"to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual"

So if a deepfake were generated with a distinctive indication that it is an AI-generated forgery, would it be OK? For example, a cyborg hand or palm, with everything else looking like the real person.

By @al_borland - 3 months
> The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake...

How can someone control if they receive a deepfake?

By @dbg31415 - 3 months
Hey, so not trying to troll... I feel like that has to be stated when talking about this stuff.

I get that weaponizing revenge porn someone sent you can be damaging.

But I don't get what this law is trying to do.

Is Hustler v. Falwell still valid?

> In the case, Hustler magazine ran a full-page parody ad against televangelist and political commentator Jerry Falwell Sr., depicting him as an incestuous drunk who had sex with his mother in an outhouse.

https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell

As long as it's labeled "AI Generated" in the corner, does that make it OK?

What if a person wanted to generate porn of Eva Braun & Adolf Hitler -- who knows, just for the shock value... would that be banned? Even if it's a snuff film?

I can think of a bunch of things that aren't full on traditional porn... can I still make a video of Katy Perry eating a banana? Can I show Hugh Jackman making out with John C. McGinley (as Dr. Cox, of course)? Or what if I wanted a video of Elizabeth Hurley doing ASMR in a bikini, or in a burqa?

Falls back to, "What's porn, anyway?" is it still "I know it when I see it?" / "Anything that gives a judge an erection?" Or is it anything involving using AI to draw people? I make a drawing on Civitai... can someone come up and say, "That looks like my great aunt Trudy, take it down now!"

We've had Photoshop forever... I get that AI is maybe (emphasis on maybe) a bit more realistic than Photoshop. But to me, a law like this feels like it'll have unintended and far-reaching consequences for how we use AI to create scenes from our own vivid and unique imaginations.

By @kstrauser - 3 months
Good. I’m not a lawyer but “knowingly” seems to be a central point here. Does an AI image generator “know” it’s being asked to make a nude of Jane Smith? The person asking it certainly does. Does their “artist for hire”?

If this makes it all but illegal for AIs to generate images of people in the general case (e.g. outside a Hollywood studio using it for their own actors per contract), fine. So be it. Nothing of value would be lost.

“Oh no, I can’t add an uncanny valley picture to my blog” is a small price to pay for “I can’t get deepfakes of Taylor Swift anymore”.

I’m usually far on the other side of things like that, like Congress’s endless requests to make cell phones detect nudity in text messages. I guess it’s mainly that I don’t see value in AI generated pictures of humans. Having that capability hasn’t improved anything that I can see.

By @nuz - 3 months
It'll probably move to the lawless Eastern European wild west.
By @throw20240724 - 3 months
> The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography

produce? why should it be illegal to create any forms on my own machine? where is the harm? or is this moral prudence at new level of invasiveness? is it now illegal to imagine someone naked?

distribute? yeah, this is a dick move

receive? so am i in trouble if someone else sends me this? it's now illegal to look up those swift fakes? why?

it seems more capable ai is being used as an excuse to limit our freedoms even more instead of the other way around. is there any system of law based on reason instead of votes?

By @neilv - 3 months
What about other deepfakes?
By @apwell23 - 3 months
This is great. Although I don't quite get the connection to abortion.
By @debacle - 3 months
When does this type of censorship (yes, that's what it is, though maybe a positive kind) run into the first amendment? If I make a deep fake of Trump making love to Lindsay Graham, when does it become protected speech? Do I need a storyline? Does there need to be a moral bent? Does Trump need to say something about Graham "crossing his border?"
By @elmerfud - 3 months
Don't let the pesky First Amendment get in the way of an emotionally charged law. This is literally how all rights get taken away. You find an issue that is emotionally charged, then say some "new doohickey" causes it to now be different than it ever was in the past (fear mongering), talk about how damaging this could possibly be and how we must take extreme measures by limiting freedoms (cast uncertainty), and finally indicate that society will crumble because of it (doubt).

Same playbook regardless of issue. People in power, stirred up by busybodies, have been doing this repeatedly for 100 years. I do find it odd that originally these kinds of things were pushed by religious fundamentalists. Many laws were passed based upon religious fundamentalist ideas that restrict behaviors and restrict speech. The entire liberal push used to be to strip down these restrictions because they fundamentally violated some of the core freedoms we're granted in a free country.

Now it seems the script has flipped. I see more liberals pushing for restrictions of rights and freedoms than I do religious conservatives. No good deed goes unpunished, and no emotional feel-good law will remain unabused by those in power.

So while it's easy to agree with the core idea that deepfake porn is bad, this conveys a new right that has never been conveyed before: the idea that you have ownership of your image and how it is used. Depending on how you twist the words, this could crush satire completely. It can crush freedom of artistic expression. It's ripe for abuse, and because it's so emotionally charged you get a unanimous vote: politicians hoping to gain brownie points while understanding the courts will largely gut this bill. Unfortunately, the people who have their rights violated, the actual enumerated and protected rights, not the emotional feel-good things people think are rights, have no recourse other than spending large amounts of money or hoping the ACLU will take up this fight.

By @sschueller - 3 months
Wouldn't a more effective way be to hold any maker of a porn deep fake financially liable for lost revenue?

Just like if I made a movie using a famous actor's likeness with AI.