AI p0rnography becoming a major issue in Spain

Specter Von Baren

Annoying Green Gadfly
Legacy
Aug 25, 2013
5,632
2,850
118
I don't know, send help!
Country
USA
Gender
Cuttlefish

Any thoughts on how this new technology is affecting the world? I'm not sure what could even be done to limit this kind of thing. Sure, you could stop distribution on some level, but the ease with which one can just cook up their own AI-created porn means this can be done without anyone else ever knowing. Thoughts?
 

Gergar12

Elite Member
Legacy
Apr 24, 2020
3,419
813
118
Country
United States
Illegal and heinous. While I generally side with generative AI, this is not how we should use it.
 

Xprimentyl

Made you look...
Legacy
Aug 13, 2011
6,280
4,560
118
Plano, TX
Country
United States
Gender
Male
No thoughts whatsoever. If the adage "with great power comes great responsibility" rings true, it's been obvious for millennia that humankind are collectively the most irresponsible beings to ever exist. We can't have nice, important, or powerful things without the dregs and cautiously curious immediately finding ways to abuse and misuse them, then the capitalists finding a way to monetize them which emboldens the dregs and cautiously curious.

Once this whole AI buzz started in earnest, the famous quote from Jurassic Park immediately sprang to mind: "Your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should." Well, there you go, AI porn, because of the massive shortage of readily available porn out there already /s.
 

BrawlMan

Lover of beat'em ups.
Legacy
Mar 10, 2016
27,100
11,363
118
Detroit, Michigan
Country
United States of America
Gender
Male
AI porn is a problem in a lot of places, not just Spain, though they seem to have it the worst right now. Fictional characters are one thing, but it's another thing when it's actual people with lives. That includes real-life porn stars too. Yeah, you might want to track down these sick bastards before it gets even worse and put some new laws in place. Stomp those bastards in the groin while you're at it.
 

Bedinsis

Elite Member
Legacy
Escapist +
May 29, 2014
1,456
721
118
Country
Sweden
Once AI becomes a common tool for image generation/manipulation, people will recognize such images as not reflective of reality, and blackmail attempts will be laughed out of the room: they will be seen as the equivalent of taking someone's face and photoshopping it onto some nude model, i.e. not reflective of the subject's character, behaviour or actual appearance.

There might need to be some new laws put in place, though I suspect that as things stand it will mostly fall under regular sexual harassment laws. Or possibly copyright law.
 

Schadrach

Elite Member
Legacy
Mar 20, 2010
2,003
357
88
Country
US
The same way you police underage porn: it didn't just magically spawn out of nowhere; someone intentionally fed that AI model those girls' faces, generated the images, and circulated them at the school.
The way those kinds of apps work is by feeding them a full photo of the person, not just a face shot; the AI identifies the clothes and renders a nude body fitting the shape of the clothing. If you've seen the DALL-E tool that lets you add/edit a thing in an existing image, it's basically the same idea and technique, except the thing it's adding is always "naked female body" and the place it's adding it is where the clothed body was, as opposed to adding a flamingo inner tube to a photo of a pool or whatever.
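Mechanically, inpainting-style edits like the ones described above come down to generating new pixels only inside a masked region and compositing them back over the untouched original. As a rough illustration only (a toy numpy sketch of the masking idea, not any real app's code; `masked_edit` is a made-up helper name):

```python
import numpy as np

def masked_edit(image: np.ndarray, mask: np.ndarray, generated: np.ndarray) -> np.ndarray:
    """Composite a generated patch into an image, touching only the masked region.

    image:     H x W x 3 array, the original photo
    mask:      H x W boolean array, True where the edit should happen
    generated: H x W x 3 array, the model's output for the masked region
    """
    result = image.copy()
    result[mask] = generated[mask]   # pixels outside the mask stay untouched
    return result

# Toy demonstration with 4x4 "images": black original, all-white generated patch
image = np.zeros((4, 4, 3), dtype=np.uint8)
generated = np.full((4, 4, 3), 255, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                # edit only the centre 2x2 block

out = masked_edit(image, mask, generated)
```

The real systems replace `generated` with a diffusion model's output conditioned on the unmasked context, which is why the fakes line up so convincingly with the rest of the photo.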

A potentially useful consequence of this is that if she remembers taking a photo similar to it clothed, then she knows who might have had a copy of the original (assuming it wasn't like a social media post or something that literally everyone can see) and it might hypothetically be traced from there. Also, according to the article it had a watermark from the app used, which could be used to trace it down to a user on that site and identify the creator.

But these things aren't that new - from a quick Googling the first one launched in 2012 and a bunch more launched in the 2018-2022 range. Given what massive assholes teenagers tend to be, it amazes me it took this long for this to happen.
 

CM156

Resident Reactionary
Legacy
May 6, 2020
1,133
1,213
118
Country
United States
Gender
White Male
In a purely US law context:
You would probably be able to criminalize people producing AI porn of people.
But probably not ban their possession of it, given the precedent set by Stanley v. Georgia, 394 U.S. 557 (1969). And that's going to be the major problem.
That is, unless a state successfully argues that possession of such material is closer to CSAM than to conventional porn, and as such is unprotected, per Osborne v. Ohio, 495 U.S. 103 (1990).

Also, I'm always amazed to see examples of us, as humans, developing technologies that we don't have the moral restraint to not misuse.
 

Dirty Hipsters

This is how we praise the sun!
Legacy
Feb 7, 2011
7,940
2,305
118
Country
'Merica
Gender
3 children in a trench coat
Did anyone think this wasn't going to happen?

People have been photoshopping people's heads onto porn stars' bodies since photoshop was invented. This is the same thing except you don't have to learn how to do anything or have any artistic talent.
 

Terminal Blue

Elite Member
Legacy
Feb 18, 2010
3,912
1,777
118
Country
United Kingdom
Once AI becomes a common tool for image generation/manipulation, people will recognize such images as not reflective of reality, and blackmail attempts will be laughed out of the room: they will be seen as the equivalent of taking someone's face and photoshopping it onto some nude model, i.e. not reflective of the subject's character, behaviour or actual appearance.
I suspect that is overly optimistic.

The fact that a specialist could go over an image pixel by pixel and identify it as a fake does not mean the average person is going to be able to distinguish between generated and real images, and to a certain extent it doesn't necessarily matter that much whether they are real or not. If the images are realistic, then they are essentially replicating the experience of non-consensual voyeurism. It is still humiliating and violating because, even if the image isn't actually of your body, it's still encouraging people to think about your body in the same terms.

The potential problem with AI in this case is that it makes what was previously achievable only with an enormous amount of skill and artistry available to anyone. A lot of photoshop edits look crap because they were made by unskilled users. Generative AI is already pretty good at some things, and is likely to improve dramatically as the technology develops.

There might need to be some new laws put in place, though I suspect that as things stand it will mostly fall under regular sexual harassment laws. Or possibly copyright law.
In the UK, I believe this would fall under existing laws around child pornography, as UK laws generally don't distinguish between photographs or film of real events and artificially created images. I suspect we will see that become the norm and, while admitting it causes me considerable mental anguish, I feel like TERF island may actually have been ahead of the curve on that one.
 

Thaluikhain

Elite Member
Legacy
Jan 16, 2010
18,695
3,594
118
Where I live, non-consensual pornography, real or fake, has been illegal since 2017.

Does make me wonder about, say, certain caricatures of political figures. During WW2, people made cartoons of Mussolini and Hitler having gay sex, which would now be banned. Not that we really need that sort of thing, but people could get in trouble (in theory) if they had some for historical reasons.
 

Ag3ma

Elite Member
Jan 4, 2023
2,546
2,196
118
I suspect that is overly optimistic.

The fact that a specialist could go over an image pixel by pixel and identify it as a fake does not mean the average person is going to be able to distinguish between generated and real images, and to a certain extent it doesn't necessarily matter that much whether they are real or not. If the images are realistic, then they are essentially replicating the experience of non-consensual voyeurism. It is still humiliating and violating because, even if the image isn't actually of your body, it's still encouraging people to think about your body in the same terms.
I'm inclined to agree. The enhanced ability to harass or humiliate individuals with highly accurate AI-generated porn is an extremely serious concern.

The possibility also exists that, if left untackled, in the long term it may desensitise the vast majority of people: the knowledge that anyone can whip up a porn picture of another person will lead to an assumption that such pictures are fake, and reduce the general sense of shame people may experience as victims. Although that is not to downplay the likelihood that a significant number of people will still find it very distressing.
 

McElroy

Elite Member
Legacy
Apr 3, 2013
4,582
377
88
Finland
Where I live, non-consensual pornography, real or fake, has been illegal since 2017.

Does make me wonder about, say, certain caricatures of political figures. During WW2, people made cartoons of Mussolini and Hitler having gay sex, which would now be banned. Not that we really need that sort of thing, but people could get in trouble (in theory) if they had some for historical reasons.
Speaking for Finnish law, I think our slander/libel laws are broad enough to cover non-consensual fake porn: a lie or insinuation that has a real possibility of causing harm.

It's funny, actually. We just now have a case going into the appeal process in which a one-sided depiction of real events made some slander charges stick, because apparently it's not okay to publicly say you've left an abusive relationship if people knew the partner.
 

Piscian

Elite Member
Apr 28, 2020
1,701
1,739
118
Country
United States
I highly suspect this will go the way these things usually do. Eventually it will affect people in actual power, and they will lobby Congress to create another DMCA-like act that heavily restricts and punishes companies who design AI tools, to limit how they can be used, and the FBI will go after internet bodies who host or distribute. It's likely gonna take somebody wealthy to put pressure on them, or somebody with actual AI skills affecting an election outcome by using these tools to discredit a Republican leader in Congress.

How successful it will be is hard to say. I work in... call it "internet engineering", and I can confirm we have the tools, but MSOs really don't like being responsible for this kind of stuff. We do it when we get pressured through stuff like DMCA and CALEA. We can easily trace by socket and content to "get" anybody. The liability is tricky though.

TL;DR: in the US at least, the FBI and Congress will eventually use their power to crack down on tool and content distributors to limit the blowback once it starts affecting the rich and powerful. It's not quite as "out of control" as it may seem on the surface.
 

Specter Von Baren

Annoying Green Gadfly
Legacy
Aug 25, 2013
5,632
2,850
118
I don't know, send help!
Country
USA
Gender
Cuttlefish
I highly suspect this will go the way these things usually do. Eventually it will affect people in actual power, and they will lobby Congress to create another DMCA-like act that heavily restricts and punishes companies who design AI tools, to limit how they can be used, and the FBI will go after internet bodies who host or distribute. It's likely gonna take somebody wealthy to put pressure on them, or somebody with actual AI skills affecting an election outcome by using these tools to discredit a Republican leader in Congress.

How successful it will be is hard to say. I work in... call it "internet engineering", and I can confirm we have the tools, but MSOs really don't like being responsible for this kind of stuff. We do it when we get pressured through stuff like DMCA and CALEA. We can easily trace by socket and content to "get" anybody. The liability is tricky though.

TL;DR: in the US at least, the FBI and Congress will eventually use their power to crack down on tool and content distributors to limit the blowback once it starts affecting the rich and powerful. It's not quite as "out of control" as it may seem on the surface.
I think the unique aspect of this is that you don't really NEED a content distributor for this. Like others have said, photoshop still requires a degree of skill, while AI software has a much lower barrier of entry.
 

Piscian

Elite Member
Apr 28, 2020
1,701
1,739
118
Country
United States
I think the unique aspect of this is that you don't really NEED a content distributor for this. Like others have said, photoshop still requires a degree of skill, while AI software has a much lower barrier of entry.
By content I mean the end product: deepfake host sites and individual users who traffic in illegal content. It would be similar to how the government went after Team Xecuter and ROM sites, but this time it'll be deepfakes. The irony is we'll use AI to track that stuff down. So hopefully security AI and pr0n AI won't team up against us and input the launch codes. Can't really stop kids from deepfaking girls at school, but that stuff ends up coming to light anyway, with severe consequences.
 