He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
The problem with AI CSAM generation is that the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.
So is it right to be using images of real children to train these AI? You’d be hard-pressed to find someone who thinks that’s okay.
The images were created using photos of real children, even if those photos weren't themselves CSAM (which can't be guaranteed). So the victims are the children whose photos were used to generate CSAM.
Let’s do a thought experiment, and I’d like you to tell me at what point a victim was introduced:
1. I legally acquire pictures of a child, fully clothed and everything
2. I draw a picture based on those legal pictures, but the subject is nude or doing sexually explicit things
3. I keep the picture for my own personal use and don’t distribute it
Or with AI:
1. I legally acquire pictures of children, fully clothed and everything
2. I legally acquire pictures of nude adults, some doing sexually explicit things
3. I train an AI on a mix of 1 & 2
4. I generate images of nude children, some of them doing sexually explicit things
5. I keep the pictures for my own personal use and don’t distribute any of them
6. I distribute my model, using the right to distribute from the legal acquisition of those images
At what point did my actions victimize someone?
If I distributed those images and those images resemble a real person, then that real person is potentially a victim.
I will say that someone who does this is creepy and I don’t want them anywhere near children (especially mine, and yes, I have kids), but I don’t think it should be illegal, provided the source material is legal. As soon as I distribute it, though, there absolutely could be a victim. Being creepy shouldn’t be a crime.
I think it should be illegal to make porn of a person without their permission, regardless of whether it was shared or not. Imagine the person it is based off of finds out someone is doing that. That causes mental strain on the person. Just like how revenge porn doesn’t actively harm a person but causes mental strain (both the initial upload and continued use of it). For scenario 1, it would be at step 2, when the porn of the person is made. For scenario 2, it would be a mix between steps 3 and 4.
Thanks for sharing! I’m going to disagree with pretty much everything, so please stop reading here if you’re not interested.
Imagine the person it is based off of finds out someone is doing that. That causes mental strain on the person…
Sure, and there are plenty of things that can cause mental strain, but that doesn’t make those things illegal. For example:
public display of affection - could cause mental strain for people who recently broke up or haven’t found love
drug use - recovering addicts could experience mental strain
finding out someone is masturbating to a picture of you
And so on. Those things aren’t illegal, but someone could experience mental strain from them. Experiencing that doesn’t make you a victim, it just means you experience it.
revenge porn doesn’t actively harm a person but causes mental strain
Revenge porn damages someone’s reputation, at the very least, which is a large part of why it’s illegal.
Someone keeping those images for private use doesn’t cause harm, therefore it shouldn’t be illegal.
Someone doing something creepy for their own use should never be illegal.
Thanks for sharing! I’m going to disagree with pretty much everything, so please stop reading here if you’re not interested.
I’m not one to stop reading because of disagreement. You’re arguing in good faith, and that’s all that matters imo
Revenge porn damages someone’s reputation, at the very least, which is a large part of why it’s illegal.
Someone keeping those images for private use doesn’t cause harm, therefore it shouldn’t be illegal.
I believe consent is a larger factor. The person who made it consented to that person seeing their photos/videos, but did not consent to them being shared.
That’s why it’s not illegal to call someone a slut (even though that also damages reputation)
Someone doing something creepy for their own use should never be illegal.
What if the recording was made without the person’s consent? Say someone records their one-night stand without the other person’s knowledge, but doesn’t share it with anyone. Should that be illegal?
Consent is certainly important, but they don’t need your consent if the image was obtained legally and thus subject to fair use, or if you gave them permission in the past.
That’s why it’s not illegal to call someone a slut (even though that also damages reputation)
It can be, if that constitutes defamation or libel. A passing statement wouldn’t, but a post on a popular website absolutely could. It all comes down to the damages that (false) statement caused.
What if the recording was made without the person’s consent? Say someone records their one-night stand without the other person’s knowledge, but doesn’t share it with anyone. Should that be illegal?
That depends on whether there was a reasonable expectation of privacy. If it’s in public, there’s no reasonable expectation of privacy.
In general, I’d say intimacy likely occurs somewhere with a reasonable expectation of privacy, at which point it would come down to consent (whether implied or explicit).
Exactly. If you can’t name a victim, it shouldn’t be illegal.
The problem with AI CSAM generation is that the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.
So is it right to be using images of real children to train these AI? You’d be hard-pressed to find someone who thinks that’s okay.
If the images were generated from CSAM, then there’s a victim. If they weren’t, there’s no victim.