• Empricorn@feddit.nl · 6 months ago

    This is tough. If it were just a sicko who generated the images for himself locally… that's the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM…

    BUT, iirc he was actually distributing the material, and even contacted minors, so… yeah he definitely needed to be arrested.

    But, I’m still torn on the first scenario…

    • kromem@lemmy.world · 6 months ago

      But, I’m still torn on the first scenario…

      To me it comes down to a single question:

      “Does exposure and availability to CSAM for pedophiles correlate with increased or decreased likelihood of harming a child?”

      If there’s a reduction effect from providing an outlet for arousal that isn’t actually harming anyone, that sounds like a pretty big win.

      If there’s a force-multiplier effect, where exposure and availability make it even more of an obsession and focus such that there’s an increased likelihood of harming children, then society should make the AI-generated version illegal too.

        • ricecake@sh.itjust.works · 6 months ago

          How they’ve done it in the past is by tracking the criminal histories of people caught with CSAM or arrested for abuse (or some combination thereof), or by tracking the outcomes of people seeking therapy for pedophilia.

          It’s not perfect due to the sample biases, and the results are quite inconsistent, even amongst similar populations.

  • not_that_guy05@lemmy.world · 6 months ago

    Fuck that guy first of all.

    What makes me think, though, is: what about all that cartoon porn showing cartoon kids? What about hentai showing younger kids? What’s the difference, if it’s all fake and being distributed online as well?

    Not defending him.

  • 0x0001@sh.itjust.works · 6 months ago

    One thing to consider: if this turned out to be accepted, it would make it much harder to prosecute actual CSAM, since people could claim actual images were “AI generated.”

    • theherk@lemmy.world · 6 months ago

      I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren’t equal. In a binary sense they are, but with more precision they’re pretty far apart. I’m not arguing against it, though; I’m just not super clear how I feel about it yet.

      • Corkyskog@sh.itjust.works · 6 months ago

        It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

        • Madison420@lemmy.world · 6 months ago

          This would also outlaw “teen” porn, since those performers are explicitly trying to look more childlike, as well as models who only appear to be minors.

          I get why people think it’s a good thing, but all censorship has to be narrowly tailored to content, lest it be too vague or overly broad.

          • Corkyskog@sh.itjust.works · 6 months ago

            And nothing was lost…

            But in seriousness: as you said, they’re models who are in the industry, verified, etc. It’s not impossible to have a whitelist of actors, and if anything there should be more scrutiny on the unknown “actresses” portraying teenagers…

            • Madison420@lemmy.world · 6 months ago

              Except jobs, dude. You may not like their work, but it’s work. That law ignores verified age; that’s a not-insignificant part of my point…

      • Madison420@lemmy.world · 6 months ago

        So long as the generation doesn’t use actual examples of actual minors as models, there’s nothing technically illegal about having sexual material of what appears to be a child. Prosecutors would then face a mens rea question and a content question: what actually defines, in a visual sense, a child? Could those same criteria equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since she, by all appearances out of context, looks to be a child?

        • Fungah@lemmy.world · 6 months ago

          It is illegal in Canada to have sexual depictions of a child, whether it’s a real image or you’ve just sat down and drawn it yourself. The rationale is that the behavior escalates, and looking at images leads to wanting more.

          It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer, which I feel great about. There’s no legitimate reason to have a sexualized image of a child, whether computer generated, hand drawn, or whatever.

          • Madison420@lemmy.world · 6 months ago

            This article isn’t about Canada, homeboy.

            Also, that theory is not provable and never will be; morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

            Similarly, you didn’t actually offer a counterpoint to any of my points.

          • Madison420@lemmy.world · 6 months ago

            The real images don’t have to be CSAM, just images of children; it could theoretically be trained on those plus legal sexual content and let the AI connect the dots.