• Madison420@lemmy.world
    7 months ago

    So long as the generation is done without actual models who are actual minors, there’s nothing technically illegal about having sexual material of what appears to be a child. They would then have a mens rea question and a content question: what actually defines, in a visual sense, a child? Could those same traits equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since by all appearances, out of context, she looks like a child?

    • [email protected]@lemmy.federate.cc
      7 months ago

      The problem is that the only way to train an AI model is on real images, so the model can’t exist without crimes and suffering having been committed.

      • jeremyparker@programming.dev
        7 months ago

        This isn’t true. AI can generate tan people if you show it the color tan and a pale person – or green people, or purple people. That’s all AI does, whether it’s image or text generation – it can create things it hasn’t seen by smooshing together things it has seen.

        And reality bears this out: AI can generate CSAM, even though it’s trained on that huge image database, which is constantly scanned for illegal content.

      • Madison420@lemmy.world
        7 months ago

        The real images don’t have to be CSAM – just images of children. It could theoretically be trained on legal sexual content and legal images of children, and let the AI connect the dots.

    • Fungah@lemmy.world
      7 months ago

      It is illegal in Canada to have sexual depictions of a child, whether it’s a real image or you’ve just sat down and drawn it yourself. The rationale being that the behavior escalates, and looking at images leads to wanting more.

      It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer, which I feel great about. There’s no legitimate reason to have sexualized images of a child, whether computer generated, hand drawn, or whatever.

      • Madison420@lemmy.world
        7 months ago

        This article isn’t about Canada homeboy.

        Also, that theory is not provable and never will be; morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

        And you didn’t actually offer a counterpoint to any of my points.