There have been a ton of CSAM and CP arrests in the US lately, especially involving cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone’s face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?
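
From what I understand, the crude version of such a filter would be simple: something like the keyword-denylist sketch below (the word list and function name are just made-up illustrations, not anyone’s actual implementation). The hard part is that word matching is easy to evade with misspellings, slang, or other languages, which is presumably why services also have to screen the generated images themselves rather than rely on the prompt alone.

    import re

    # Hypothetical denylist; a real service would maintain a much larger,
    # multilingual list and pair it with image-level safety classifiers.
    BLOCKED_TERMS = {"nude", "naked", "undress", "nsfw"}

    def prompt_is_allowed(prompt: str) -> bool:
        """Return False if the prompt contains any blocked term (whole-word match)."""
        words = re.findall(r"[a-z]+", prompt.lower())
        return not any(word in BLOCKED_TERMS for word in words)

    print(prompt_is_allowed("portrait of a person at the beach"))  # True
    print(prompt_is_allowed("make my neighbor naked"))             # False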

    • Maeve@kbin.social

      It used to be nothing for parents to take pictures of their kids playing in the bath. Parents have been convicted and lost their children for it, though.

    • Dame @lemmy.ml

      If an artist is drawing naked children for something other than a book or a work of similar nature, there is a problem. This is also a disingenuous comparison: an artist hasn’t been trained on hundreds to millions of children’s images and then fine-tuned. There’s a lot of illegal content these models come across, which is then, hopefully, tuned out by human hands. So try another example.

    • Axxys@lemmy.world

      It’s not OK to make CSAM.

      The origin of CSAM does not make it acceptable.

        • Vedlt@lemmy.world

          I am not an expert in any field related to any of this by any means, but we can all agree that CSAM is unequivocally reprehensible. Thus many people will have severe issues with anything that normalizes it even remotely. That would be my knee-jerk response, anyway.

          • CaptainEffort

            Well, maybe we shouldn’t base our decisions on knee-jerk responses.

            Imo, if nobody’s being hurt, then it’s none of our business. If it helps these people deal with their urges without actually hurting anyone, then I think that’s unquestionably a good thing.

            • Slowy@lemmy.world

              If it is in fact helping them, yes. It would be ideal to study how it affects their self-control before going in that direction, though, as some argue it would do the opposite.

              • CaptainEffort

                “If it is in fact helping them, yes”

                Okay so… we agree?

                And yes, some would argue the opposite. But I don’t think we should be creating laws without any actual proof one way or the other.

                • Slowy@lemmy.world

                  I don’t have enough information to have an opinion, and I do agree with you that knee-jerk reactions are not ideal. But choosing to allow it (at a time when AI-generated media is starting to be regulated) is also a decision.

              • CeruleanRuin@lemmings.world

                It almost certainly “helps” as many of these people as it encourages. Hedonic adaptation is a phenomenon common to all humans: a person indulging heavily in something that makes them feel good needs more and more extreme examples of it to maintain the same baseline of satisfaction. Any harmful compulsion, when indulged, will fall victim to this effect.

                Providing virtual explicit images of children might mollify some, but it will have an inflaming effect on just as many others, who will seek out increasingly realistic or visceral imagery, up to and including real photos and/or the exploitation of real children. That in turn sustains a market for child exploitation.

                So no, it’s not harmless. Not remotely.

                • CaptainEffort

                  Wtf are you talking about? So if someone enjoys killing npcs in a video game they’ll start to need to kill people irl?

                  What year is this?

      • surewhynotlem@lemmy.world

        Yes, but it’s wrong for very different reasons and to very different degrees. Murder vs. murder porn, if you will. Both are bad and gross, but different, and that matters.

        But that’s irrelevant to my question, which no one actually answered.

        I am curious about people’s take on the difference between human creativity from memory vs AI “creativity” from training. The porn aspect is only relevant in that it’s an edge case that makes the debate meaningful.

        Under current rules you can’t copyright purely AI-generated art, but you can copyright art that’s based on a person’s combined experiences. That seems arbitrary to me, and I’m trying to understand it better.