• DoPeopleLookHere · 1 day ago · +2 / -12

    The only way to generate something like that is to train it on something like that from real images.

    • sugar_in_your_tea · 1 day ago · +12

      only way

      That’s just not true.

      That said, there’s a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability because there’s a good chance they don’t understand how they work, but the creators of the model should absolutely know where they’re getting the source data from.

      Prove that the models use illegal material and go after the model creators for that, because that’s an actual crime. Don’t go after people using the models who are providing alternatives to abusive material.

      • DoPeopleLookHere · 1 day ago · +1 / -4

        I think all of them are unethical, and yes, any service offering them should be shut down.

        I never said to prosecute the users.

        I said you can’t make it ethically, because at some point someone is using/creating original art, and the odds of human exploitation at some point in the chain are just too high.

        • sugar_in_your_tea · 1 day ago · +6 / -1

          the odds of human exploitation at some point in the chain are just too high

          We don’t punish people based on odds. At least in the US, the standard is that they’re guilty “beyond a reasonable doubt.” As in, there’s virtually no possibility that they didn’t commit the crime. If there’s a 90% chance someone is guilty, but a 10% chance they’re completely innocent, most would agree that there’s reasonable doubt, so they shouldn’t be convicted.

          If you can’t prove that they made it unethically, and there are methods to make it ethically, then you have reasonable doubt. All the defense needs to do is demonstrate one such method of producing it ethically, and that creates reasonable doubt.

          Services should only be shut down if they’re doing something illegal. Prove that the images are generated using CSAM as source material, and then shut down any service that refuses to remove them, or whose operators can be proven “beyond a reasonable doubt” to have knowingly committed a crime. That’s how the law works: you only punish people you can prove “beyond a reasonable doubt” were committing a crime.

          • DoPeopleLookHere · 1 day ago · +1 / -4

            How can it be made ethically?

            That’s my point.

            It can’t.

            Some human has to sit and make many, many, many models of genitals to produce an artificial one.

            And that, IMO, is not ethically possible.

            • sugar_in_your_tea · 1 day ago · +7

              How can it be made ethically?

              Let’s say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.

              I don’t know, I’m not an expert. But just because I don’t know of something doesn’t mean it doesn’t exist, it means I need to consult experts.

              It can’t.

              Then prove it. That’s how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.

              My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It’s not CSAM if it’s generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn’t use CSAM in its training data.

              • DoPeopleLookHere · 1 day ago · +1 / -4

                You can’t prove a negative. That’s not how proving things works.

                You also assume legal images exist. But that depends on what’s actually legal globally. What if someone wants a 5-year-old? How are there legal photos of that?

                You assume it can, prove that it can.

                • sugar_in_your_tea · 1 day ago · +4

                  You can’t prove a negative

                  You can show how existing solutions work and demonstrate that the solution used works like those other solutions. That takes a lot more work than “see, it looks like a child therefore it’s CSAM,” but it’s necessary to protect innocent people.

                  You assume it can, prove that it can.

                  That’s guilty until proven innocent. There’s a reason courts operate on the assumption of innocence and force the prosecution to prove guilt. I am not interested in reversing that.

                  • DoPeopleLookHere · 1 day ago · +1 / -3

                    You’d better believe that when the cops come knocking, the burden of proving you acted ethically is wholly on you.

                    All existing solutions are based on real-life images. There’s no ethical way to acquire thousands upon thousands of images of naked children to produce anything resembling the real thing.

                    That’s how existing solutions work.

                    So again, how can it be done ethically?

    • ifItWasUpToMe@lemmy.ca · 1 day ago · +14 / -1

      I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.

      • DoPeopleLookHere · 1 day ago · +2 / -5

        That’s not how these image generators work.

        How would it know what an age-appropriate penis looks like without, you know, seeing one?

        • Allero@lemmy.today · 1 day ago · +9

          That’s exactly how they work. According to many articles I’ve seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed any CSAM material, but it seems to be good enough for people to get off, which is exactly what matters.

          • DoPeopleLookHere · 1 day ago · +1 / -3

            How can it be trained to produce something without human input?

            To verify its models are indeed correct, some human has to sit and view them.

            Will that be you?

            • TheRealKuni@midwest.social · 23 hours ago · +4

              How can it be trained to produce something without human input?

              It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.

              No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.

              • DoPeopleLookHere · 23 hours ago · +1 / -2

                It was able to produce that because enough images of both feet and Donald Trump exist.

                How would it know what young genitals look like?

                • JuxtaposedJaguar@lemmy.ml · 22 hours ago · +3

                  If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn’t going to be a 50/50 split of purely dogs and purely cats, it’s going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.
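                  To make that concrete, here’s a rough sketch of that kind of concept blending with the diffusers library (the checkpoint name and prompt are purely illustrative, not anyone’s actual setup):

                  ```python
                  # Illustrative only: compose two separately learned concepts.
                  import torch
                  from diffusers import StableDiffusionPipeline

                  pipe = StableDiffusionPipeline.from_pretrained(
                      "runwayml/stable-diffusion-v1-5",
                      torch_dtype=torch.float16,
                  ).to("cuda")

                  # No cat-dog hybrid exists in the training data; the model
                  # blends two concepts it learned independently.
                  prompt = "a photorealistic animal that is half cat, half dog"
                  image = pipe(prompt).images[0]
                  image.save("hybrid.png")
                  ```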

                  • DoPeopleLookHere · 22 hours ago · +1 / -2

                    Yes, but you start with the basics of a cat and a dog. So you start with adult genitals and…

                • JuxtaposedJaguar@lemmy.ml · 22 hours ago · +3

                  You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.

                  • DoPeopleLookHere · 22 hours ago · +1 / -2

                    But to know if it’s accurate, someone has to view and compare…

            • Allero@lemmy.today · 23 hours ago · +4

              As with most of modern AI, it’s able to train without much human intervention.

              My point is, even if the results are not perfectly accurate and don’t fully resemble a child’s body, they work. They are widely used, in fact, so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that’s what matters.

              I do not care about how accurate it is, because it’s not me who consumes this content. I care about how effective it is at curbing worse desires in pedophiles, because I care about the safety of children.

        • lime!@feddit.nu · 1 day ago (edited) · +6

          no, it sort of is. considering style transfer models, you could probably just draw or 3d model unknown details and feed it that.
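          for a sense of the mechanism, here’s a rough img2img sketch with diffusers (file names and the strength value are made up): the drawing supplies the structure and the model supplies the realism.

          ```python
          # illustrative only: render a rough drawing in a realistic style.
          import torch
          from PIL import Image
          from diffusers import StableDiffusionImg2ImgPipeline

          pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
              "runwayml/stable-diffusion-v1-5",
              torch_dtype=torch.float16,
          ).to("cuda")

          drawing = Image.open("rough_drawing.png").convert("RGB")

          # lower strength keeps more of the drawing's structure;
          # higher strength lets the model repaint more freely.
          result = pipe(
              prompt="a photorealistic version of this drawing",
              image=drawing,
              strength=0.6,
          ).images[0]
          result.save("realistic.png")
          ```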

          • DoPeopleLookHere · 1 day ago · +2 / -5

            Again, that’s not how image generators work.

            You can’t just make up some wishful thinking and assume that’s how it must work.

            It takes thousands upon thousands of unique photos to make an image generator.

            Are you going to draw enough child genitalia to train these generators? Are you actually comfortable doing that task?

            • lime!@feddit.nu · 1 day ago · +5

              i’m not, no. but i’m also well-enough versed in stable diffusion and loras that i know that even a model with no training on a particular topic can be made to produce it with enough tweaking, and if the results are bad you can plug in an extra model trained on at minimum 10-50 images to significantly improve them.
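              as a rough sketch, plugging in such an extra model looks like this with diffusers (the path and scale are made up):

              ```python
              # illustrative only: load a small add-on LoRA into a base model.
              import torch
              from diffusers import StableDiffusionPipeline

              pipe = StableDiffusionPipeline.from_pretrained(
                  "runwayml/stable-diffusion-v1-5",
                  torch_dtype=torch.float16,
              ).to("cuda")

              # a LoRA trained on a few dozen images nudges the base model
              # toward its concept without retraining the whole network.
              pipe.load_lora_weights("path/to/extra_lora.safetensors")

              image = pipe(
                  "a photo in the style the LoRA was trained on",
                  cross_attention_kwargs={"scale": 0.8},  # LoRA strength
              ).images[0]
              image.save("output.png")
              ```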

              • DoPeopleLookHere · 1 day ago · +2 / -2

                Okay, but my point still stands.

                Someone has to make the genital models to learn from. Some human has to be involved; otherwise it wouldn’t just exist.

                And if you’re not willing to get your hands dirty and do it, why would anyone else?

                • lime!@feddit.nu · 23 hours ago (edited) · +1 / -1

                  people are fucking weird. especially when it comes to porn. just look at the amount of effort people put into skyrim porn mods, or source filmmaker.