• mindbleach
      11 months ago

      It can absolutely infer things like that.

      Inferring things is the whole fucking idea.

      How else do you think this technology works? We shovel labeled images into a pile of matrices, and the process figures out which patterns correspond to which labels. An image that is both a horse and a hearse is no different, to the machine, from an image that is both a child and nude.
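
      The point about inferring unseen combinations can be sketched with a toy version of the same idea: learn two concepts from separate examples, then score an input that combines both, even though no combined example was ever trained on. (This is an illustrative sketch, not diffusion-model internals; the features, data, and function names here are all made up.)

      ```python
      # Toy sketch: two concept detectors trained independently can still
      # recognize a combination neither one ever saw during training.
      # Made-up features: [is_four_legged, is_vehicle].

      def train_centroid(examples):
          """Average the feature vectors for one label (a stand-in for learning)."""
          n = len(examples)
          return [sum(x[i] for x in examples) / n for i in range(len(examples[0]))]

      def score(centroid, x):
          """Dot product: how strongly x matches the learned pattern."""
          return sum(c * xi for c, xi in zip(centroid, x))

      # "horse" examples are four-legged non-vehicles; "hearse" examples are vehicles.
      horse = train_centroid([[1.0, 0.0], [0.9, 0.1]])
      hearse = train_centroid([[0.0, 1.0], [0.1, 0.9]])

      # A "horse hearse" never appeared in training, but it matches both patterns.
      combo = [1.0, 1.0]
      print(score(horse, combo) > 0.5 and score(hearse, combo) > 0.5)  # True
      ```

      The same principle, scaled up to billions of weighted connections, is what lets an image model combine labels it has never seen together.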

      • mayoi
        11 months ago

        It doesn’t work. If it did, you’d have posted a model that does what you claim it can easily do.

        Since it wasn’t trained on child porn, neither it nor its results would be illegal, so why don’t you?

        • mindbleach
          11 months ago

          How the fuck would I post a model? Would you even know what to do with it?

          You’ve already ignored multiple images demonstrating concepts applied to things. You know you can find more at the drop of a hat. But instead of wondering how those work, you’ve pulled a goalpost from your ass and now demand that I, personally, provide you the ability to create what you consider child pornography.

          No.

          • mayoi
            11 months ago

            You only demonstrated that I’m right.

            • mindbleach
              edited · 11 months ago

              Also no.

              The technology works the way it works, no matter how you posture. Your willful ignorance is not vindicated by the fact that I, personally, don’t generate child porn. What kind of asshole even asks for that? Why would a random commenter correcting you with an image of a horse hearse necessarily have generated anything himself? It’s one of a hundred images posted here, every week, that disprove how you insist this works. You can deal with that or not.

              • mayoi
                11 months ago

                Removed by mod

                • mindbleach
                  11 months ago

                  I’ll call you a lot worse if you can’t figure out that ‘make child porn for me’ is an insane demand. As if knowing how this year’s most impactful technology works means I, personally, am an experienced user. (And am prepared to drag your ignorant ass through the process of setting it up.)

                  And now your obscene moving goalpost is… matching a naked photograph of yourself, as a child? I’m not convinced you understand what AI does. It makes up things that don’t exist. If you take a photo of a guy and ask for that guy as a young astronaut, and that photo is Buzz Aldrin eating dinner yesterday, it’s not gonna produce an existing image of Buzz Aldrin on the fuckin’ moon. Not even if it has that exact image in its training data. What it got from that training image is more like ‘astronaut means white clothes.’ Except as a pile of weighted connections, where deeper layers… why am I bothering to explain this? You’re not listening. You’re just skimming this and looking for some way to go ‘you didn’t jump through my gross hoop, therefore nuh-uh.’

                  If you want to fuck around with Stable Diffusion, you don’t need me to do it. I’d be no help - I haven’t used it. But it’s evidently fairly easy to set up, and you can test all your stupid assertions about how it does or doesn’t work.

                  … oh my god, I just realized the dumbest part of your demand. If I somehow did “post a model” (what file format would that even be?) that did exactly what you ask, it wouldn’t prove what was or wasn’t in the training data. So you’d just loop back around to going ‘ah-HA, there must have been hyper-illegal images of exactly that in the training data.’ Your grab at burden of proof doesn’t even fit.

                  • mayoi
                    11 months ago

                    Removed by mod