• mindbleach · 1 year ago

      Correct, and these models produce hallucinations. That’s literally what the process is called.

      Do you think AI is a camera? Are you still convinced there’s a horse hearse somewhere out there, in real life? The model makes shit up. The shit it makes up is based on a bunch of labeled images - for example, hearses and horses. It can combine the two even if those two things never coexist anywhere in real life. So of course it can combine them even if they never coexist in its training data.

      The training data includes children.

      The training data includes nudity.

      That’s enough.
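      If you doubt how trivially these models compose concepts, here’s a minimal sketch using the open-source diffusers library. The checkpoint name and prompt are illustrative assumptions, not anything specific to this thread:

      ```python
      # Minimal sketch: a text-to-image pipeline composing two concepts
      # ("hearse" + "horse legs") that co-occur in no real photograph.
      # Assumes a CUDA GPU; the checkpoint name is illustrative.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",
          torch_dtype=torch.float16,
      )
      pipe = pipe.to("cuda")

      # The prompt names a combination the model never saw as one image.
      image = pipe("a hearse with horse legs instead of wheels").images[0]
      image.save("horse_hearse.png")
      ```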

        • mindbleach · 1 year ago

          Correct.

          Which of those things do you think AI produces? Hallucinations, or reality?

          • mayoi · 1 year ago

            How does this relate to you being unable to understand that a car with its wheels replaced by horse legs is incomparable to inferring what a naked child looks like from adult porn?

            • mindbleach · 1 year ago

              What about it is incomparable, dingus? That’s literally what it does. As surely as it combines any other two things.

                • mindbleach · 1 year ago

                  The machine doesn’t know the difference. It’s just pixels and labels.

                  Steampunk isn’t a thing, but AI can generate the hell out of it.