• @mindbleach
    English
    18 months ago

    Correct, and these models produce hallucinations. That’s literally what the process is called.

    Do you think AI is a camera? Are you still convinced there’s a horse hearse somewhere out there, in real life? The model makes shit up. The shit it makes up is based on a bunch of labeled images - for example, hearses and horses. It can combine the two even if those two things never coexist anywhere in reality. So of course it can combine two things that simply do not coexist in its training data.

    The training data includes children.

    The training data includes nudity.

    That’s enough.

    • @mayoi
      English
      18 months ago

      Removed by mod

      • @mindbleach
        English
        18 months ago

        Correct.

        Which of those things do you think AI produces? Hallucinations, or reality?

        • @mayoi
          English
          18 months ago

          How does this relate to you being unable to understand that a car with its wheels replaced by horse legs is incomparable to inferring what a naked child looks like from adult porn?

          • @mindbleach
            English
            18 months ago

            What about it is incomparable, dingus? That’s literally what it does. As surely as it combines any other two things.

            • @mayoi
              English
              18 months ago

              Removed by mod

              • @mindbleach
                English
                18 months ago

                The machine doesn’t know the difference. It’s just pixels and labels.

                Steampunk isn’t a thing, but AI can generate the hell out of it.