• mayoi · 11 months ago

      You’re a complete moron if you think “nudity”, a concept, is equivalent to an adjective without a verb — a verb you omitted because you couldn’t cope with sending the full sentence, since you know you’re wrong.

      • mindbleach · 11 months ago

        Horse is a concept, to the machine. That’s the point. It cares even less about grammar than you do, as you insist there’s no verb in the sentence ‘what is horse.’ (It’s is.)

        Type in “purple apple” and it’ll give you the concept of apple plus the concept of purple.

        The concept of nudity is in every wonky NSFW content detection network. The machine absolutely has some model of what it means. Rage about it to someone else, if you can’t handle that.

        • mayoi · 11 months ago

          Removed by mod

          • mindbleach · 11 months ago

            Correct, and these models produce hallucinations. That’s literally what the process is called.

            Do you think AI is a camera? Are you still convinced there’s a horse hearse somewhere out there, in real life? The model makes shit up. The shit it makes up is based on a bunch of labeled images — for example, hearses and horses. It can combine the two even if those two things never coexist anywhere. So of course it can combine them even if they simply do not coexist in its training data.

            The training data includes children.

            The training data includes nudity.

            That’s enough.

              • mindbleach · 11 months ago

                Correct.

                Which of those things do you think AI produces? Hallucinations, or reality?

                • mayoi · 11 months ago

                  How does this relate to you being unable to understand that a car with its wheels replaced by horse legs is incomparable to inferring what a naked child looks like from adult porn?

                  • mindbleach · 11 months ago

                    What about it is incomparable, dingus? That’s literally what it does. As surely as it combines any other two things.