• mayoi (+4/−37) · 11 months ago · edited

    Removed by mod

  • papertowels@lemmy.one (+18/−3) · 11 months ago · edited

      Your claim’s backbone is that models don’t know the differences between a child’s naked body and an adult’s, yes?

      What happens if you ask ChatGPT “what are the anatomical differences between human child and adult bodies?”

      I’m sure it’ll give you an accurate response.

      https://www.technologyreview.com/2021/01/05/1015754/avocado-armchair-future-ai-openai-deep-learning-nlp-gpt3-computer-vision-common-sense/

      To test DALL·E’s ability to work with novel concepts, the researchers gave it captions that described objects they thought it would not have seen before, such as “an avocado armchair” and “an illustration of a baby daikon radish in a tutu walking a dog.” In both these cases, the AI generated images that combined these concepts in plausible ways.

    • Zuberi 👀@lemmy.dbzer0.com (+4/−25) · 11 months ago

        You genuinely don’t think CSAM is used in the training of these AI models…? And then you used a chat model to essentially google the differences in text, not visually?

        Why did you feel the need to jump in and defend stuff like this?

      • Mr_Dr_Oink@lemmy.world (+24/−2) · 11 months ago

          Didn’t they then post a link showing that DALL·E could combine two different things into something it’s never seen before?

          Did you read the whole comment? Even if the text model describing things is irrelevant, the DALL·E part is not.

        • Zuberi 👀@lemmy.dbzer0.com (+2/−24) · 11 months ago

            It is irrelevant. Armchairs are not people. DALL·E does not know what is inside of those objects, or under their fabrics, for instance. Ask DALL·E to cut open the avocado armchair.

            I’m sorry if I’m not buying your defense of CSAM.

            But the DALL·E use case of “an illustration of a baby daikon radish in a tutu walking a dog” can’t possibly be the best example to use here to defend child porn.

          • lolcatnip@reddthat.com (+21/−1) · 11 months ago · edited

              I’m sorry if I’m not buying your defense of CSAM.

              Thanks for making it clear you’re either arguing in bad faith, or that you’re incapable of talking about actual issues the moment anyone mentions CSAM.

          • Mr_Dr_Oink@lemmy.world (+9/−1) · 11 months ago

              I’m sorry? My defense of CSAM?

              What defense of CSAM?

              Do you require mental assistance? You appear to be having some kind of aneurysm…

      • papertowels@lemmy.one (+1) · 11 months ago

          The original comment said it’s impossible for a model to be able to produce CP if it was never exposed to it.

          They were uninformed, so, as someone who works with machine learning, I informed them. If your argument relies on ignorance, it’s bad.

          Re: the text model, someone already addressed this. If you’re going to make arguments and assumptions about things I share without reading them, there’s no need for me to bother with my time. You can lead a horse to water, but you can’t make it drink.

          Have a good one!

    • mayoi (+4/−27) · 11 months ago

        Removed by mod

      • Player2@sopuli.xyz (+10/−1) · 11 months ago

          Just as all the words you used to compose that sentence already existed, and yet you made the sentence yourself, language models can take tokens that they know generally go together and make original sentences. Your argument is that a dictionary exists, therefore authors are lying to everyone by saying that they wrote something.
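
The recombination point above can be sketched with a toy bigram generator (illustrative only; the corpus and every sentence in it are invented for this sketch, and real language models are vastly more complex):

```python
import random

# Toy bigram "language model": for each word, record the words observed to
# follow it, then generate by repeatedly sampling a known next word.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat chased the dog",
]

bigrams = {}
for line in corpus:
    words = line.split()
    for a, b in zip(words, words[1:]):
        bigrams.setdefault(a, []).append(b)

def generate(start, length=6, seed=1):
    """Chain together observed word pairs starting from `start`."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length and out[-1] in bigrams:
        out.append(rng.choice(bigrams[out[-1]]))
    return " ".join(out)

sentence = generate("the")
print(sentence)
# Every adjacent word pair was seen in training, yet the sentence as a
# whole need not appear anywhere in the corpus.
```

Every individual transition is memorized, but the walk through them can be novel, which is the dictionary-versus-author distinction in miniature.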

        • Seasoned_Greetings@lemm.ee (+10) · 11 months ago

            Hey, just so you know, this guy is a crazy troll. He’s clocked 130 comments on his 9-hour-old profile, and almost all of them are picking fights and deflecting. Save yourself the trouble. His go-to line is “I don’t remember that.”

        • mayoi (+1/−8) · 11 months ago

            Removed by mod

    • mayoi (+2/−3) · 11 months ago

        That’s very clearly a modern car on horse legs, nothing I haven’t seen before.

        • mayoi (+1/−3) · 11 months ago · edited

            What’s “nudity”, and how can I see it?

            I can conceptualize it, sure, but please explain how I’ve seen it, where and when, since it’s not a physical object: it has no shape, color, smell, mass…

            I can’t even really imagine a naked child in my mind, and even if I tried to, it would be my imagination and not what the real thing actually looks like; I can’t know what the real thing looks like unless I’ve seen it.

            • mayoi (+1/−2) · 11 months ago

                You’re a complete moron if you think “nudity”, a concept, is equivalent to an adjective without a verb, which you omitted because otherwise you cannot even cope with sending it, since you know that you’re wrong.

              • mindbleach (+1) · 11 months ago

                  Horse is a concept, to the machine. That’s the point. It cares even less about grammar than you do, as you insist there’s no verb in the sentence ‘what is horse.’ (It’s is.)

                  Type in “purple apple” and it’ll give you the concept of apple plus the concept of purple.

                  The concept of nudity is in every wonky NSFW content detection network. The machine absolutely has some model of what it means. Rage about it to someone else, if you can’t handle that.
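
The “concept of apple plus concept of purple” framing above can be sketched with toy, hand-made concept vectors (the vectors, vocabulary, and nearest-neighbor lookup here are all invented for illustration; real models learn their embeddings from data):

```python
import math

# Hand-made "embeddings": one axis per made-up concept dimension.
emb = {
    "apple":        [1.0, 0.0, 0.0],   # object: apple-ness
    "purple":       [0.0, 1.0, 0.0],   # attribute: purple-ness
    "purple apple": [0.9, 0.9, 0.1],   # the combined concept
    "banana":       [0.0, 0.0, 1.0],   # an unrelated distractor
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Compose the two concepts by vector addition, then find the
# nearest known concept to the composed query.
query = [a + p for a, p in zip(emb["apple"], emb["purple"])]
best = max(emb, key=lambda k: cosine(emb[k], query))
print(best)  # the composed vector lands nearest "purple apple"
```

The design point is that composition happens in vector space, not in a lookup table of previously seen images, which is why the combined concept can be retrieved even though it was never the query itself.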

                • mayoi (+1/−1) · 11 months ago

                    Removed by mod