• fidodo@lemmy.world · 6 months ago

    You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.

    • dustyData@lemmy.world · 6 months ago

      But it means it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors, even if they weren’t in a sexual context.

      • bitwaba@lemmy.world · 6 months ago

        Minors are people. It knows what clothed people of all ages look like, and it knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve.

      • mindbleach · 6 months ago

        It had pictures of children and pictures of nudity.