• mindbleach · 1 year ago

    Also no.

    The technology works the way it works, no matter how you posture. Your willful ignorance is not vindicated by the fact that I, personally, don’t generate child porn. What kind of asshole even asks for that? Why would a random commenter correcting you with an image of a horse hearse need to have generated anything? It’s one of a hundred images posted here every week that disprove how you insist this works. You can deal with that or not.

      • mindbleach · 1 year ago

        I’ll call you a lot worse if you can’t figure out that ‘make child porn for me’ is an insane demand. As if knowing how this year’s most impactful technology works means I, personally, am an experienced user. (And one prepared to drag your ignorant ass through the process of setting it up.)

        And now your obscene moving goalpost is… matching a naked photograph of yourself, as a child? I’m not convinced you understand what AI does. It makes up things that don’t exist. If you take a photo of a guy and ask for that guy as a young astronaut, and that photo is Buzz Aldrin eating dinner yesterday, it’s not gonna produce an existing image of Buzz Aldrin on the fuckin’ moon. Not even if it has that exact image in its training data. What it got from that training image is more like ‘astronaut means white clothes.’ Except as a pile of weighted connections, where deeper layers… why am I bothering to explain this? You’re not listening. You’re just skimming this and looking for some way to go ‘you didn’t jump through my gross hoop, therefore nuh-uh.’
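For what it’s worth, the ‘weights, not images’ point is easy to sketch with a toy example in plain Python. This is an analogy only, not how Stable Diffusion is actually built: training distills the data into a handful of numbers, and the data itself is gone.

```python
# Toy analogy (NOT a diffusion model): fit y = w*x + b to three
# "training examples" by ordinary least squares.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.1, 5.9]
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - w * mx
# The entire "model" is now just (w, b) -- two weights. You can use it
# to produce new outputs, but the original points cannot be read back
# out of it; all that survives is a statistical tendency they implied.
print(w, b)  # roughly 1.95 and 0.1
```

The same idea holds at scale: a network’s weights encode tendencies like ‘astronaut means white clothes’, not retrievable copies of the training images.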

        If you want to fuck around with Stable Diffusion, you don’t need me to do it. I’d be no help - I haven’t used it. But it’s evidently fairly easy to set up, and you can test all your stupid assertions about how it does or doesn’t work.

        … oh my god, I just realized the dumbest part of your demand. If I somehow did “post a model” (what file format would that even be?) that did exactly what you ask, it wouldn’t prove what was or wasn’t in the training data. So you’d just loop back around to ‘ah-HA, there must have been hyper-illegal images of exactly that in the training data.’ Your attempted burden-of-proof gotcha doesn’t even fit.