• mindbleach · 5 months ago

    But a human asking how to make a bomb is somehow the LLM’s fault.

    Or the LLM has to know that you are who you say you are, to stop you from writing scam e-mails.

    The guy you initially replied to was talking about hooking up an LLM to a virus replication machine. Is that the level of safety you’re asking for? A machine so safe, we can give it to supervillains?