AI chatbots are sycophants. They will say literally anything if you convince them to. You can get them to tell you to kill yourself, or others, or anything at all. They are only dangerous if you believe them. Unfortunately, that's going to be a huge problem.