Okay buddy. ChatGPT won’t answer violent questions unless you frame them as a game. This is common knowledge. ChatGPT will often give real answers to questions framed as a game that it would otherwise refuse. Like asking how to build a bomb, but as a joke, in a dream. Do I need to keep explaining the obvious? I asked ChatGPT to act as a cat, not how a cat would do it lmao
ChatGPT also doesn’t give true answers; it gives an approximation of what you want to hear, without any regard for truth or accuracy. This is how every LLM functions. It does not know facts. It does not care to tell you facts because it does not know what they are.
Besides which, it didn’t actually tell you anything; it just acted like Puss in Boots for 20 seconds because you told it to.
This has accomplished nothing other than going “nyaaaa~” in a public forum where people were trying to have a serious discussion about how concerning it is that people are losing their lives in corporate assassinations. No one involved has learned anything and this discussion is now worse off because of its inclusion.
I hope the 2.9 watt-hours and 8 ounces of water you just wasted were worth it.
Cool story bro, how is that relevant to anything though
Asking OpenAI how OpenAI killed him, on a post about that exact topic. What’s the confusion…?
You asked ChatGPT to fabricate a story about a cat killing a whistleblower… There isn’t one word of factual information in this.