I asked it to tell me a joke, which it did. But when I asked for another, it quite grumpily replied that it was made to help with code, not to tell jokes.
Heh. They probably want to train it not to answer non-technical queries, but it’s still GPT at its core. So if the filters aren’t foolproof, it’s likely capable of the same general knowledge answers as ChatGPT.
Huh.
Recipes are code now?