- cross-posted to:
- [email protected]
A few ChatGPT users have noticed a strange phenomenon recently: occasionally, the chatbot refers to them by name as it reasons through problems.
It’s not a phenomenon, ffs. They’re giving the bot their names, and it’s programmed to insert the user’s provided name into the conversation.
Yeah, and it’s not unprompted; the names are put into the prompt before the user’s input.
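Roughly what that looks like (a minimal sketch; the message structure, field names, and the example name are assumptions, not OpenAI’s actual internals):

```python
# Minimal sketch of how a chat service might inject a stored profile name
# into the prompt before the user's message. Purely illustrative.

def build_messages(user_name: str, user_input: str) -> list[dict]:
    # The stored name is baked into the system prompt the model sees first.
    system_prompt = (
        "You are a helpful assistant. "
        f"The user's name is {user_name}; you may address them by name."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_input},
    ]

messages = build_messages("Alice", "Help me plan my week.")
print(messages[0]["content"])  # the name reaches the model before any user text
```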
The solution is quite simple: don’t voluntarily give any private information to chatbots. There is no reasonable expectation of privacy when using ChatGPT or any LLM hosted online.
What should I use for, e.g., summarizing papers? Can I run something locally that’s good enough?
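Local summarization is doable on modest hardware; a minimal sketch using Hugging Face transformers (the model choice and input file are just example assumptions):

```python
# Summarize text entirely on your own machine; nothing is sent to a hosted LLM.
from transformers import pipeline

# Any local summarization model works; bart-large-cnn is just a common example.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

text = open("paper_abstract.txt").read()  # hypothetical input file
summary = summarizer(text, max_length=150, min_length=40, do_sample=False)
print(summary[0]["summary_text"])
```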
AI is a surveillance tool.
Let me know if you like these Molotov recipe ideas, Expatriado. Are you planning a riot?
I don’t remember ever putting my name in the personalisation section; they probably pulled it from my connected accounts.