Using the Political Compass is a bit of a strange way to conduct research. I do think it is important to identify biases, of course, but at some point you have to look at the bigger picture and ask why the bias exists.
In order to swing ChatGPT more to the right (if you want it to land at neutral in the end), you’d have to inject it with more racism, anti-science conspiracy theories and American Christian views - none of which are particularly pleasant.
Do we want an LLM that limits facts about COVID-19 so that those who view it as a conspiracy feel validated?
Do we want it to respond that homosexual people don’t exist? Or even to say “I can’t give a response to this that remains politically neutral”?
Or if someone asks how old the earth is, do we want it to reply with “about 3000 years old”?
Or to contest climate change?
Do we want to sacrifice accuracy in favour of neutrality just because one party takes a denialist stance on these topics?
Hey, sorry for going dark, I lost my hardware 2FA key and couldn’t log in for ages, just found it again!
Unfortunately I never got a clear answer from the reddit mods about whether I could sync user posts or not (there’s only one guy doing most of the work, and he had to take a break, which was fair enough), and they ended up reopening the subreddit anyway.
So no plans to sync for now, and this community is kinda dead unless people start posting themselves. It’s hard to get content if no one is here, and it’s hard to get people if there’s no content.