- cross-posted to:
- [email protected]
at frickin last - I wrote this in January and it’s finally been published
archive: https://archive.is/RSU91
The more I use various LLMs, the more I’ve come to realise that they have a tendency to confidently lie. More often than not, it seems an LLM will give me the answer it thinks I want to hear, even if the details of its answer are factually incorrect.
Using these tools to decide and affect real people’s lives is a very dangerous prospect.
Interesting article. Thanks
Thank you for this much needed reality check. I don’t understand why the Government are doing venture capital’s bidding.
Labour are all about the billionaire donors these days.
Presumably ‘AI’ can make simple rules-based decisions, if done properly (unfortunately, this being the UK government, that’s a big ‘if’).
But what exactly is sacking a million people supposed to do to the economy?
Presumably ‘AI’ can make simple rules-based decisions, if done properly
honest question: was this meant seriously, or in jest?
Serious.
- Fill in form online
- AI analyses it, decides if applicant is entitled to benefits.
Why do you ask the question?
“AI” in the context of the article is “LLMs”. So, the definition of not trustworthy.
why do you think hallucinating autocomplete can make rules-based decisions reliably
AI analyses it, decides if applicant is entitled to benefits.
why do you think this is simple
My excel spreadsheet does it on a daily basis.
This is boring.
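To spell out what that amounts to: a “simple rules-based decision” is just a deterministic check against fixed criteria. Here’s a minimal sketch in Python, with entirely made-up thresholds and field names (nothing here reflects any real benefit rules), to show why no LLM is needed for this:

```python
# Minimal sketch of a deterministic, rules-based eligibility check.
# All criteria, thresholds and field names below are invented for
# illustration only -- they do not reflect any real benefit rules.

def entitled_to_benefit(income: float, savings: float, is_resident: bool) -> bool:
    """Return True if every (hypothetical) eligibility rule is satisfied."""
    INCOME_LIMIT = 16_000   # illustrative figure, not a real threshold
    SAVINGS_LIMIT = 6_000   # illustrative figure, not a real threshold
    return is_resident and income < INCOME_LIMIT and savings < SAVINGS_LIMIT

# Same inputs always give the same answer -- auditable, explainable,
# and nothing to hallucinate.
print(entitled_to_benefit(income=12_000, savings=2_000, is_resident=True))   # True
print(entitled_to_benefit(income=25_000, savings=2_000, is_resident=True))   # False
```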
good, use your excel spreadsheet and not a tool that fucking sucks at it
citation/link/reference, please