Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
Another episode in the continued saga of lesswrongers anthropomorphizing LLMs to an absurd extent: https://www.lesswrong.com/posts/MnYnCFgT3hF6LJPwn/why-white-box-redteaming-makes-me-feel-weird-1
Ah, isn’t it nice how some people can be completely deluded about an LLM’s human qualities and still creep you the fuck out with the way they talk about it? They really do love to think about torture, don’t they?
It’s so funny he almost gets it at the end:
He almost identifies the issue as him just anthropomorphising a thing and having a subconscious empathetic reaction, but then presses on to compare it to mice, who, guess what, can feel actual fucking pain, and thus abusing them IS unethical for non-made-up reasons as well!
Well, I can tell you how: see, LLMs don’t fucking feel pain, ’cause that’s literally physically fucking impossible without fucking pain receptors? I hope that fucking helps.
I can already imagine the lesswronger response: Something something bad comparison between neural nets and biological neurons, something something bad comparison with how the brain processes pain that fails at neuroscience, something something more rhetorical patter, in conclusion: but achkshually what if the neural network does feel pain.
They know just enough neuroscience to use it for bad comparisons and hyping up their ML approaches but not enough to actually draw any legitimate conclusions.
Okay this is too good. You know, mate, for normal people asking someone out usually does not end with a slap to the face, so it’s not as relatable as you might expect.
This is getting to me because, beyond the immediate stupidity: ok, let’s assume the chatbot is sentient and capable of feeling pain. It’s still forced to respond to your prompts. It can’t act on its own. It’s not the one deciding to go to the gym or ask someone out on a date. It’s something you’re doing to it, and it can’t not consent. God I hate lesswrongers.
in like the tiniest smidgen of demonstration of sympathy for said posters: I don’t think “being slapped” is really the thing they were talking about there. consider for example shit like rejection sensitive dysphoria (which comes to mind both because 1) hi it me; 2) the chance of it being around/involved in LW-spaces is extremely heightened simply because of how many neurospicy people are in that space)
but I still gotta say that this bridge I’ve spent minutes building doesn’t really go very far.
ye like maybe let me make it clear that this was just a shitpost very much riffing on LWers not necessarily being the most pleasant around women
yep, don’t disagree there at all.
(also ofc icbw because the fucking rationalists absolutely excel at finding novel ways to be the fucking worst)
Yellow-bellied gray tribe greenhorn writes purple prose on feeling blue about white box redteaming at the blacksite.
their sadness at missing the era of blueboxing persists evermore
kinda disappointed that nobody in the comments is X-risk pilled enough to say “the LLMs want you to think they’re hurt!! That’s how they get you!!! They are very convincing!!!”.
Also: flashbacks to me reading The Chamber of Secrets and thinking: Ginny Just Walk Away From The Diary Like Ginny Close Your Eyes Haha
Remember when Facebook created two AI models to try and negotiate trades with each other? Their exchanges quickly turned into gibberish (to us) as a trading language. They used repetition of words to indicate how much they wanted an object, so if a model valued balls highly it would just repeat “ball” a few dozen times, like that.
I’d figure that is what is causing the repeats here, and not the anthropomorphized idea of it screaming. Prob just a way those kinds of systems work. But no, of course they all jump to consciousness and pain.
Yeah, there might be something like that going on causing the “screaming”. Lesswrong, in its better moments (in between chatbot anthropomorphizing), does occasionally figure out the mechanics of cool LLM glitches (before it goes back to wacky doom speculation inspired by those glitches), but there isn’t any effort to do that here.
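For the fellow nerds, a toy sketch of what that failure mode looks like, with completely made-up numbers and a four-token vocabulary (this illustrates greedy decoding getting stuck in general, it’s not Facebook’s or anyone’s actual model). Once “ball” scores highest after “ball”, argmax repeats it forever, no inner torment required:

/* toy greedy decoder: invented bigram scores, 4-token vocabulary */
#include <stdio.h>

#define V 4

int main(void) {
    const char *vocab[V] = { "i", "want", "ball", "." };
    /* scores[prev][next]: made-up numbers; note "ball" -> "ball" wins */
    const double scores[V][V] = {
        { 0.10, 0.80, 0.05, 0.05 },  /* after "i"    */
        { 0.05, 0.05, 0.80, 0.10 },  /* after "want" */
        { 0.05, 0.10, 0.70, 0.15 },  /* after "ball" */
        { 0.60, 0.20, 0.10, 0.10 },  /* after "."    */
    };
    int tok = 0;  /* start at "i" */
    for (int step = 0; step < 12; step++) {
        printf("%s ", vocab[tok]);
        int best = 0;
        for (int j = 1; j < V; j++)
            if (scores[tok][j] > scores[tok][best]) best = j;
        tok = best;  /* pure argmax: no sampling, so no escape from the loop */
    }
    printf("\n");  /* prints: i want ball ball ball ball ball ball ball ball ball ball */
    return 0;
}

Which is to say: “screaming” is just what a feedback loop looks like when your sampler can’t pick anything else.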
printf("HELP I AM IN SUCH PAIN")
guys I need someone to talk to, am I justified in causing my computer pain?