‘Horse’ is a concept to the machine. That’s the point. It cares even less about grammar than you do, even as you insist there’s no verb in the sentence ‘what is horse.’ (It’s ‘is.’)
Type in “purple apple” and it’ll give you the concept of apple plus the concept of purple.
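(A minimal sketch of that kind of concept composition, assuming the Hugging Face diffusers library and a public Stable Diffusion checkpoint; neither is named in the thread.)

```python
# Sketch: composing two concepts ("purple" + "apple") in a text-to-image model.
# Assumes the Hugging Face diffusers library and a public Stable Diffusion
# checkpoint; the thread doesn't name a specific model.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# The prompt encoder maps both words into the same embedding space, so the
# model renders an apple that is purple; no purple apple needs to exist in
# the training set for this to work.
image = pipe("a purple apple").images[0]
image.save("purple_apple.png")
```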
The concept of nudity is in every wonky NSFW content detection network. The machine absolutely has some model of what it means. Rage about it to someone else, if you can’t handle that.
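(A minimal sketch of the kind of NSFW detection network mentioned above, assuming the Hugging Face transformers library; the checkpoint name is an illustrative assumption, not one named in the thread.)

```python
# Sketch: running an off-the-shelf NSFW image classifier.
# Assumes the Hugging Face transformers library; the model ID below is an
# assumed example checkpoint, not one cited in the thread.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Falconsai/nsfw_image_detection",  # assumed example checkpoint
)

# Such a classifier only works because training gave it some internal
# representation of nudity to score images against.
print(classifier("photo.jpg"))
```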
Removed by mod
Correct, and these models produce hallucinations. That’s literally what the process is called.
Do you think AI is a camera? Are you still convinced there’s a horse hearse somewhere out there, in real life? The model makes shit up. The shit it makes up is based on a bunch of labeled images - for example, hearses and horses. It can combine the two even if those two things never coexist anywhere in real life, so of course it can combine them even if they never coexist in its training data.
The training data includes children.
The training data includes nudity.
That’s enough.
Removed by mod
Correct.
Which of those things do you think AI produces? Hallucinations, or reality?
How does this relate to you being unable to understand that a car with its wheels replaced by horse legs is incomparable to inferring what a naked child looks like from adult porn?
What about it is incomparable, dingus? That’s literally what it does. As surely as it combines any other two things.
Removed by mod
The machine doesn’t know the difference. It’s just pixels and labels.
Steampunk isn’t a thing, but AI can generate the hell out of it.