That’s very clearly a modern car on horse legs, nothing I haven’t seen before.
You’ve seen both children and nudity.
What’s “nudity”? How can I see it?
I can certainly conceptualize it, but please explain how I’ve seen it, where and when, since it’s not a physical object and has no shape, color, smell, mass…
I can’t even really imagine a naked child in my mind, and even if I tried, it would be my imagination, not what the real thing actually looks like; and I can’t know what the real thing looks like unless I’ve seen it.
What’s horse?
What’s modern?
You’re a complete moron if you think “nudity”, a concept, is equivalent to an adjective with no verb, a verb you omitted because you couldn’t cope with sending the full sentence, since you know you’re wrong.
Horse is a concept, to the machine. That’s the point. It cares even less about grammar than you do, and you’re the one insisting there’s no verb in the sentence ‘what is horse.’ (There is: ‘is.’)
Type in “purple apple” and it’ll give you the concept of apple plus the concept of purple.
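Roughly what “concept plus concept” means, as a toy sketch: prompt encoders map words to vectors and mix them. Everything below is invented for illustration (three made-up dimensions, made-up numbers), not the internals of any real model:

```python
# Toy illustration: each word is a vector of made-up "meaning" dimensions,
# and composing a prompt is, crudely, combining those vectors.
concepts = {
    "apple":  [0.9, 0.1, 0.0],   # fruit-ness, roundness, purple-ness
    "purple": [0.0, 0.0, 1.0],
    "horse":  [0.2, 0.7, 0.0],
}

def compose(*words):
    """Element-wise sum of concept vectors: a stand-in for how an
    encoder mixes meanings it learned separately."""
    dims = len(next(iter(concepts.values())))
    out = [0.0] * dims
    for word in words:
        for i, value in enumerate(concepts[word]):
            out[i] += value
    return out

print(compose("purple", "apple"))  # → [0.9, 0.1, 1.0]
```

The point of the toy: “purple apple” gets a representation even if no purple apple exists anywhere, because the two concepts were learned independently and combine freely.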
The concept of nudity is in every wonky NSFW content detection network. The machine absolutely has some model of what it means. Rage about it to someone else, if you can’t handle that.
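A minimal sketch of what “has some model of what it means” amounts to: at its simplest, a detector is just a learned function from image features to a score. The features, weights, and threshold below are all invented stand-ins, not any real detection system:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Invented stand-in features: [skin_tone_ratio, edge_density].
# Invented weights: a real detector learns millions of these from labels.
weights = [3.0, -1.0]
bias = -1.5

def nsfw_score(features):
    """Weighted sum plus sigmoid: the simplest possible learned
    'model of a concept', squashed to a 0..1 score."""
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return sigmoid(z)

print(round(nsfw_score([0.8, 0.2]), 3))  # → 0.668
```

However wonky the real networks are, this is the shape of the claim: the concept exists in the machine as learned parameters that score inputs, not as a photograph it once saw.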
Removed by mod
Correct, and these models produce hallucinations. That’s literally what the process is called.
Do you think AI is a camera? Are you still convinced there’s a horse hearse somewhere out there in real life? The model makes shit up. The shit it makes up is based on a bunch of labeled images: hearses and horses, for example. It can combine the two even if those things never coexist anywhere in reality, so of course it can combine them when they merely don’t coexist in its training data.
The training data includes children.
The training data includes nudity.
That’s enough.
Removed by mod