Just about everyone looks better when they smile. It’s true regardless of gender. I don’t see where sexism enters the equation.
I feel pretty oblivious. What am I missing?
I agree with the sentiment, but as a man working retail, I actually have customers tell me to smile more weirdly often.
And they say it’s because “you’re prettier when you smile” or something like that?
That’s one exception that doesn’t surprise me. Do you have any sense of how often they are doing this with intentional irony compared to with genuine obliviousness?
It’s really only creepy old dudes I get it from, and it seems pretty genuine most of the time. These comments are more frequent and more egregious toward my women coworkers, though, as one might expect.
How interesting! That makes it even less surprising.