You should smile more. Ever heard that sentence before? It probably sounds familiar.
Either you’re a man and you’ve said it before or you’re a woman and you’ve heard men making comments like these. I sure have, way too many times. Teachers, male friends or anonymous strangers passing by — the list goes on.
But the most frightening time a guy ever commented on my smile was when I was at the cashier, about to pay for my burrito. He just looked at me, handed me a coupon for a free meal and said: “Hopefully you’ll smile more next time.” See, a man would never tell another man to smile, simply because men are not expected to smile the way women are. But as women, we’re supposed to just deal with it. This is the kind of thinking I’m no longer willing to accept, and it needs to change.
So what, women should always smile even when they don’t feel like it? What if I’m just tired, or had a bad day? Am I supposed to fake a smile to make sure I won’t have to hear such an inappropriate comment? And even if I am happy or enjoying myself, should I be expected to physically show it?
It’s condescending and women shouldn’t be reduced to being a “pretty face that should always smile”.
Let me clear the air by saying that I am in no way a radical feminist. But I have talked about this before, and let me tell you, just about every other woman has a story similar to mine.
“Men tell women to smile because society conditions men to think we exist for the male gaze and for their pleasure. Men are socialized to believe they have control over women’s bodies. This [results in] them giving unsolicited instructions on how we should look, think and act,” says Bené Viera, writer and activist.
I’m not trying to bash men, nor am I saying that all men tell women to smile. But there have been enough men who’ve commented on my smile to make this worth addressing. Although your intent might be purely innocent, it’s dictatorial and should not happen.
I just think that some men don’t realize the impact it has on women. The worst part is that I’ve gotten so used to it that I don’t even bother to react anymore, but it automatically puts me in a bad mood. It makes me feel extremely uncomfortable, and it’s annoying.
The most common excuse is that men are just trying to flirt. But that’s not the way to do it. You are basically ordering me to smile, and I feel objectified. Someone I don’t know wants me to look pretty or happy, seemingly for his benefit and not mine. Some people mean well when they tell others to smile, and that is wonderful; it is great to genuinely care about the wellbeing of others. More often than not, though, the comment is unwanted.
I just want to tell all the men out there that telling a woman to smile is not charming, or cute, or something we women should lighten up about. You should ask yourselves why you do it, and how you would react if another man asked you to smile. We women shouldn’t be told what to do, especially not by complete strangers. We’ll smile whenever we feel like it, not because you asked us to.