Why aren't girls taught that men lie?

I mean, granted, I don't think my parents taught me that, but a diet of old-timey books and songs taught me that men will say anything to get women to have sex with them. (There are tons of folk songs and tales about men who promise marriage and then run away after getting what they want.)

Have those cautionary tales gone out of fashion?

I am a massively naive person; it took me years to grasp the concept of people intentionally saying things that are not true. I am very bad at lying myself. But still, I notice when men are lying to me in order to get into my pants, because here's the thing: I expect them to. I am still bad at detecting lies I am not expecting, though. If a man tells me that he has committed a crime, or that he is involved with another woman, I take that at face value, because it is nothing that would attract me. I've been told some men invent such things to make themselves seem interesting, but that's where my lie detection fails.
How on earth do young girls grow up without being warned that if a man can get into your pants with one easy lie, he will tell you that lie?

That song, Let No Man Steal Your Thyme, is practically a manifesto to me. I first heard it as part of the Alias Grace miniseries and it's haunted me since.