In case you're one of those people who doesn't give a flying fuck about rape, here's one now.
By now we should all be aware that myths that surround rape are merely that: myths.
Sadly, many cultures still cling to these myths. These include cultures in which a raped woman is presumed to have brought it upon herself and is considered entirely to blame for what has happened. In some of them the woman is even expected to kill herself “in order to get her honour back”.
Which is kind of like driving your car off a cliff in order to fix a chip in the paint.
But these myths are obviously patriarchal bullshit, and thus they are of little interest to me.
What I want to talk about is something closer to home.
Because it seems to me that the idea that a woman who is raped is inevitably scarred for life is basically just patriarchy, disempowering women.