Men do need to be taught that rape isn't okay. Women do need to be taught that standing up to men is okay. I'm not seeing anything retarded here.
What. I'm a man, and I never needed to be taught that sticking my man parts in a female, or a male, without their express consent, is not acceptable behaviour. Just like I didn't need to be taught that murdering people is
not okay. Most human beings are equipped with a sense of right and wrong, and while a lot of things need to be taught, some things just feel wrong on a basic level. Violating a person's basic physical space and harming and/or killing them (which is the end result of rape) tends to trigger something called "empathy."
Ultimately, though, the reason the whole concept of "men need to be taught not to rape" is ludicrous is that rape is not a gender-specific crime. Any person can sexually assault any other person.
And it happens. Daily. Against all genders. So the idea that only one sex needs to be taught not to rape is just... ridiculously offensive, and a sexist concept in and of itself.
Still, I'm sure that wasn't your intention, as most who repeat that line have no intention of dehumanizing a group of people, or of painting one sex as "more prone to evil" than the other.
You want to fix sex-related crimes? First, you need to teach everyone exactly what consent is in legal terms. You need to teach everyone to look at people as just that: people, first. A woman is a person first. A man is a person first. A white person is a person first. A black person is a person first. And all people deserve equal rights and freedoms, all deserve some level of respect, privacy, freedom of speech and association, and all that lovely stuff many of our forefathers fought and died for.
Telling men not to rape isn't going to stop the rapists among them. There's already something seriously wrong with anyone who would do that in the first place.
tl;dr: Please don't pretend that rape is a male-only crime. That doesn't help your argument, and it certainly doesn't help rape victims.