A few days ago I had a talk with my teenage cousin and his friends. The conversation went everywhere, and eventually we got to talking about anal sex and sexual education.
It seems that their sexual education doesn't cover anal sex, even though it covers oral sex and vaginal sex. When I was growing up, nobody talked about it either. I asked a friend of mine, an educator for the county, and he said that they don't talk about anal sex unless somebody brings it up... and nobody brings it up. Otherwise, it's willfully ignored.
That REALLY shocked me.
Girls are having anal sex and thinking they're still virgins because they haven't done it vaginally.
Guys are forcing anal sex on girls too, thinking it's safe...
Gay guys are rubbing up on each other unprotected, not knowing that genital warts can spread even if it's just dry humping. And some gay guys (and girls) have gotten genital warts just like that.
Why isn't sex education talking about this?
I mean, I know this is taboo... but it matters, because comprehensive education is the best way to keep people from unwittingly hurting themselves.
I think the only reason this subject is still taboo is that people fear others will cry, "Gay agenda!" and things like that. This despite the fact that girls are at risk too. After all, most cases of anal cancer come from HPV, the same virus that causes genital warts. And so many girls are doing it thinking it's a way to prevent pregnancy.
In the end, our schools can't call their sex education comprehensive if they leave any aspect out. You cannot teach only abstinence. You cannot teach only safer sex. You cannot leave out information just because it's taboo, or because people are afraid of being accused of pushing a gay agenda on kids.