So Jerry just cited a CDC study surveying 13,000 teenagers across the US, which reported that 50% of teenagers have had oral sex by age 17.
He pointed out what my generation grew up believing: "that oral sex isn't a big deal, it's not like it's sex... you don't even need to be in a relationship... nobody's going to get pregnant."
He then went on to say that when he was young, oral sex was considered a bigger deal than sex: "You could have sex, but if you got to oral sex, then you were really intimate, then you were really serious."
I agree with that; there is a real intimacy with oral sex that doesn't have to be present during non-oral sex (I may be gay, but I have had sex with a man)... you can detach from sex much more easily (that's why I was able to do it). But I have to admit, when thinking about other people, say my high school students, I wouldn't be as worried to hear they were having oral sex, because "it's not as big of a deal"... that is just the idea that has been socialized into me, and it is hard to overcome.
(...this post got too personal...but I'm going to leave it...)
What needs to be done to reinstate this idea of oral sex actually being a big deal?