My thought comes from women that I know. It just seems like the majority are not confident about themselves... and want to leave.

Perhaps it is the lifestyle I was brought up in (very traditional and 1950s-like), but in my experience women often seem less confident and try to impress the opposite sex so that men will like them more, or even want them more. Just a thought that was going through my mind.

Anyway, I apologize for my generalization, and everyone is entitled to their own opinion. I know everyone does things to impress the opposite sex, but in this case I was referring to women, because it seems like women face more influences when it comes to the physical, to what the ideal woman is.