The Americans are at the forefront of the war against Islam; this is just one aspect of the attack. The US is targeting the Islamic social system: the rules related to men and women, the family, and the ethics/morals in Islam.


Why propagate and defend such an unnatural and filthy idea? What does this say about Western culture?