Feminism


The theory of the political, economic, and social equality of the sexes, and organized activity on behalf of women's rights and interests. (Webster's New Collegiate Dictionary, 1981)

