Associations with the word «Feminism»

Wiktionary

FEMINISM, noun. (dated) The state of being feminine; femininity. [from 1851; less common after 1895]
FEMINISM, noun. A social theory or political movement which argues that legal and social restrictions on women must be removed in order to bring about equality between the sexes in all aspects of public and private life.

Dictionary definition

FEMINISM, noun. A doctrine that advocates equal rights for women.
FEMINISM, noun. The movement aimed at equal rights for women.

Wise words

The pen is mightier than the sword.
Edward George Bulwer-Lytton