Associations with the word «Feminism»

Wiktionary

FEMINISM, noun. (dated) The state of being feminine; femininity. [from 1851; less common after 1895]
FEMINISM, noun. A social theory or political movement which argues that legal and social restrictions on women must be removed in order to bring about equality of the sexes in all aspects of public and private life.

Dictionary definition

FEMINISM, noun. A doctrine that advocates equal rights for women.
FEMINISM, noun. The movement aimed at equal rights for women.

Wise words

Since a politician never believes what he says, he is quite surprised to be taken at his word.
Charles de Gaulle