Associations to the word «Feminism»


FEMINISM, noun. (dated) The state of being feminine; femininity. [from 1851; less common after 1895]
FEMINISM, noun. A social theory or political movement arguing that legal and social restrictions on women must be removed in order to bring about equality of the sexes in all aspects of public and private life.

Dictionary definition

FEMINISM, noun. A doctrine that advocates equal rights for women.
FEMINISM, noun. The movement aimed at equal rights for women.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of, because words diminish your feelings - words shrink things that seemed limitless when they were in your head to no more than living size when they are brought out.
Stephen King