Associations to the word «Delegitimization»

Wiktionary

DELEGITIMIZATION, noun. The act or process of delegitimizing.
