Associations to the word «Centrism»

Wiktionary

CENTRISM, noun. Any moderate political philosophy that avoids extremes.

Dictionary definition

CENTRISM, noun. A political philosophy of avoiding the extremes of left and right by taking a moderate position or course of action.

Wise words

Words, like nature, half reveal and half conceal the soul within.
Alfred, Lord Tennyson