Associations to the word «Imperialism»
IMPERIALISM, noun. The policy of extending a nation's authority by territorial acquisition or by the establishment of economic and political dominance over other nations.

Dictionary definition

IMPERIALISM, noun. A policy of extending a country's rule over foreign nations.
IMPERIALISM, noun. A political orientation that advocates imperial interests.
IMPERIALISM, noun. Any instance of aggressive extension of authority.

Wise words

Words are always getting conventionalized to some secondary meaning. It is one of the works of poetry to take the truants in custody and bring them back to their right senses.
— William Butler Yeats