Associations to the word «IMPERIALISM»

Wiktionary

IMPERIALISM, noun. The policy of forcefully extending a nation's authority by territorial gain or by the establishment of economic and political dominance over other nations.

Dictionary definition

IMPERIALISM, noun. A policy of extending a nation's rule over foreign countries.
IMPERIALISM, noun. A political orientation that advocates imperial interests.
IMPERIALISM, noun. Any instance of aggressive extension of authority.

Wise words

It is only with the heart that one can see rightly; what is essential is invisible to the eye.
Antoine de Saint-Exupéry