Associations to the word «Imperialism»


IMPERIALISM, noun. The policy of forcefully extending a nation's authority by territorial gain or by the establishment of economic and political dominance over other nations.

Dictionary definition

IMPERIALISM, noun. A policy of extending your rule over foreign countries.
IMPERIALISM, noun. A political orientation that advocates imperial interests.
IMPERIALISM, noun. Any instance of aggressive extension of authority.

Wise words

We should have a great many fewer disputes in the world if words were taken for what they are, the signs of our ideas only, and not for things themselves.
John Locke