Associations to the word «Germany»

GERMANY, proper noun. (geography) The Central European state formed by West Germany's 1990 absorption of East Germany, with its capital in Berlin.
GERMANY, proper noun. (geography) (historical) The Central European state formed by Prussia in 1871, or its successor states, with their capital in Berlin.
GERMANY, proper noun. (geography) (historical) A nominal medieval kingdom in Central Europe forming a region of the Carolingian and Holy Roman empires, with various capitals; by extension, the Holy Roman Empire itself or the empire of the Austrian Habsburgs.
GERMANY, proper noun. (geography) (chiefly historical) The nation of the German people, regardless of their political unification (see usage note).
GERMANY, proper noun. (countable) (geography) (historical) West or East Germany or any other German state (see usage note); (in the plural) both, several, or all of these states, taken together.

Dictionary definition

GERMANY, noun. A republic in Central Europe; split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Trust only movement. Life happens at the level of events, not of words. Trust movement.
Alfred Adler