Associations to the word «Germany»

Wiktionary

GERMANY, proper noun. (geography) The Central European state formed by West Germany's 1990 absorption of East Germany, with its capital in Berlin.
GERMANY, proper noun. (geography) (historical) The Central European state formed by Prussia in 1871, or any of its successor states, with Berlin as their capital.
GERMANY, proper noun. (geography) (historical) A nominal medieval kingdom in Central Europe forming a region of the Carolingian and Holy Roman empires, with various capitals; by extension, the Holy Roman Empire itself, the empire of the Austrian Habsburgs.
GERMANY, proper noun. (geography) (chiefly historical) The nation of the German people, regardless of their political unification (see usage note).
GERMANY, proper noun. (countable) (geography) (historical) West or East Germany or any other German state (see usage note); (in the plural) both, several, or all of these states, taken together.

Dictionary definition

GERMANY, noun. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of because words diminish your feelings - words shrink things that seem timeless when they are in your head to no more than living size when they are brought out.
Stephen King