Associations to the word «Germany»

Wiktionary

GERMANY, proper noun. (geography) The Central European state formed by West Germany's 1990 absorption of East Germany, with its capital in Berlin.
GERMANY, proper noun. (geography) (historical) The Central European state formed by Prussia in 1871 or its successor states, with their capitals in Berlin.
GERMANY, proper noun. (geography) (historical) A nominal medieval kingdom in Central Europe forming a region of the Carolingian and Holy Roman empires, with various capitals; by extension, the Holy Roman Empire itself or the empire of the Austrian Habsburgs.
GERMANY, proper noun. (geography) (chiefly historical) The nation of the German people, regardless of their political unification (see usage note).
GERMANY, proper noun. (countable) (geography) (historical) West or East Germany or any other German state (see usage note); (in the plural) both, several, or all of these states, taken together.

Dictionary definition

GERMANY, noun. A republic in Central Europe; it was split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Life has no meaning unless one lives it with a will, at least to the limit of one's will. Virtue, good, evil are nothing but words, unless one takes them apart in order to build something with them; they do not win their true meaning until one knows how to apply them.
Paul Gauguin