Associations to the word «Deutschland»

Dictionary definition

DEUTSCHLAND, proper noun. The German name for Germany: a republic in Central Europe; split into East Germany and West Germany after World War II and reunified in 1990.

Wise words

Men govern nothing with more difficulty than their tongues, and can moderate their desires more easily than their words.
Baruch Spinoza