Associations to the word «Deutschland»
Dictionary definition
DEUTSCHLAND, noun. A republic in Central Europe; split into East Germany and West Germany after World War II and reunited in 1990.
Wise words
A kind word warms a man throughout three winters.