Associations with the word «Deutschland»

Dictionary definition

DEUTSCHLAND, noun. (German for "Germany") A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Words differently arranged have a different meaning, and meanings differently arranged have different effects.
Blaise Pascal