Associations to the word «Deutschland»

Dictionary definition

DEUTSCHLAND, noun. A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990.

Wise words

Watch your thoughts; they become your words. Watch your words; they become your actions. Watch your actions; they become your habits. Watch your habits; they become your character. Watch your character; it becomes your destiny.
Anonymous