Associations to the word «Uruguay»

Wiktionary

URUGUAY, proper noun. A country in South America. Official name: Oriental Republic of Uruguay.

Dictionary definition

URUGUAY, noun. A republic on the southeastern coast of South America; achieved independence from Brazil in 1825.

Wise words

Life has no meaning unless one lives it with a will, at least to the limit of one's will. Virtue, good, evil are nothing but words, unless one takes them apart in order to build something with them; they do not win their true meaning until one knows how to apply them.
Paul Gauguin