Associations to the word «Uruguay»

Wiktionary

URUGUAY, proper noun. A country in South America. Official name: Oriental Republic of Uruguay

Dictionary definition

URUGUAY, noun. A republic on the southeastern coast of South America; achieved independence from Brazil in 1825.

Wise words

Wisdom does not show itself so much in precept as in life - in firmness of mind and a mastery of appetite. It teaches us to do, as well as talk, and to make our words and actions all of a color.
Lucius Annaeus Seneca