Associations with the word «Uruguay»

Wiktionary

URUGUAY, proper noun. A country in South America. Official name: Oriental Republic of Uruguay

Dictionary definition

URUGUAY, noun. A republic on the southeast coast of South America; declared independence from Brazil in 1825 and gained full recognition in 1828.

Wise words

It is better either to be silent, or to say things of more value than silence. Sooner throw a pearl at hazard than an idle or useless word; and do not say a little in many words, but a great deal in a few.
Pythagoras