Associations to the word «Uruguay»

Wiktionary

URUGUAY, proper noun. A country in South America. Official name: Oriental Republic of Uruguay.

Dictionary definition

URUGUAY, noun. A republic on the southeastern coast of South America; achieved independence from Brazil in 1825.
