Associations to the word «Uruguay»

Wiktionary

URUGUAY, proper noun. A country in South America. Official name: Oriental Republic of Uruguay

Dictionary definition

URUGUAY, noun. A republic on the southeastern coast of South America; it achieved independence from Brazil in 1825.

Wise words

We cannot always control our thoughts, but we can control our words, and repetition impresses the subconscious, and we are then master of the situation.
Florence Scovel Shinn