Associations with the word «Haiti»

Wiktionary

HAITI, proper noun. A country in the Caribbean. Official name: Republic of Haiti. Capital: Port-au-Prince.

Dictionary definition

HAITI, noun. A republic in the West Indies on the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and least literate nation in the Western Hemisphere.
HAITI, noun. An island in the West Indies; also known as Hispaniola.

Wise words

Men govern nothing with more difficulty than their tongues, and can moderate their desires more than their words.
Baruch Spinoza