Associations to the word «Haiti»

Wiktionary

HAITI, proper noun. A country in the Caribbean. Official name: Republic of Haiti. Capital: Port-au-Prince.

Dictionary definition

HAITI, noun. A republic in the West Indies on the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and most illiterate nation in the western hemisphere.
HAITI, noun. An island in the West Indies.

Wise words

However many holy words you read, however many you speak, what good will they do you if you do not act upon them?
Dhammapada