Associations with the word «Haiti»

Wiktionary

HAITI, proper noun. A country in the Caribbean. Official name: Republic of Haiti. Capital: Port-au-Prince.

Dictionary definition

HAITI, noun. A republic in the West Indies occupying the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest nation in the Western Hemisphere, with the region's lowest literacy rate.
HAITI, noun. An island in the West Indies; another name for Hispaniola.

Wise words

Since a politician never believes what he says, he is quite surprised to be taken at his word.
Charles de Gaulle