Associations to the word «Haiti»

Wiktionary

HAITI, proper noun. A country in the Caribbean. Official name: Republic of Haiti. Capital: Port-au-Prince.

Dictionary definition

HAITI, noun. A republic in the West Indies on the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and most illiterate nation in the Western Hemisphere.
HAITI, noun. An island in the West Indies.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of, because words diminish your feelings - words shrink things that seem timeless when they are in your head to no more than living size when they are brought out.
Stephen King