Associations with the word «Haiti»

Wiktionary

HAITI, proper noun. A country in the Caribbean. Official name: Republic of Haiti. Capital: Port-au-Prince.

Dictionary definition

HAITI, noun. A republic in the West Indies occupying the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and least literate nation in the Western Hemisphere.
HAITI, noun. An island in the West Indies.

Wise words

Love. Fall in love and stay in love. Write only what you love, and love what you write. The key word is love. You have to get up in the morning and write something you love, something to live for.
Ray Bradbury