Associations with the word «Nigeria»

Wiktionary

NIGERIA, proper noun. Country in West Africa, south of the country of Niger. Official name: Federal Republic of Nigeria.

Dictionary definition

NIGERIA, noun. A republic in West Africa on the Gulf of Guinea; gained independence from Britain in 1960; most populous African country.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of because words diminish your feelings - words shrink things that seemed limitless when they were in your head to no more than living size when they're brought out.
Stephen King