Associations with the word «Nigeria»
NIGERIA, proper noun. Country in West Africa, south of the country of Niger. Official name: Federal Republic of Nigeria.

Dictionary definition

NIGERIA, noun. A republic in West Africa on the Gulf of Guinea; gained independence from Britain in 1960; most populous African country.

Wise words

Occasionally in life there are those moments of unutterable fulfillment which cannot be completely explained by those symbols called words. Their meanings can only be articulated by the inaudible language of the heart.
Martin Luther King Jr.