Associations to the word «Namibia»

Wiktionary

NAMIBIA, proper noun. Country in southern Africa. Official name: Republic of Namibia.

Dictionary definition

NAMIBIA, noun. A republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa.

Wise words

Love. Fall in love and stay in love. Write only what you love, and love what you write. The key word is love. You have to get up in the morning and write something you love, something to live for.
Ray Bradbury