Associations to the word «Namibia»

Wiktionary

NAMIBIA, proper noun. Country in southern Africa. Official name: Republic of Namibia.

Dictionary definition

NAMIBIA, noun. A republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa.

Wise words

The most important things are the hardest things to say. They are the things you get ashamed of, because words diminish them - words shrink things that seemed limitless when they were in your head to no more than living size when they are brought out.
Stephen King