Associations with the word «Namibia»

Wiktionary

NAMIBIA, proper noun. Country in southern Africa. Official name: Republic of Namibia.

Dictionary definition

NAMIBIA, noun. A republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa.

Wise words

Wisdom does not show itself so much in precept as in life - in firmness of mind and mastery of appetite. It teaches us to do, as well as to talk, and to make our words and actions all of a color.
Lucius Annaeus Seneca