Associations to the word «Africa»

Wiktionary

AFRICA, proper noun. (historical) A province of the Roman Empire containing what is now modern Tunisia and portions of Libya.
AFRICA, proper noun. The continent that is south of Europe, east of the Atlantic Ocean, west of the Indian Ocean, and north of Antarctica.

Dictionary definition

AFRICA, noun. The second largest continent; located to the south of Europe and bordered to the west by the South Atlantic and to the east by the Indian Ocean.

Wise words

He that hath knowledge spareth his words.
Francis Bacon