Associations to the word «Egypt»

Wiktionary

EGYPT, proper noun. A country in North Africa. Official name: Arab Republic of Egypt.
EGYPT, proper noun. (historical) A civilization based around the river Nile, on its lower reaches near the Mediterranean.

Dictionary definition

EGYPT, noun. A republic in northeastern Africa, known as the United Arab Republic until 1971; site of an ancient civilization that flourished from about 2600 to 30 BC.
EGYPT, noun. An ancient empire to the west of Israel; centered on the Nile River and ruled by a Pharaoh; figured in many events described in the Old Testament.

Wise words

You can change your world by changing your words... Remember, death and life are in the power of the tongue.
— Joel Osteen