Associations to the word «Algerie»
Dictionary definition
ALGERIE, noun. A republic in northwestern Africa on the Mediterranean Sea with a population that is predominantly Sunni Muslim; colonized by France in the 19th century, it gained independence in 1962.
Wise words
Speak clearly, if you speak at all; carve every word before you let it fall.