Associations with the word «Katar»

Dictionary definition

KATAR, noun. An Arab country on the peninsula of Qatar; it achieved independence from the United Kingdom in 1971; its economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.

Wise words

False words are not only evil in themselves, but they infect the soul with evil.
Socrates