Associations with the word «Katar»

Dictionary definition

KATAR, noun. An Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.

Wise words

It is only with the heart that one can see rightly; what is essential is invisible to the eye.
Antoine de Saint-Exupéry