Associations to the word «Katar»
Noun
Dictionary definition
KATAR, noun. An Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.
Wise words
In words are seen the state of mind and character and disposition of the speaker.