Associations to the word «Katar»
Noun
Dictionary definition
KATAR, noun. An Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.
Wise words
We should have a great many fewer disputes in the world if words
were taken for what they are, the signs of our ideas only,
and not for things themselves.