Associations to the word «Katar»
Noun
Dictionary definition
KATAR, noun. An Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.
Wise words
Think twice before you speak, because your words and influence will plant the seed of either success or failure in the mind of another.