Associations with the word «Qatar»
Noun
Dictionary definition
QATAR, noun. An Arab country on the Qatar Peninsula; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
QATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.