Associations to the word «Katar»
Noun
Dictionary definition
KATAR, noun. An Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil.
KATAR, noun. A peninsula extending northward from the Arabian mainland into the Persian Gulf.
Wise words
The chief difference between words and deeds is that words are always intended for men for their approbation, but deeds can be done only for God.