Associations to the word «Spain»

Wiktionary

SPAIN, proper noun. A country in Europe, including most of the Iberian Peninsula. Official name: Kingdom of Spain (Reino de España).

Dictionary definition

SPAIN, noun. A parliamentary monarchy in southwestern Europe on the Iberian Peninsula; a former colonial power.

Wise words

Occasionally in life there are those moments of unutterable fulfillment which cannot be completely explained by those symbols called words. Their meanings can only be articulated by the inaudible language of the heart.
Martin Luther King Jr.