Associations with the word «Spain»

Wiktionary

SPAIN, proper noun. A country in Europe, comprising most of the Iberian Peninsula. Official name: Kingdom of Spain (Reino de España).

Dictionary definition

SPAIN, noun. A parliamentary monarchy in southwestern Europe on the Iberian Peninsula; a former colonial power.

Wise words

Words are always getting conventionalized to some secondary meaning. It is one of the works of poetry to take the truants in custody and bring them back to their right senses.
William Butler Yeats