Associations with the word «Spain»

Wiktionary

SPAIN, proper noun. A country in Europe, occupying most of the Iberian Peninsula. Official name: Kingdom of Spain (Reino de España).

Dictionary definition

SPAIN, noun. A parliamentary monarchy in southwestern Europe on the Iberian Peninsula; a former colonial power.

Wise words

Wisdom does not show itself so much in precept as in life; in firmness of mind and a mastery of appetite. It teaches us to do, as well as talk, and to make our words and actions all of a color.
Lucius Annaeus Seneca