Associations to the word «Spain»

Wiktionary

SPAIN, proper noun. A country in Europe, including most of the Iberian Peninsula. Official name: Kingdom of Spain (Reino de España).

Dictionary definition

SPAIN, noun. A parliamentary monarchy in southwestern Europe on the Iberian Peninsula; a former colonial power.

Wise words

Words to me were magic. You could say a word and it could conjure up all kinds of images or feelings or a chilly sensation or whatever. It was amazing to me that words had this power.
Amy Tan