Associations with the word «Dermis»
Wiktionary
DERMIS, noun. (anatomy) The tissue of the skin underlying the epidermis.
Dictionary definition
DERMIS, noun. The deep vascular inner layer of the skin.
Wise words
Words differently arranged have a different meaning, and meanings differently arranged have different effects.