Associations with the word «Dermis»
Wiktionary
DERMIS, noun. (anatomy) The tissue of the skin underlying the epidermis.
Dictionary definition
DERMIS, noun. The deep vascular inner layer of the skin.
Wise words
Whatever words we utter should be chosen with care, for people will hear them and be influenced by them for good or ill.