Associations to the word «Doctorship»

Wiktionary

DOCTORSHIP, noun. Professional position or title of a doctor.

Wise words

Strong and bitter words indicate a weak cause.
— Victor Hugo