A Definition of Dermatology
Source: Adapted from the National Institutes of Health
What does the term “dermatology” mean? Dermatology is the medical specialty concerned with the skin and with the diagnosis and treatment of skin conditions. For related articles and further information, search the news section of this website.