A Definition Of Caries
Source: Adapted from the National Institutes of Health
What does the term “caries” mean? “Caries” refers to the decay of a tooth or bone; the most familiar form is dental caries, commonly known as tooth decay.