A Definition Of Plantar
Source: Adapted from the National Institutes of Health
What does the term “plantar” mean? The term “plantar” means relating to the sole of the foot. For more about this term, see related articles in the news section of this website.