A Definition Of Podiatry
Source: Adapted from the National Institutes of Health

What does the term “podiatry” mean? Podiatry is the branch of medicine that deals with the foot and its ailments. To learn more, search the news section of this website for related articles and information.