A Definition Of Ingrown Nail
Source: Adapted from the National Institutes of Health
What does the term “ingrown nail” mean? An “ingrown nail” is a condition in which the edge of a nail grows into and becomes trapped under the surrounding skin, causing inflammation and, sometimes, infection.